Forum Moderators: phranque


Creating a low-server-load website with SE-friendly URIs.

dynamic driven website - most efficient way?


scraptoft

4:16 pm on Oct 30, 2006 (gmt 0)

10+ Year Member



Hi guys,

After weeks of planning, doodling and using approximately two notepads' worth of paper, I am ready to create my website. I have bundles of prewritten content and I am very eager to get online.

I want the website to be as close to perfection as possible from the word go (don't we all), so it is staying on my local server, safely away from SEs, until it is ready for their crawls.

On a previous website I used mod_rewrite to turn URLs like:

www.domain.com/page.php?topic=cheese

into:

www.domain.com/cheese/page
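For reference, a rewrite along those lines looks something like this (a sketch only; the pattern and the `page.php?topic=` scheme are taken from the example above, and the file-existence checks are a common precaution, not something from my actual setup):

```apache
# .htaccess - map /cheese/page to page.php?topic=cheese (sketch)
RewriteEngine On

# Leave requests for real files and directories alone
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d

# Capture the topic from the first path segment
RewriteRule ^([^/]+)/page$ page.php?topic=$1 [L,QSA]
```

The [L] flag stops rule processing once a match is found, and [QSA] preserves any query string the visitor appended.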

That website generates a lot of server load, which is quite worrying. As I want to get it right from the start, I don't want to be dealing with server load problems months down the line, which could force me to change the URL of each page.

I hope to be receiving 50,000 page views per day within six months to a year. Do you think PHP and MySQL dynamic pages could handle this? I will be using a shared hosting account again.

Do you have a website that has this amount of views and survives (shared or dedicated)?

If not what are my alternatives?

I have been reading up on cron jobs which convert dynamic pages to static ones, but wouldn't this mean a huge number of .html pages, and could I still keep the same URL structure?
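(For anyone unfamiliar with that approach: it is usually just a scheduled fetch that saves the dynamic output to a static file. A sketch, with made-up paths and URLs:

```
# crontab entry: regenerate a static copy of one page every hour (sketch)
# minute hour day month weekday  command
0 * * * * wget -q -O /var/www/html/cheese/page.html "http://localhost/page.php?topic=cheese"
```

You would need one such export per page, which is where the "huge number of .html pages" worry comes from.)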

Thanks in advance. I hope I have made myself clear; ask away if I haven't.

jd01

2:31 am on Oct 31, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



1. If you are experiencing server load issues due to .htaccess rewrites, I recommend using skip flags and efficient regular expressions. Depending on your naming convention and rule order, you can lighten your server processing quite a bit.
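As a sketch of what skip flags do (the rules and patterns below are invented for illustration, not from any real site):

```apache
RewriteEngine On

# If the request is for a real file, skip the next 2 rules entirely,
# so static assets never pay the cost of the dynamic-page regexes
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^ - [S=2]

# Only non-file requests ever reach these rules
RewriteRule ^([^/]+)/page$ page.php?topic=$1 [L]
RewriteRule ^([^/]+)/?$ index.php?section=$1 [L]
```

Ordering matters too: put the rules that match most of your traffic first, so most requests stop at [L] early instead of being tested against every pattern.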

2. Yes, I have put over 85,000 page views on a site (in a shared hosting environment) which is .htaccess-, MySQL- and PHP-driven, without server load issues.

3. You can export everything to .html if you like, but then you are guaranteed a load of (pages × update frequency) on top of your normal traffic, unless you export/save on another box. I would put more emphasis on creative use of caching, indexing and efficient SELECT statements to keep things fast.
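On the indexing/selection side, the idea is along these lines (table and column names are invented for illustration):

```sql
-- Index the column your rewrites look pages up by,
-- so MySQL doesn't scan the whole table per request
CREATE INDEX idx_pages_topic ON pages (topic);

-- Select only the columns the page actually needs, never SELECT *
SELECT title, body
FROM pages
WHERE topic = 'cheese'
LIMIT 1;

-- EXPLAIN shows whether the query really uses the index
EXPLAIN SELECT title, body FROM pages WHERE topic = 'cheese';
```

Combine that with caching the rendered output (even briefly) for your most-hit pages, and most requests never touch the database at all.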

Hope this helps.
Justin

scraptoft

6:10 pm on Nov 2, 2006 (gmt 0)

10+ Year Member



Thank you for the helpful information, Justin - exactly the summary I wanted to hear.

I didn't particularly like the idea of having to generate tons of .html pages, especially given the update frequency my website will require.

You have given me the confidence to go with the path I most wanted: dynamically driven.

I guess this means lots more reading on caching, indexing and making my SQL queries more efficient.

Cheers

jd01

7:56 pm on Nov 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Glad I could help.

I think the biggest thing to remember is EVERYTHING (.htaccess, PHP, MySQL, selections, index keys/caching, HTML, etc.) has to work together in the environment you want to use.

Make sure you have a plan... you are only as fast as your slowest process plus all the other processes. For me there is usually quite a bit of testing, timing and reworking involved to keep things fast.

Having a good host is also essential.

Justin