If you haven't already done so, take a look at the forum:
How does Google handle Dynamic PHP Pages?
My client went through all kinds of problems with a php programmer setting up a dynamic php shopping cart. Check out the forum and you'll notice that Tartan75 and Jatar_K seem to have some good knowledge on the topic.
There was another topic the other day (a few weeks back, I can't find it though :-/ ), where the general consensus was that two query-string variables work and three are too many (also, query variables named 'id' are bad).
I can confirm that with the experience from my sites.
Thanks, I think I will have to try to reduce the variables to a maximum of two.
The problem is that the script gets regular updates, and I didn't want to modify it too much, since heavy modifications make the updates a lot harder to apply.
|brotherhood of LAN|
Try mapping all your pages with three variables to static URLs with .htaccess, rewriting them to something like plain .html filenames.
Since you say the ones with three variables are not indexed, the ones with fewer than three vars (that may already be in the index) shouldn't be affected. I wouldn't put the names of the variables in the filename, because you already know what they are.
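A minimal sketch of what that mapping could look like in .htaccess. The variable names (action, substart, id) are borrowed from the example URL quoted elsewhere in this thread, so treat this as an illustration rather than drop-in rules:

```apache
# Illustrative only: serve /store/item-0-73.html from the real script.
# The three query variables are folded into the static-looking filename,
# and the variable names are NOT repeated in the URL the bot sees.
RewriteEngine On
RewriteRule ^store/item-([0-9]+)-([0-9]+)\.html$ /store/index.php?action=item&substart=$1&id=$2 [L]
```

You would then change the links in your pages to point at the .html form, so the spider never sees the query string at all.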
lots of .htaccess and mod_rewrite wizards in the forum mentioned ;)
Here's a quick and easy way to achieve what bol suggested:
Bag-O-Tricks for PHP II - some code snippets that should be helpful for all in creating dynamic sites - Getting rid of those query strings [webmasterworld.com]
shhh.... I think I've just fallen in love with a forum :-)
Brotherhood of LAN... one of my clients tried the subdirectory route (after finding out that "http://www.clientfirm.com/store/index.php?action=item&substart=0&id=73" didn't work with Google), and they said that it served the pages much slower.
Has this been your experience, or were they simply incompetent?
|brotherhood of LAN|
andreas will be able to elaborate, I hope ;)... but .htaccess and mod_rewrite sit high up in the chain of events when Apache fetches a page from your site, so basically they're meant to be very fast.
I guess if the regex is complicated or you get a fair number of hits it could slow things down. If the G bot or other bots are hitting your server too hard, you can email them and ask them to slow down... or if it's someone else... ban them?
They are popular solutions for good reason :)
The code that I linked to will indeed be rather slow. That's just the price you pay for the benefit of not having to change anything in your script except for adding the five lines of PHP [php.net] code to the top of it.
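I don't have the exact snippet in front of me, but the idea behind that kind of drop-in code is roughly this (a hypothetical sketch; the variable names are taken from the example URL earlier in the thread, not from the linked post):

```php
<?php
// Sketch of the PATH_INFO trick: the script is requested as
// /store/index.php/item/0/73 and these lines turn the trailing
// path segments back into the query variables the cart expects.
// 'action', 'substart' and 'id' are assumed names for illustration.
if (!empty($_SERVER['PATH_INFO'])) {
    $parts = explode('/', trim($_SERVER['PATH_INFO'], '/'));
    $keys  = array('action', 'substart', 'id');
    foreach ($parts as $i => $value) {
        if (isset($keys[$i])) {
            $_GET[$keys[$i]] = $value;
        }
    }
}
?>
```

Pasted at the top of the existing script, this leaves the rest of the code untouched, which is exactly why it is slow but convenient.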
Using mod_rewrite [httpd.apache.org] in directory context will be slow and rather inefficient, since it causes an internal subrequest for each rule it has to process. Having said that, it will still be fast enough for anything but really big sites. If you have root access, put your RewriteRule [httpd.apache.org]s into your httpd.conf file. If you want to develop an SE-friendly application right from the beginning, by all means do so. It will be fast, it will be nice, it will be everything you make it to be.
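For example, here is a sketch of the same kind of rule moved into server context (again using the assumed variable names from the example URL in this thread). Note the one gotcha: in httpd.conf the pattern is matched against the full URL path, leading slash included, unlike in a per-directory .htaccess:

```apache
# Server-context version: processed once per request, without the
# per-directory subrequest overhead of .htaccess.
RewriteEngine On
RewriteRule ^/store/item-([0-9]+)-([0-9]+)\.html$ /store/index.php?action=item&substart=$1&id=$2 [L]
```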
The thing is that quite often your pages and your apps already exist, and then you realize that Google won't index your site because of some shortcoming in your app. Now you need to decide how much time you can spend on rectifying this situation. Quite often a not-quite-optimal but still pretty fast solution does exist. It's all a tradeoff between development time, actual speed, money to spend on hosting, etc.
Having said that, I have not experienced a significant slowdown even using mod_rewrite [httpd.apache.org] in directory context. I posted the results of some benchmarks I did a while back in the monster thread A Close to perfect .htaccess ban list [webmasterworld.com]. While that was admittedly a rather different situation (lots of RewriteCond [httpd.apache.org] rules instead of RewriteRule [httpd.apache.org]s), it gives you an idea of the time that processing mod_rewrite [httpd.apache.org] rules takes. Keep in mind that this benchmark was done on a somewhat aged machine, as bird so nicely put it, and that your mileage may vary depending on the number of requests served. Of course, if that really is an issue you might want to choose a different approach (m o d _ p e r l - needed the spaces since my stupid autolinking proxy choked on it. It's family, it's Perl [perl.com] you stupid proxy. Need to fix that ASAP :)) altogether.