Forum Moderators: open
mydomain.com/index.php?catid=xx&subcatid=xx&adid=xx
a killer for Google?
So far I managed to get
mydomain.com/index.php?catid=xx&subcatid=xx
into the index, but no URL with three variables.
Did anybody ever get three in? Getting these URLs indexed would mean an additional 600 pages indexed with good content. They are ads, but long-term ones that stay up for about 30 days.
How does Google handle Dynamic PHP Pages?
[webmasterworld.com...]
My client went through all kinds of problems with a php programmer setting up a dynamic php shopping cart. Check out the forum and you'll notice that Tartan75 and Jatar_K seem to have some good knowledge on the topic.
One option would be to rewrite
mydomain.com/index.php?catid=xx&subcatid=xx&adid=xx
to something like
mydomain.com/a-folder/xx-xx-xx/
or
mydomain.com/a-folder/xx-xx-xx.htm etc
Since you say the ones with three variables are not indexed, the ones with fewer than three variables (that may already be in the index) shouldn't be affected. I wouldn't put the names of the variables in the filename, because you already know what they are.
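As a rough sketch of that idea (assuming Apache with mod_rewrite enabled; "a-folder" and the numeric IDs are just placeholders from this thread, not anything from a real app), an .htaccess file in the document root could map the friendly URLs back onto index.php like this:

```apache
# .htaccess in the document root (per-directory context, so no leading slash
# in the pattern). Assumes mod_rewrite is loaded and AllowOverride permits it.
RewriteEngine On

# Map /a-folder/12-34-56/ back to the real script.
# The three captured numeric groups become catid, subcatid and adid.
RewriteRule ^a-folder/([0-9]+)-([0-9]+)-([0-9]+)/?$ /index.php?catid=$1&subcatid=$2&adid=$3 [L,QSA]

# Same thing for the .htm variant of the URL.
RewriteRule ^a-folder/([0-9]+)-([0-9]+)-([0-9]+)\.htm$ /index.php?catid=$1&subcatid=$2&adid=$3 [L,QSA]
```

Your pages would then link to mydomain.com/a-folder/12-34-56/ and the bot never sees a query string at all; index.php keeps reading $_GET as before.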
There are lots of .htaccess and mod_rewrite wizards in the forum mentioned above ;)
Bag-O-Tricks for PHP II - some code snippets that should be helpful for all in creating dynamic sites - Getting rid of those query strings [webmasterworld.com]
Andreas
Has this been your experience, or were they simply incompetent?
Andreas will be able to elaborate, I hope ;)... but .htaccess and mod_rewrite are "high up in the chain" of events in the process of serving a page from your site, so basically it's meant to be very fast.
I guess if the regex is complicated, or you get a fair number of hits, it could slow things down. If Googlebot or other bots are hitting your server too hard, you can email them and ask them to slow down... or, if it's someone else, ban them?
They are popular solutions for good reason :)
Using mod_rewrite [httpd.apache.org] in directory context will be slow and rather inefficient, since it causes an internal subrequest for each rule it has to process. Having said that, it will still be fast enough for anything but really big sites. If you have root access, put your RewriteRule [httpd.apache.org]s into your httpd.conf file. If you want to develop an SE-friendly application right from the beginning, by all means do so. It will be fast, it will be nice, it will be everything you make it to be.
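To illustrate the server-context point (again a sketch, assuming root access and the same hypothetical "a-folder" URL scheme discussed earlier in the thread), the equivalent rule in httpd.conf needs a leading slash in the pattern and avoids the per-directory subrequest overhead:

```apache
# In httpd.conf (server or virtual host context) instead of .htaccess:
# rules here are compiled once at startup and matched against the full
# URL path, so the pattern starts with a slash.
RewriteEngine On
RewriteRule ^/a-folder/([0-9]+)-([0-9]+)-([0-9]+)/?$ /index.php?catid=$1&subcatid=$2&adid=$3 [L,QSA]
```

Functionally it is the same mapping; the gain is purely that Apache no longer re-reads and re-evaluates an .htaccess file on every request in that directory tree.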
The thing is just that quite often your pages and your apps already exist, and then you realize that Google won't index your site because of some shortcoming in your app. Now you need to decide how much time you can spend on rectifying this situation. Quite often a not-quite-optimal but still pretty fast solution does exist. It's all a tradeoff between development time, actual speed, money to spend on hosting, etc.
Having said that, I have not experienced a significant slowdown even using mod_rewrite [httpd.apache.org] in directory context. I posted the results of some benchmarks I did a while back in the monster thread A Close to perfect .htaccess ban list [webmasterworld.com]. While that was admittedly a rather different situation (lots of RewriteCond [httpd.apache.org] lines instead of RewriteRule [httpd.apache.org]s), it gives you an idea of the time that processing mod_rewrite [httpd.apache.org] rules takes. Keep in mind that this benchmark was done on a somewhat aged machine, as bird so nicely put it, and that your mileage may vary depending on the number of requests served. Of course, if that really is an issue, you might want to choose a different approach (m o d _ p e r l - needed the spaces since my stupid autolinking proxy choked on it. It's family, it's Perl [perl.com], you stupid proxy. Need to fix that ASAP :)) altogether.
Andreas