If they don't see the second page as a different page, does this suggest that PHP pages are not all that SEO-friendly? I was always of the general opinion that static .html pages are best because of unknowns like this.
Thanks for any advice here guys.
[edited by: tedster at 7:25 pm (utc) on Jan. 23, 2006]
[edit reason] use example.com [/edit]
However, they have problems with having more than 3 parameters in a URL, and anything that merely looks like it might be a session ID.
Additionally, you can cause yourself a lot of trouble if you ever change the parameter order on some of your links: buy.a.shirt.php?size=large&colour=blue is duplicate content of buy.a.shirt.php?colour=blue&size=large, for example. Make sure that all links use the same ordering, and the exact same parameters.
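The usual way to enforce that ordering is to build every internal link through a single helper that sorts the parameters before writing the URL. Here's a quick sketch of the idea in Python (the function name is just illustrative; the same approach works in PHP with ksort() before building the query string):

```python
from urllib.parse import urlencode

def canonical_url(path, params):
    """Build an internal link with parameters in a fixed (alphabetical)
    order, so the same page is only ever linked under one URL."""
    query = urlencode(sorted(params.items()))
    return f"{path}?{query}" if query else path

# Both call sites now produce the identical URL,
# no matter what order the parameters were passed in:
canonical_url("buy.a.shirt.php", {"size": "large", "colour": "blue"})
canonical_url("buy.a.shirt.php", {"colour": "blue", "size": "large"})
# → buy.a.shirt.php?colour=blue&size=large
```

As long as no template hand-writes a query string, the "same page, two URLs" duplicate-content problem can't happen by accident.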
If there are other versions of pages (perhaps an extra parameter &pf=1 for "print friendly" pages), then make sure that all of those are served with a <meta name="robots" content="noindex"> tag to keep them completely out of the index.
I think just as important, and often overlooked, is the challenge of not making careless mistakes with db-driven links. Be consistent in your parameters. For example, page.php?id=1&cat=5 and page.php?cat=5&id=1
are two different pages in the eyes of SEs.
So when you go down the road of db-driven content, be consistent, so that you don't have goof-ups that haunt you, such as the popular 'duplicate content' issues we all read and talk about.
So they WILL crawl dynamic URLs, but if you don't have PR up front, they may not go that deep. My page count in Google fluctuates from 50,000 to 80,000 pages on a PR5 site that's been around a long time. The vast majority of my indexed pages have .php?something=something URLs.
Keep it simple and use as few parameters as you can. Personally I would go the static/translate route if I wasn't scared to death of mod_rewrite.
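For anyone weighing the static/translate route, it usually comes down to a couple of lines in .htaccess. A minimal sketch, assuming Apache with mod_rewrite enabled (the shirts.php filename and URL layout are illustrative, not from this thread):

```apache
# Translate a static-looking URL into the real dynamic one, e.g.
#   /shirts/blue/large  ->  /shirts.php?colour=blue&size=large
RewriteEngine On
RewriteRule ^shirts/([a-z]+)/([a-z]+)$ /shirts.php?colour=$1&size=$2 [L,QSA]
```

With a rule like that, spiders and visitors only ever see /shirts/blue/large, with no "?" in sight, while the PHP script still receives its parameters as usual. The risk people mention below is real, though: a bad rule can take whole sections of a site out of the index, so test on a dev copy first.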
If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
I think recent experience is that this does not cause a problem for Googlebot.
Unexpected results with mod_rewrite include the site disappearing from the Google index, and other nasty stuff. And I don't know it that well, so it's easy for me to put off converting my pages so long as Googlebot keeps sucking up my question-marked URLs.