I've been aching for a long time to mention somewhere official that, for example, sites shouldn't use "&id=" as a parameter if they want Googlebot to crawl them as fully as possible.
My site uses the &ID= parameter in the URL string, along with a secondary parameter.
I don't have any problems with Googlebot spidering my site (daily) or indexing any of its pages (over 20,000 pages indexed), and I rank fairly highly for several keywords.
However, I've been thinking of swapping the parameters around to make the ID parameter secondary (and also changing the parameter name).
I would like to give this a try to see if it helps in any way in the SERPs. However, I suspect that changing the parameters like this will have an adverse effect, in that Google may think they are brand new pages.
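If I do go ahead, my plan would be to 301-redirect every old URL to its new equivalent, so that Google follows the move rather than seeing brand new pages. A minimal sketch of the URL mapping in Python (the parameter names "ID", "cat" and "item" are just stand-ins for illustration, not my real ones):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def rewrite_url(old_url):
    """Map an old-style URL (?ID=...&cat=...) to the new scheme,
    with the secondary parameter first and the ID renamed to 'item'.
    All parameter names here are hypothetical examples."""
    parts = urlsplit(old_url)
    params = dict(parse_qsl(parts.query))
    new_params = []
    # put the secondary parameter first ('cat' is a stand-in)
    if "cat" in params:
        new_params.append(("cat", params["cat"]))
    # renamed ID parameter goes last ('item' is a stand-in)
    if "ID" in params:
        new_params.append(("item", params["ID"]))
    return urlunsplit(parts._replace(query=urlencode(new_params)))

print(rewrite_url("http://example.com/page.php?ID=123&cat=7"))
# -> http://example.com/page.php?cat=7&item=123
```

The redirect itself would then be issued server-side (e.g. via mod_rewrite or in the page script) with a 301 status, so the old URLs' standing is passed on rather than lost.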
Does anyone have any experience of dynamic database driven sites that have had their URL parameters changed, and if so, can you offer any guidance?
Thanks in advance.