Forum Moderators: Robert Charlton & goodroi
A site: search shows that Google has indexed some pages twice under different URLs (because the designers have not been consistent in the internal navigation) - not just /pages and /Pages, but also contactus.aspx and ContactUs.aspx.
I tested this on sites with internal PR and found that the toolbar PR changes if you play around with the case of file or page names.
This, together with the results I got using site:, leads me to believe that this is an opportunity to hurt a competitor by linking to non-existent versions of their pages.
If I reckon it correctly (and maths isn't my strong point), each letter in the URL can be either upper or lower case, so a name with n letters has 2^n possible case variants. So pages/home, with 9 letters, has 2^9 = 512 combinations - 511 of them duplicates of the real page. The more pages with long file names, the more dupes you can make. Enough to get some dumped for duplicate content? Or will Google ignore them?
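To sanity-check the arithmetic, here's a quick sketch (the path "pages/home" is just the example from above) that enumerates every upper/lower-case variant of a path and counts them:

```python
from itertools import product

def case_variants(path):
    """Yield every upper/lower-case variant of path.
    Non-letter characters (like '/') have only one form."""
    options = [
        {ch.lower(), ch.upper()} if ch.isalpha() else {ch}
        for ch in path
    ]
    for combo in product(*options):
        yield "".join(combo)

variants = set(case_variants("pages/home"))
print(len(variants))  # 9 letters -> 2**9 = 512 variants
```

Each of the 9 letters doubles the count, while the slash contributes nothing, which is why the total grows exponentially with file-name length rather than as length squared.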
Or do they have a fix for this? I did a bit of searching but couldn't find anything about this from Google or Microsoft.