Forum Moderators: Robert Charlton & goodroi
Is anyone experiencing a significant drop in pages indexed?
Or is this discussion purely about SERPs and not the quantity of indexed pages?
[edited by: tedster at 8:06 pm (utc) on Nov. 26, 2008]
Initially I'm 301 redirecting only a subset of the site's URLs, and implementing URL rewrites for the rest so that when someone requests an old URL the content of the new page is displayed but the URL in the browser stays the old one. Over time I will switch the rewrites to 301s folder by folder by changing all occurrences of [L] to [R=301,L] on the RewriteRules in the .htaccess files.
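A minimal .htaccess sketch of the switch described above (the folder names are hypothetical, not from the poster's site; the only change per folder is adding R=301 to the flags):

```apache
RewriteEngine On

# Phase 1: internal rewrite - the old URL stays in the browser,
# the server silently serves the new page and returns 200 OK
RewriteRule ^old-folder/(.*)\.html$ /new-folder/$1 [L]

# Phase 2: external redirect - same rule with the flag changed,
# so the browser (and Googlebot) is sent to the new URL with a 301
# RewriteRule ^old-folder/(.*)\.html$ /new-folder/$1 [R=301,L]
```

In practice only one of the two rules would be active for a given folder at a time; the switch is just uncommenting the second form and removing the first.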
Now I read all of this. LOL Getting a little nervous although I'm not sure I would have done much differently. I'll let you know in a few months how it went.
But there is one connection I feel pretty sure of. The number of sites is an estimated number, not a true count. Even more, it's a rather difficult estimate for Google to make accurately, because of the way that they "shard" or break up their data into many, many bits stored in many, many places.
When Google makes significant changes to their infrastructure, as they are apparently doing this month, the established methods for estimating can be interfered with and need to be revised. But that is a lower priority than first bringing the new infrastructure changes into production. Only then can the "estimated number" functions be tweaked to come back in line - that also includes the site: operator numbers, especially for larger sites.
I don't like the sound of the rewrite if that means that content can be accessed at both the old and the new URL, with both returning "200 OK" for all requests.
The purpose of the 301 redirect is to make sure Google finds the new URL. They will drop the old URL in their own time once they see that it redirects.
If you have gone extensionless, then the URLs should not end in a trailing slash, as that breaks the HTTP specification. URLs ending in a slash are "reserved" to be URLs that point to real folders.
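One common way to serve extensionless URLs without a trailing slash is an internal rewrite back to the underlying file. This is a hedged sketch, assuming the content lives in .html files alongside the .htaccess; the file layout is an assumption, not something stated in the thread:

```apache
RewriteEngine On

# Only rewrite if the request is not a real directory
RewriteCond %{REQUEST_FILENAME} !-d
# ...and a matching .html file actually exists
RewriteCond %{REQUEST_FILENAME}\.html -f
# Serve /about from about.html, no trailing slash involved
RewriteRule ^(.+)$ $1.html [L]
```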
Ted, this sounds as if you work at the Plex
Funny! I clearly don't work there, but I do watch the big G rather closely and over the years I've read a lot of patents (see the bottom of our Hot Topics area [webmasterworld.com]) and other technical documents, such as the Google File System [labs.google.com] paper.
With that very modest and partial information as a base, I then think I see certain hints in what goes on visibly with Google, just as you can look at certain websites and be pretty sure that they use WordPress, for instance.
I think we can put the 301 issue away now. Executed in a technically correct manner, a 301 redirect has purpose and value, and it really can help Google do a better job for your website. Just don't get casual about it. This is serious server technology we're talking about here, not a highly fault tolerant mark-up language such as HTML.
[edited by: tedster at 5:13 am (utc) on Nov. 27, 2008]
The 404 page will have a sitemap or quick links anyway, so the client isn't likely to lose any visits that do drift in by way of the old links.
Plus, even if they get a 404, the visitor will be coming to a fab new, modern site and not some old 1990s design that sends them heading straight to the back button.
[edited by: matWright at 7:06 am (utc) on Nov. 27, 2008]
...non-informative title tag. The title tag was for my contact page and it was simply "Contact" so Google took it upon themselves to change the title that appeared in the search results to a snippet of 58 characters from my Meta description.
Google has often changed titles or descriptions to include vocabulary contained in a query, but I've never heard of them doing it for a plain vanilla site:domain query.
A couple of reports of Google title changes that I found with site search...
Google dynamically changing my title, Wow!
[webmasterworld.com...]
Google changing Title in SERPs for certain keyword phrases
[webmasterworld.com...]
< continued here: [webmasterworld.com...] >
[edited by: tedster at 3:25 am (utc) on Dec. 3, 2008]