flicky - 5:32 pm on Jun 27, 2010 (gmt 0)
Thanks for responding...
I would call this overhaul "googlebot shock and awe". My site isn't crazy large, maybe 2,000 pages, but ALL of them were changed on the back end.
I basically stripped the pages down completely and created a CSS-only structure. The content is exactly the same as before.
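For anyone unfamiliar with that kind of conversion, here's a minimal sketch of what "stripping pages down to a CSS-only structure" typically looks like (the ids and rules below are illustrative, not from my actual site): presentational table markup replaced by semantic elements, with layout moved into a single cached stylesheet.

```html
<style>
  /* layout rules now live in one external, cacheable stylesheet */
  #sidebar { float: left; width: 200px; background: #eee; }
  #content { margin-left: 210px; }
</style>

<!-- Before: <table><tr><td width="200">...</td><td>...</td></tr></table> -->
<!-- After: semantic markup with presentation handled entirely by CSS -->
<div id="sidebar">nav links</div>
<div id="content">page content</div>
```

The payoff is much smaller HTML per page, since the repeated table scaffolding is gone and the CSS is downloaded once.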
My site is a good barometer for how google handles things like this, as it's always been a trusted source with a very strong backlink profile going back over 10 years. I'm top 10 for hundreds of very competitive keywords.
Everything remains the same as far as directory structure, titles, and meta tags.
My sense is that they temporarily remove a massively changed page from the index in case it was hacked. After a period of time, once they see the changes are sticking, they re-spider it.
At least I hope that's the process.
Google is on a rampage shoving "page speed" down everyone's throats, and for good reason: I completely agree with everything they're saying. But when a webmaster actually follows through on those suggestions, they shouldn't be penalized. All I've done is improve things for both my surfers and for googlebot's crawling of my site.
Anyone done something like this recently that can shed light on what I might expect in the coming days?