The old reports of the results of "shocking" Googlebot indicate a recovery time ranging from five or six days up to nine months, with the time depending on how often your pages are spidered, previous keyword and 'trust' rankings, inbound link strength and diversity, and many other factors.
I haven't done any "mass changes" in several years now, but I suspect that the longer recovery times have been reduced -- mainly because of a much smaller number of 'disaster reports' being posted here at WebmasterWorld.
How big of a shock was this? -- How many pages are we talking about here?
You mentioned that you did not change the HTML page headers. Did the page content change? Did the linking structure change at all?
What do you plan to do if recovery takes ten days? - a month? - several months? -- I don't normally recommend 'backing out' big changes but for the sake of discussion, how reversible are your changes? How separable are these changes -- e.g. the CSS vs. compression aspects?
Thanks for responding...
I would call this overhaul googlebot "shock and awe". My site isn't crazy large, maybe 2000 pages. But ALL of them were changed on the back-end.
I basically stripped the pages down completely and created a CSS-only structure. The content is exactly the same as before.
My site is a good barometer for how google handles things like this as it's always been a trusted source with a very strong backlink profile going back over 10 years. I'm top 10 for 100s of very competitive keywords.
Everything remains the same as far as directory structure, titles, meta.
My sense is that they temporarily remove a massively changed page from the index just in case it was hacked. After a period of time, when they see that the changes are sticking, they will re-spider.
At least I hope that's the process.
Google is on a rampage throwing "page speed" down everyone's throats - and for good reason - I completely agree with everything they are saying. However, when a webmaster actually follows through on these suggestions, they shouldn't be penalized. All I've done is improve things for both my surfers and googlebot's discovery of my site.
Anyone done something like this recently that can shed light on what I might expect in the coming days?
|Google is on a rampage throwing "page speed" down everyone's throats |
Yes, but page speed factors are more of an awareness campaign than an actual rankings shake-up, at least for now. The rankings impact of page speed is not at all major -- more like a final "tie breaker" when two URLs are about the same on other factors. Relevance and backlinks are still the keys.
It sounds like your site-wide change triggered an automated protection process, and I'd expect that filter to be cleared rather quickly since all your content is actually unchanged.
Yes, I agree that page speed isn't a major ranking factor "yet", but I was in desperate need of a code overhaul and this has resulted in a nice increase in "time on site" and "pages per visit" as well as lowering my bounce rate. As I hoped it would. Google just helped to remind me to get off my butt.
Tedster, your assumption is mine as well... when/if things come back to normal I will report here on the length of time it took so that others can form a strategy for "upgrading" their code in the future.
Ok, This Is Scaring Me... A LOT!
We've already been hammered by the Mayday Update. Sales are down 50%...
And now to be PCI compliant, we HAVE TO upgrade our site from the previous e-commerce software to the latest version.
We intend to keep the same exact on-page content, H1s, links, etc., but the old software is table based and the new one is CSS based.
Should I just give up on running an ecommerce site and instead just make a bunch of spammy made-for-adsense sites? They seem to be doing quite well nowadays...
"similar link" section converted from an unordered list into a regular sentence = -50 penalty.
The sentence version ("read more about keyword and keyword") is just as good for visitors, but Google is choking on the change.
Solution? None. Google looks pretty silly, but the site is fine for visitors.
Update... no change in bing serps since releasing this update... google still trashed.
Change made on Thursday June 24.
Update: One week later and none of my keywords have shown up. I'm still lost in googlespace.
Am I the only one that finds this absurd?
I take a bloated pre-2000 html coded site, strip it down, apply STRICT doctype and redo using pure CSS. Keep the content the same, roll it out and get removed by google?
Come on, Google! You want people to speed up their sites, but you penalize them for doing so?
I am sorry that I don't have any suggestions either. I am also worried because I might be in the same boat as you (need to change the markup since we have to get PCI compliant).
Do you have any traffic statistics that you can post here from before and after your markup update? What does the google-referred traffic look like for your top 10 pages before and after the change?
And can I ask how you are tracking your major keywords?
and are those keywords primarily one-, two-, or three or more words long?
In my experience, changing a site from bloated code to good CSS shouldn't result in even a hiccup... as long as everything else is the same.
I've seen sites, though, where developers... really good developers... have done things like leave a meta robots noindex tag on the development template after the site's gone live, thus completely blocking Googlebot. Mistakes do happen.
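For anyone checking for this kind of leftover, it shows up as a single line in the page's head section. A sketch of what a stray development-template tag looks like (the exact attribute values vary):

```html
<head>
  <title>Example page</title>
  <!-- a forgotten line like this keeps the whole page out of Google's index -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

If something like that is still on the live pages, removing it and waiting for a re-crawl is the fix.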
Can Google spider your site, and are you getting indexed? Have you done a search for an exact text string? Etc etc. At this point, it may be worth rechecking some very basic stuff.
Another option, of course, could be that Google has goofed up. To speculate about possibilities... maybe Google was getting gamed via the new instant index feature of Caffeine and implemented some sort of quick fix that's kept you out too long. I can't believe, though, that recoding with no content, filename, or navigation changes would be something that they'd want to discourage.
Robert... thanks for the reply...
I am still in there when doing simple queries like "keyword domain.com" ... I'm just nowhere to be found for "keyword" as I was always top 10.
The cache date in google for the pages is within the last few days... so it's still updating.
Googlebot is going nuts on my site as usual (albeit it took a couple days of vacation, as others have reported).
Webmaster tools is stuck on June 24th for both "site performance" and "crawl errors".
Analytics shows a massive drop-off a day after the site code was refreshed... much like you would see from a penalty... however, this was a bit gradual over 24 hours as opposed to instant. Probably as googlebot discovered my pages.
I keep thinking I'll just hold out and wait... I'm certainly not reverting back to garbage code... I've reduced my page size by 75% and the site is faster than any of my competitors... much faster. It uses strict standards, validates with few errors, and is pure CSS... the best you can get in my opinion!
No, nothing crazy like nofollow on the pages.
I hope you don't mind if I re-ask a question, since I think this can help out others as well:
"and are those keywords primarily one-, two-, or three or more words long?"
It might not be connected, but with everyone talking about "long-tail" changes in the algorithm, it could be relevant.
I just have to take a moment and add that I feel the same thing happened to me. I actually made improvements to the coding on pages and removed duplicative information sitewide. The result has been a loss of rankings. I'm almost certain there is some kind of "protective mechanism" that sandboxes a site that makes major changes.
The changes I made have mostly been to improve site speed and remove unnecessary links, which makes for a much more organized and speedier user experience. That said, I have been able to improve sales even as google traffic declined. I do feel that in the end I will get the rankings back, but it is a little frustrating when you make improvements straight out of the webmaster guidelines and lose rankings.
About a year ago I decided it was finally time for me to move from static html to CSS and using a database. I created what I thought was a really wonderful new hover navigation bar, way easier to find stuff. I was still messing around with it when I happened to read a thread on this forum where the OP had done precisely what I was doing -- taken a static html site, switched it to CSS and database-driven, and used a hover navigation menu -- and their site had dropped like a rock out of Google. One poster said something about it possibly depending on which side of the page the navigation was located, and that the right-hand side was less problematic. I don't know. I myself decided that static html was just fine.
The keywords that I track are mostly 1 and 2 words... Very competitive in my niche.
HRoth: interestingly enough, I moved my menu to the left side during this overhaul.
Over the years I've gathered a bunch of wisdom from the experts at this forum, and so a long time ago I came to the conclusion that to keep the Googlebeast happy, I should only make small incremental changes, not massive re-designs. I still live by that rule, and the OP in this thread convinces me yet again that it remains the way to go.
|I moved my menu to the left side during this overhaul. |
That might be an important clue.
Did the rearrangements on the visible page affect how the content, navigation etc. are ordered in your source code?
[edited by: buckworks at 4:46 am (utc) on Jul 2, 2010]
Seen this happen before thanks to all the CSS nuts pushing everyone to upgrade.
Bottom line, if it ain't broke, don't fix it.
Yes, the menu DID rearrange the structural order of my content.
As of today I still see no change.
This should scare off anyone wanting to make major "improvements" to their site. Just so you know... this site is VERY well known. Got a ton of traffic from google and has an extremely solid backlink profile.
Go down the list that google recommends... I did it all:
Content is exactly the same. I only got rid of the bloat... I mean doesn't this look prettier to googlebot?
<table cellpadding=4 cellspacing=4 border=0 style="blah blah blah"><tr><td><font face=arial size=2><b><i>luv this</i></b></font></td></tr></table>
became:
<span id="yes">luv this</span>
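(To be fair, the font/bold/italic styling doesn't vanish -- it just moves into the stylesheet. Something like this, where the class name is purely illustrative:

```html
<style>
  .luv { font-family: arial, sans-serif; font-weight: bold; font-style: italic; }
</style>
<span class="luv">luv this</span>
```

Same look for visitors, far less markup for googlebot to chew through.)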
I would predict that, in time, you will return to your previous positions. The problem is the waiting game, and the uncertainty that you must be experiencing ... and the financial hit can't be any fun either. During the MayDay slaughter one of my best sites dropped like a stone in a matter of 2 days. I went camping in a remote area of southern Idaho and when I re-emerged into civilization, I found that nearly one month to the day after The Fall, that same site had once again regained all the previous traffic levels. If a similar pattern holds for you, then you'll be back on top again in a matter of weeks. In the meantime, my suggestion is to not do anything else to upset The Beast!
I used to have the Month and Year in an <h2> tag on all the pages. Updating it previously did not affect rankings. This time I changed it to July, and all the specific product pages stopped ranking in google as soon as Googlebot re-crawled the update. Almost instantly -50% traffic. So there must be some kind of protective mechanism in place now.
I've done the same thing in the past and disappeared from the rankings. I pop the old html site back in and it shows back up where it was within a couple of days. Why this happens I have no clue, but I now only make small changes to any of my sites and monitor what happens to the rankings.
Stas - I've noticed that even the simplest and most harmless changes on our 14 year old site affect the rankings for pages. Like Lovejoy, I make very, VERY small changes and have been slowly rolling out css changes.
Bewenched and others that fear making anything but small changes...
Do we reach a point where we are changing our site for the search engine rather than the user? When we have ways to improve the user experience, but it has repercussions with the search engine, isn't there something not quite right about that?
|Yes, the menu DID rearrange the structural order of my content. |
I suspect that detail is significant.
How many navigation links does a spider now have to wade through before it would reach the unique core content of the page? How does that compare with the previous version?
Something has clearly changed the SEO balance here and if the menu is large, and coming earlier in the source code than it used to, it would be my prime suspect.
Some tweaks to your CSS positioning might be worth testing here, to keep the same visual appearance for users but present the menu later in the source code.
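To illustrate the idea -- with the names and widths purely illustrative, not taken from the actual site -- a float-based layout lets the core content come first in the source while the menu still renders on the left:

```html
<style>
  /* content is first in the source but displays on the right;
     the menu comes later in the source but displays on the left */
  #content { float: right; width: 75%; }
  #menu    { float: left;  width: 20%; }
</style>
<div id="content">Unique core content, first in the source...</div>
<div id="menu">Navigation links, later in the source...</div>
```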
|When we have ways to improve the user experience, but it has repercussions with the search engine, isn't there something not quite right about that? |
I'd say it is fundamentally wrong, but I also know that the best-designed site is doomed to failure without consistently adequate visitor levels. No one will be happier than me if/when the day comes that we have 3 or 4 approximately equal search engines, along with other dependable traffic sources. I hope that day is sooner rather than later, but today when I look out on the horizon, I mostly see the GoogleBeast roaming the landscape, so I tread lightly so as not to attract his swift & terrible wrath.
It's now been 2 weeks and everything is still gone.
I can understand a short period of removal while things get re-spidered, but how can this happen to an authority site that did nothing but clean up code and make things better for its surfers and for googlebot?
My site performance graph in webmaster tools is looking great... it's showing a MUCH faster site than before as I expected.
Please google, don't penalize a site for following YOUR suggestions for increasing PAGE SPEED!
Thank you very much.
I am not a software person. I created all my web pages with an ancient page editor (homepage 1997). I am STILL using it, and I don't really want to change. I know its every single weakness, and don't want or need any fancy stuff.
But I have become very concerned with falling behind. So I started a crash course of learning CSS. The learning curve is a bit steep, for a non-programmer like myself.
And now, THIS! Ha-ha. Ha.
My implementation is now going to be very much slower than planned.
If it ain't broke, don't fix it! Amen. I have "fixed" things before specifically for Google, and it turned out poorly. Sometimes "not trying too hard" can be a wise course of action where Google is involved. The "wise course of action" is to come here frequently and read like crazy. Yes, sir. That's the way to go.
I think buckworks is onto something here, and the question deserves an answer....
|How many navigation links does a spider now have to wade through before it would reach the unique core content of the page? How does that compare with the previous version? |
Also... apart from positioning... has there been a change in the total number of nav links on the page, particularly on your home page?