Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 63-message thread spans 3 pages (page 1 of 3).
For Those Hit Around September 4th. Recovery Story

 4:52 pm on Nov 5, 2013 (gmt 0)

One of my sites was hit on September 4th, I came here and poked around a little and noticed others had also been hit around the same time with something.

So I figured I would come back and post this just in case someone else was hit around the same time, and maybe this could help.

My sites are always pretty clean, so the penalty threw me off guard a little. Here are the before and after stats.



The only thing I did to recover was to start in the HTML Improvements area of Webmaster Tools and clean up the errors. The area I think caused the initial penalty, as well as the recovery, was the 404 errors. Once the 404 errors reached 0, the site recovered two days later.

This is just my opinion though, as everyone knows, with Google, it could be a million things.



 5:16 pm on Nov 5, 2013 (gmt 0)

Interesting... I've heard a few reports like this now.

I redesigned a site 2 years ago and it tanked, never recovered. It coincided with a panda date but I can't be sure it was that.

Anyway, it has 50,000 404 errors due to an old plugin. I recently 410'd them and have also started marking them as fixed in Webmaster Tools, 1,000 per day. It's slow, but they are going down.
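For anyone wanting to do the same, a rough sketch of the kind of rule involved, in Python for illustration (the plugin URL pattern here is invented; yours will differ):

```python
import re

# Example pattern for URLs the defunct plugin used to generate.
# This exact shape is an assumption for illustration only.
DEAD_PATTERN = re.compile(r"^/index\.php\?option=oldplugin&id=\d+$")

def status_for_missing(url):
    """Return 410 for URLs we know are permanently gone, else plain 404."""
    if DEAD_PATTERN.match(url):
        return 410  # Gone: the page was removed deliberately
    return 404      # Not Found: genuinely unknown URL
```

The point of 410 over 404 is to signal a deliberate removal, though as noted later in this thread, Google seems to treat the two much the same for de-indexing speed.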

Did you do anything about the 404's other than the above?

Congrats on the recovery.


 5:25 pm on Nov 5, 2013 (gmt 0)

The site is a forum, and I also had a misconfigured addon that caused a good percentage of the errors. That was fixed programmatically.

Others were real 404s, and there were a lot of them built up over time, which I went through and 301'd to new threads that were related and better than the deleted versions.


 5:32 pm on Nov 5, 2013 (gmt 0)

Can you clarify "fixed programmatically"? Forum here too. Thanks.


 6:24 pm on Nov 5, 2013 (gmt 0)

Many sites were hit on 04 September, including mine. Possibly that was the date for the Hummingbird update. I have many 404 errors for pages deleted from the database. In my experience, 404 and 410 are the same to Google, at least regarding the time it takes to remove them from the index.

Google, on their webmaster forum, says that 404 errors do not affect rankings unless those pages are linked to from within the site, but who knows.

Also, it is possible that there has been some sort of update and some scores have changed.

Anyone else recovered from 04 September?


 6:35 pm on Nov 5, 2013 (gmt 0)


What had happened was, the site was hacked. The attacker altered the part of the vBSEO file that controls how links are created and displayed. So a lot of my pages went 404, as I didn't notice the hack until three weeks after it happened.

Of course by then, Google had re-scanned the site, and there were new links to the same old pages and the old pages were now 404.

So for those, the old URL structure was brought back, and all the new URLs that had been created by the hack were 301'd back to their original place. By "programmatically" I mean this was done with a script, not by 301'ing them one by one.

The other half of the 404s were actual 404s; the site has some age to it and I let the 404 errors slide until there were quite a few. So for those, I just manually 301'd each to a related thread that was better than the one that was deleted or missing.
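To illustrate what "done with a script" can look like, a minimal sketch (the hacked and original URL shapes below are invented; vBSEO's actual rewrite rules differ):

```python
import re

# Invented example: the hack produced /t-<id>.html, while the original
# vBulletin/vBSEO structure was /threads/<id>-<slug>/.
HACKED = re.compile(r"^/t-(\d+)\.html$")

def redirect_for(url, slugs):
    """Return (301, original_url) for a hacked-era URL, else None.

    `slugs` maps thread id -> thread slug, e.g. pulled from the forum DB.
    """
    m = HACKED.match(url)
    if m:
        tid = int(m.group(1))
        if tid in slugs:
            return 301, "/threads/%d-%s/" % (tid, slugs[tid])
    return None
```

One rule covers every thread, which is the whole point of doing it programmatically rather than one redirect at a time.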


 7:36 pm on Nov 5, 2013 (gmt 0)

Google, on their webmaster forum, says that 404 errors do not affect rankings unless those pages are linked to from within the site, but who knows.

[Emphasis Added]
Of course by then, Google had re-scanned the site, and there were new links to the same old pages and the old pages were now 404.

Are you sure it was the 404s themselves or do you think it may have been "the exception" to the "rule" [links to 404 pages] you pointed out in a later post?


 7:49 pm on Nov 5, 2013 (gmt 0)

Honestly, not sure. Your guess would be as good as mine. All I know for sure is that after it tanked, I started fixing the 404 errors outlined in WMT. Two days after the 404 errors were cleared, the site recovered.

I can say, though, that the way vBulletin works with the mod vBSEO, there would be no links pointing to the 404s. In other words, they would all have been pointing to the new hacked URLs until I changed them back to what they were before.

Hope that helps/makes sense =)


 8:21 pm on Nov 5, 2013 (gmt 0)

Well, let's see -- It really boils down to two options:

1.) Google is lying when they say 404s don't hurt unless you have links to them, which would mean that somehow, some way, there is a benefit to Google in not simply letting webmasters know they should not have 404 errors on their site; otherwise there would be no reason for them to lie about the errors having an impact.

2.) You had legacy pages "go missing" for 3 weeks, possibly with (non-dynamic [in-post]) links to those pages, plus you lost all inbound link weight, trust, freshness and everything else that passes through links until you fixed the hack, the legacy pages were "found" by your server again, and Google respidered the site.

Personally, I'll go with Option 2, because 1 makes no sense to me. Plus, you lost every ranking factor that passes through inbound links; I have yet to see anyone *prove* the 404 errors were actually the issue; and I've taken a site with thousands of 404 errors reported in WMT from 1 page indexed and 0 traffic to 12,000+ pages ranking in a competitive niche without ever looking at a WMT report or fixing or "clearing" 404 warnings in WMT.


 8:56 pm on Nov 5, 2013 (gmt 0)

That makes sense; it could be 2. Yet let's think about it a sec.

Let's say a site shows 80,000 indexed pages, and the WMT 404 report shows 50,000 404 errors. Would it not make sense to add a penalty to that site even if they weren't linked? I mean, the user experience, according to the numbers, would be horrible. Maybe there is a certain ratio of WMT 404s to total pages indexed that would trigger a penalty? It would make sense. I have also been around long enough to know that Google does not always tell the truth, so there is that as well.

The timing also makes me wonder, as it was around the time Hummingbird came along. Could Hummingbird be taking into consideration more information from WMT than the algo before it?

Just tossing out ideas, not trying to tell you your assumptions are incorrect by the way, as I have no idea really.


 9:16 pm on Nov 5, 2013 (gmt 0)

Well I 301'd all 50k 404's. Will report back ;)


 9:19 pm on Nov 5, 2013 (gmt 0)


Yeah, but what to? That may be an important element, lol.


 9:25 pm on Nov 5, 2013 (gmt 0)

It was an issue whereby forum topics had query strings added to them, creating duplicates; I 404'd them originally. Now I have 301'd them to the individual topic's clean URL, so not to the root or anything like that.
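A sketch of that kind of rule, assuming the path without the query string is the clean canonical URL:

```python
from urllib.parse import urlsplit

def canonical_redirect(url):
    """301 a topic URL carrying a stray query string back to its bare path."""
    parts = urlsplit(url)
    if parts.query:
        return 301, parts.path  # redirect to the clean topic URL, not the root
    return None                 # already clean; serve the page normally
```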


 9:29 pm on Nov 5, 2013 (gmt 0)

Would it not make sense to add a penalty to that site even if they weren't linked? -- I mean, the user experience, according to the numbers, would be horrible. Maybe there is a certain ratio of WMT 404s to total pages indexed that would trigger a penalty?

They wouldn't need to "penalize", because the site would lose every bit of every ranking factor that passes through any inbound link to any of those pages, *and* from all of those pages to the other pages still on the site, since 404s don't "pass weight" anywhere.

The loss of ranking factors passed would "effectively behave like a penalty", but they don't need to "punish" the site with the 404s more than to simply not pass the ranking factors it had through inbound links and not show the 404 error pages from it in the index.

Here's another way to look at it: If I put up 500,000 links all to different pages that didn't exist on *your site* would that somehow cause a bad visitor experience for people who found a page from your site in Google and then visited it? Not at all.

My linking to 404 error pages on *your* site [or any site] would cause a bad visitor experience relating to *my* site, because I didn't maintain the links and make sure they ended up with what the visitor thought they were getting. If *my* site were removed from the rankings, that "bad visitor experience" wouldn't happen. Conversely, removing the available pages [200 OK] on your site from the rankings wouldn't solve that or any other problem related to the 404 errors on your site at all.

What would cause a bad visitor experience on *your site* is if the links to the 404 pages were on *your site* rather than mine. In that case, as Google, to make sure you send your visitors to a "good experience/destination" removing *your site* makes sense.

Which leads to [rhetorically]: Why would Google say 404 errors are not a problem unless there are links to those pages on your site?

Could Hummingbird be taking into consideration more information from WMT than the algo before it?

You're "going the wrong way" with the direction of information here -- The algo doesn't take information like 404 errors or pages containing links to your site and things along those lines *from* WMT; the algo provides the information *to* WMT, which is why WMT is often out of date in the info it shows webmasters about a site.


 9:42 pm on Nov 5, 2013 (gmt 0)

Well, what I meant to say was:

Could the new algo be taking into consideration more of the information that is shown through WMT? Not that WMT actually creates the information. Yet I see your point; it makes sense.


 9:42 pm on Nov 5, 2013 (gmt 0)


ok, that makes sense =)


 9:54 pm on Nov 5, 2013 (gmt 0)


I was hit by the 4th Sep update and lost 75% of my organic traffic, and I had the same recovery as you, BUT it only lasted for 7 days. Now I am back to a 75% loss.

So don't smile too fast!


 11:43 pm on Nov 5, 2013 (gmt 0)

The way I understood what Google says on 404 is:

1) If someone links to your site to a non-existing page (404 returned by your site), then this should not affect your ranking.

2) If within your site you link to a non-existing page on your site, then this could be a problem if it is done to a certain magnitude (certain = unknown) and could adversely affect your site's ranking.

Whilst both are bad user experience, the second one is in your control and if not fixed, it may indicate to Google a low technical quality of the site.

I have gone through quite a few site redevelopments where we let lots of old URLs return 404/410 instead of redirecting, and it has never had an adverse effect.

I haven't, however, tried linking internally to my own non-existing URLs that would return 404 to verify point 2. (The odd 404 that may temporarily exist on the site as a result of some kind of linking error, subsequently fixed, has never resulted in a ranking issue.)
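Point 2 is at least easy to audit. Given a crawl of your own site (a page inventory plus the internal links found on each page), the check might look something like:

```python
def broken_internal_links(pages, links):
    """Map each page to the internal links on it whose targets don't exist.

    `pages` is the set of URLs that actually resolve on the site;
    `links` maps each page URL to the internal URLs it links to.
    """
    broken = {}
    for page, outlinks in links.items():
        dead = [u for u in outlinks if u not in pages]
        if dead:
            broken[page] = dead
    return broken
```

How you build the inventory and link graph (crawler, sitemap, DB export) is up to you; the comparison itself is trivial.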


 1:53 pm on Nov 6, 2013 (gmt 0)


Great :/ I'll post back here if it tanks again.


 6:35 pm on Nov 10, 2013 (gmt 0)

I was also hit on September 4th with a 75% drop in traffic. The web site has never had a purchased link and never had any blackhat SEO.

On the day of the drop I had 240 crawl errors (I didn't think much of them, as they were legacy pages). I currently have 8,000, because I removed a lot of pages that I thought Google could be seeing as spam; in hindsight they were unique, original pages, but I'm sticking to what the web site does best now. These are returning the 410 Gone header.

I have tried everything since to try to get some traffic back, but it hasn't happened. I have already started to work through the HTML improvements in WMT and they are gradually decreasing. But the crawl errors are something I had overlooked. I have started working on them, beginning with 301 redirects from the legacy pages to the new URLs where an equivalent new page exists.

Maybe the original 240 crawl errors caused the penalty...

Thanks for the advice. I will report back also. Fingers crossed this is it!


 9:51 am on Nov 21, 2013 (gmt 0)

Just a quick update...

I've been religiously fixing crawl errors every day this week and have reduced them to zero (they started at over 8,000). I have not seen any change in traffic, but GWT crawl errors hasn't updated in the 2 days since I cleared the errors! The graph is showing 250 Not Found errors but not listing any to fix. The last update was on 19th November.

Will keep you updated...


 8:53 pm on Nov 21, 2013 (gmt 0)

Same here on data update dates. Stuck on 19th.


 6:22 pm on Nov 22, 2013 (gmt 0)

GWT crawl errors has updated for me today... another 41 errors which have been fixed where required and cleared (although the "Linked from" tab was empty for the majority of them). Maybe the traffic will start to increase. Will let you know.


 10:23 pm on Nov 22, 2013 (gmt 0)

Yep. Error updates are back. Will have to wait and see if it is up to date!


 11:59 am on Nov 28, 2013 (gmt 0)

Our site dropped by 90% - a manual action due to spammy links. We successfully recovered 100% of our traffic after a month (manual action revoked).


 12:11 pm on Nov 28, 2013 (gmt 0)

Well done, AlmostHuman. Do you mind sharing what you did to recover? Did you use the Disavow tool, and/or did you contact the sites linking to you? Anything else?

The spammy links you had - were these old historical links, new links, or collected proportionally over time?


 4:41 pm on Nov 28, 2013 (gmt 0)

One week on and I'm seeing no change at all.

Changes over the last week...

Crawl errors are down, and when they do show up they are either for pages that return the 410 Gone header and no longer have links to them anywhere on the site or the internet, or for URLs that are getting shortened and causing 404 errors (I don't know what to do with these; there is no linked-from data and the URLs don't exist anywhere on the site).

HTML improvements are down from around 400 to 150, and the 150 will be cleared as and when Google gets round to them.

Also developed a couple of scripts to check for thin content, and removed dynamic pages that were empty.
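A minimal sketch of what such a thin-content check might look like (the crude tag-stripping and the 50-word cutoff are arbitrary choices for illustration):

```python
import re

def is_thin(html, threshold=50):
    """Flag a page whose tag-stripped text has fewer than `threshold` words."""
    text = re.sub(r"<[^>]+>", " ", html)  # crude tag strip, fine for a rough audit
    return len(text.split()) < threshold
```

Run it over your templated page bodies (not the full page with nav and footer), or the shared chrome will inflate every count.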

Added the rel="next" and rel="prev" to paginated content to hopefully bring down the HTML improvements.

Nofollowed links in archived news pages which were linking to 404s on external web sites.

Added a couple of canonical tags.

I will update if I see any increase in traffic but I'm not expecting it.

[edited by: Robert_Charlton at 2:13 am (utc) on Nov 29, 2013]
[edit reason] removed specific, per forum Charter [/edit]


 1:29 pm on Nov 29, 2013 (gmt 0)

Google handed me 9,000+ 404 errors in my log, for pages that have either never existed or haven't existed in years. Each of the errors had been previously reported and cleared years ago, too. So why tell me about them yet again (no, they aren't linked from anywhere on my sites)? And penalize me? Now that wouldn't be cool.


 2:05 pm on Dec 5, 2013 (gmt 0)

Well, the new Smartphone errors tab in GWT has given me 2356 errors to fix (but they are the same errors that were appearing in the normal URL errors before). I'm now receiving around 15 error messages a day.

HTML improvements are down to 97. Most of these are the usual paginated content problems... the new rel next and prev tags don't seem to be making a difference to bring them down.

Still no real change in traffic, but I have been seeing an extra 1,000 to 2,000 page views a day, which is probably due to the new breadcrumb links I've recently implemented.


 10:21 pm on Jan 8, 2014 (gmt 0)

Down to 0 crawl errors and 23 HTML improvements. Still no change in traffic.

Have any of you guys had any luck?

© Webmaster World 1996-2014 all rights reserved