
Recovering from Traffic Drop - It's Not Panda, Penguin, PLA



11:31 am on Oct 23, 2012 (gmt 0)

So, we had a sudden traffic drop on 4-5 September and we're now at 50% of our regular traffic. I've been researching the causes of the issue and unfortunately I'm totally stuck, with no clear way to recover. I'd therefore appreciate your feedback and suggestions.


1. I removed a large number of unwanted tags from our site in August. All these tags resulted in 404 errors.

2. Sometime around the beginning of September, the Disqus plugin on our site started generating invalid URLs. Our 404 error count, as reported by Google Webmaster Tools, rapidly climbed from 1k to 17k to 44k to 70k to 99k.

3. On 5 September, GWT reported a sudden rise in 'Page Not Found' errors. And on 4 September we had begun noticing that our traffic wasn't what it should be.

I concluded that the large number of errors was causing Google to drop our site in the rankings, and fixed those errors around October 15. Almost all of the URLs causing 404 errors have now been fixed.

Google continues to say that 404 errors don't affect website rankings, and I've made sure that nothing on our website should really be affected by the Panda, Penguin or PLA algorithms (there was no algorithm update on 4-5 September). Yet low-value pages now outrank us for several keywords where we held #1 for months (years, in some cases). We only followed basic SEO rules; we never over-optimised or did any black-hat SEO.

Since 15 October (after fixing the errors), I've noticed that GWT drops about 1,000 errors/day (which I mark as fixed), and I've seen a slight positive trend in search queries.

My Question

GWT still shows 88k 'Page Not Found' errors (I've already fixed all of them) and continues to drop them from the reports at a rate of about 1,000 per day. But if it's not the large number of errors causing the drop, what steps should I take to recover from the traffic loss? I need to be sure about my recovery strategy. I'd welcome your suggestions.


6:36 pm on Dec 16, 2012 (gmt 0)

@tedster: Thank you for your response. As I said, the broken links on our website were discovered by Googlebot because of a JavaScript error in the Disqus plugin (on WordPress). The URLs were in the following format:

Correct URL: mydomain.com/correct-URL-1-ends-here/

Phantom URLs created by the JS error (which Googlebot discovered) -


Each correct URL got about 3-8 phantom URLs, all of which resulted in a 404 error. I fixed the problem by redirecting all these phantom URLs to the correct URL. Note that the correct URL always had a 'canonical' tag on it.
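For illustration, the phantom-to-canonical mapping can be sketched like this. The URL pattern, function name, and the assumption that the junk segment sits after the post's trailing slash are all mine, not the poster's actual rules:

```python
import re

# Hypothetical sketch: the Disqus JS bug appended a junk segment after
# the trailing slash of each post URL, producing several phantom paths
# per article. This maps a phantom path back to its canonical form so
# the server can issue a 301 redirect to it.
PHANTOM_SUFFIX = re.compile(r"^(/[^?#]*?/)[^/]+$")

def canonical_path(path: str) -> str:
    """Return the canonical URL path for a phantom Disqus path,
    or the path unchanged if it already ends with a slash."""
    match = PHANTOM_SUFFIX.match(path)
    return match.group(1) if match else path
```

A path that already matches the canonical form passes through untouched, so the same function can safely sit in front of every request.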

My belief is that because Googlebot discovered a ton of such URLs in a short period, it dropped our crawl rate and also lowered our SERPs. Our error count is now down from 99k to 16k. I'm hoping that once it goes below 1k, Google will regain its trust in our website and restore our rankings.

Many experts on the GWT forums, however, say that such errors won't affect your rankings!


6:43 pm on Dec 16, 2012 (gmt 0)

I can attest to the Disqus issue. It happened to my site big time. I've removed it and am now using CommentLuv Premium.


6:46 pm on Dec 16, 2012 (gmt 0)

@Frost_Angel: Did you lose rankings on another site because of this Disqus issue, or is it the same site you talked about earlier?


7:06 pm on Dec 16, 2012 (gmt 0)

Same site. I meant to mention that the Disqus errors were among the crawl errors we were contending with too. It took us a while to figure out it was Disqus. They have all been dealt with now. I removed Disqus because even after applying their latest update the errors began happening again, so it's gone. NO more. Not dealing with it.

The other errors are all from a script I installed. The script was removed OVER A YEAR AGO, yet Google continues to find those pages and count them as errors. About two months ago I had 168,000+ errors due to this long-removed script; after continually marking all crawl errors as fixed on a daily basis for the last three weeks, I have crawl errors down to under 42,000. Hoping it continues to go down.


6:42 am on Dec 19, 2012 (gmt 0)


The more I read, the more I doubt my own decisions. I fixed the broken-links problem by redirecting them to the correct URLs. Essentially, I now have:

1. The correct, canonical-marked URL.
2. Several phantom URLs that first returned 404 and have now been redirected to the correct URL. So on average ~4 URLs point to the same article.

This has helped bring the error count down, but I'm wondering if this *was* the right step in the first place. Had I left the broken URLs alone, I wonder whether Google would have continued to think that our site offers a bad user experience.
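The 'fix' described above amounts to answering each phantom URL with a 301 pointing at its canonical counterpart. A minimal WSGI-style sketch, where the mapping table is a hypothetical stand-in for however the site actually derives the canonical target:

```python
# Hypothetical phantom-path -> canonical-article mapping; the real site
# would compute this from the Disqus bug's URL pattern instead of a table.
REDIRECTS = {
    "/correct-URL-1-ends-here/phantom-a": "/correct-URL-1-ends-here/",
    "/correct-URL-1-ends-here/phantom-b": "/correct-URL-1-ends-here/",
}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    target = REDIRECTS.get(path)
    if target is not None:
        # 301 tells crawlers the phantom URL has permanently moved,
        # consolidating its signals onto the canonical page.
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found."]
```

The same effect is usually achieved with rewrite rules at the web-server level rather than in application code; the sketch just makes the status-code choice explicit.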


Doomed if I do. Doomed if I don't!

Is there a way out?


7:07 am on Dec 19, 2012 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member

TheBigK, I think it just takes some time for Google to re-crawl those pages.

We had an issue with our server a few weeks back... a DDoS attack... joy... anyhow, it made our server inaccessible for about 2 hours. Google WMT errors went through the roof.

For about a week, traffic to our site was WAY off... but it started slowly climbing back.

It just takes time.


7:35 am on Dec 19, 2012 (gmt 0)

@Bewenched: Are you sure the traffic drop was in sync with the rise in crawl errors (Page Not Found)?

Our traffic has been down for ~3 months now, and I wish I knew whether I'm on the right path. I 'fixed' those phantom broken URLs by redirecting them, because that was the only 'logical' way to tell Google that the site doesn't really have so many errors.

Now we're down from 99k errors to 13k. I'm wondering how long we'll have to wait for Google to 'understand' that the errors don't exist!


3:19 pm on Dec 19, 2012 (gmt 0)


Still watching my errors. I'm down now from 168,000 to 28,000. Still no change in traffic. This was a site that was getting 150,000 uniques a month; now only about 10,000. I've been dealing with this for 18 months.
Hopefully your situation turns out better.

My errors are all 404s and 410s, as they should be, since those pages haven't existed for over a year.
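Serving 410 rather than 404 for deliberately removed pages tells crawlers the URL is gone for good rather than temporarily broken. A minimal WSGI-style sketch of that distinction, where `REMOVED_PATHS` is a hypothetical stand-in for the long-removed script's URLs:

```python
# Hypothetical list of deliberately removed URLs; everything else that
# doesn't resolve falls through to a plain 404.
REMOVED_PATHS = {"/old-script-page-1/", "/old-script-page-2/"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in REMOVED_PATHS:
        # 410 Gone: the resource existed and was removed on purpose.
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been permanently removed."]
    # 404 Not Found: the server simply has nothing at this path.
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found."]
```

In practice the same split is often expressed as web-server rules (e.g. a `Gone` directive for the removed paths) rather than application code.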


3:30 pm on Dec 19, 2012 (gmt 0)

18 months is too long. Maybe there's something else you need to fix?


3:51 pm on Dec 19, 2012 (gmt 0)

I only started working on the crawl errors specifically in October. I have worked on Panda overall for 18 months.


3:29 pm on Dec 30, 2012 (gmt 0)


1. I've managed to get the error count down to zero. The phantom URLs that resulted in 404s are all gone from the GWT error logs. A few valid 404s remain (from the low-quality pages I manually removed).

2. Site is completely healthy - there are no other errors on the site.

I saw a spike in the crawl rate on the day the error count hit 'zero', but it looks like it's come down again (it might go up over the coming days).

I'm wondering: if the large number of broken links was indeed the problem on our site, how long will it be before we can expect Google to restore our rankings?

The general question is: how long does it normally take Google to restore rankings once you've fixed the errors (and Google's acknowledged it through GWT)?


4:10 pm on Jan 8, 2013 (gmt 0)


Have you seen any recovery? I've got my errors down from 168,000+ to 12 as of yesterday.
Just wondering if you've seen any upward movement, or found any proof that crawl errors are associated with ranking drops.
