Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
How long should I wait - before I get thumbs up from Google?
TheBigK - msg:4535604 - 4:22 am on Jan 13, 2013 (gmt 0)

Frankly speaking, I didn't like the title I just wrote for this post. But our years of dependence on Google began 'paying off' in September, when we lost our traffic. Starting 4-5 September last year, traffic to our forums dropped ~50%. I found out that, starting 28 August, our website was generating a TON of internal broken links because of a JavaScript bug in the WordPress Disqus plugin. Our site's got ~400k listed URLs, and the 404 "Page Not Found" count rose to 99k over a period of 20 days (by the time we discovered the root cause of the problem). I found that Googlebot dropped the crawl rate of our website by about 90% at the same time.

I fixed the problem immediately, and all the error URLs that Googlebot found through the JavaScript bug were 301 redirected to their correct URLs (which have the canonical tag properly set up). Subsequently, GWT dropped the error count to zero by the end of December, and in the first week of January we found that Googlebot had started crawling our website at an increased frequency. Traffic improved by about 20%, but it's still far away from where it should be.
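A fix like the one described above is usually done with a rewrite rule. A minimal .htaccess sketch follows; the URL pattern is purely illustrative (the actual malformed URLs produced by the plugin bug aren't shown in this thread), so treat it as an assumption about their shape:

```apache
# Hypothetical sketch: 301 a malformed, plugin-generated URL back to
# its canonical parent. The "/javascript:..." suffix is an invented
# example of a bad path, not the site's real pattern.
RewriteEngine On
RewriteRule ^(threads/[^/]+)/javascript:.*$ /$1/ [R=301,L]
```

Combined with a correct rel=canonical tag on the target page, this consolidates the broken URLs onto the real ones instead of letting them 404.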

Google clearly says that 404s do not affect rankings or crawl rate, but we noticed the opposite. I've checked, re-checked & double-checked whether there's anything 'out of place' on the site. I found nothing. Second, we never participated in anything blackhat.

The question is:

How long should I wait before Google might just restore our rankings? I see a ton of crappy websites outranking us for our original content. I'm assuming that it was the large number of internal broken links that caused our rankings to go down. Any opinions, views, feedback would be greatly appreciated.

 

raymondcc - msg:4535625 - 7:11 am on Jan 13, 2013 (gmt 0)

I used to have an automated translation plugin for WordPress and vBulletin, which was later removed. Obviously that quickly caused over a hundred thousand 404 errors in GWT, and shortly after, the traffic started dropping like crazy.

I can't really remember exactly how long I waited, but it was more than 6 months for the traffic to stop dropping. The recovery wasn't much either, because my site was also hit by Panda.

When Google said 404s do not affect rankings, they probably meant a few 404s, not thousands of them in a short duration.

What I did to fix the 404s in GWT was to block Googlebot from visiting the invalid pages using robots.txt. If the invalid pages were found in the SERPs, I used "Remove URLs" in GWT to remove them. The number of 404s in GWT started dropping, and now it's around 5,000+, which is so much better than 100,000+.
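The blocking step described above looks roughly like this in robots.txt; the path is an invented example, since the actual invalid URL pattern isn't given here (and note the caveat raised later in the thread: blocked URLs can still be indexed with a "blocked by robots.txt" style snippet):

```
# Hypothetical robots.txt fragment: keep Googlebot off the invalid
# URL pattern left behind by the removed translation plugin.
User-agent: Googlebot
Disallow: /translate/
```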

TheBigK - msg:4535628 - 7:21 am on Jan 13, 2013 (gmt 0)

Thanks a lot for your reply, @raymondcc. We're already in the fourth month since the errors were fixed. Google's dropped the 404 errors from GWT and the site's clean. In our case, the traffic dropped overnight, as the error count shot up from 7k to 17k and then to ~36k.

I must mention that we were never hit by Panda or Penguin. So, would you still suggest a time-frame we should be looking at? Or should I keep looking for other possible sources of the problem?

How many weeks are usually 'several weeks' in Google's calendar?

Sgt_Kickaxe - msg:4535641 - 10:12 am on Jan 13, 2013 (gmt 0)

My site was never hit by Panda or Penguin either, but I was lax in how I handled image hotlinkers and content scrapers, so I did see an increase in links to my site using broken URLs that, unfortunately, resolved. Eventually I noticed that some pages were being outranked by these scrapers, so I took DRASTIC action.

- I turned off my CMS function that was allowing badly formed URLs to redirect, and then, using htaccess, I made sure that if you didn't visit an EXACT URL you would get a 404 on purpose. Overnight I created 20,000+ 404s, which signified that I had a bigger scraper problem than I anticipated. ALL manner of broken URLs started showing up in GWT, and traffic fell 50% immediately.
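The general shape of an "exact URL or 404" rule in htaccess is something like the sketch below. This is illustrative only: the real rules would have to whitelist the CMS's own front controller and valid rewrite patterns, which depend entirely on the CMS in question.

```apache
# Rough sketch: if a request doesn't map to a real file, a real
# directory, or a known-good CMS route, serve a hard 404 instead of
# letting the CMS "guess" and redirect. The /index.php exception is
# an assumed front-controller path.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} !^/index\.php
RewriteRule ^ - [R=404,L]
```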

The thing is, my ranked pages never faltered; they held their rank, the number of indexed pages hasn't changed, nor have the pages which were indexed. How I lost 50% of traffic from Google without my rankings changing is still beyond me.

Recovery - I'm still down 40% two months later, meaning 10% has returned. I don't care either; I rather like knowing that my site has zero low-quality pages or URLs that resolve or redirect which I don't want. It's super clean now.

What I did to fix the 404s from the GWT is to block Googlebot from visiting the invalid pages using robots.txt

I don't recommend this. Google will now index all of the URLs you've blocked and will add a "due to robots.txt we can't tell you what's here" type description. If the page is gone and you are returning a 404 error, that's 100% what you want to be doing. No redirects, no blocking, just 404s. Perfect.

TheBigK - msg:4535642 - 10:19 am on Jan 13, 2013 (gmt 0)

One thing is sure - a large number of broken links affects rankings, no matter what the Google folks say.

My next move is going to be making the pages that used to get traffic (and are now down in the SERPs) more content-rich, and seeing if they get their rankings back.

raymondcc - msg:4535644 - 10:27 am on Jan 13, 2013 (gmt 0)

I don't recommend this, Google will now index all of the urls you've blocked and will add a "due to robots.txt we can't tell you what's here" type description.


That's when I used "Remove URLs" in GWT to remove them from the SERPs.

netmeg - msg:4535658 - 1:37 pm on Jan 13, 2013 (gmt 0)

The problem is, your bad URLs may only be part of the problem. If everything has been fixed for four months and you aren't seeing any improvements, I'd be looking around for other possibilities. When you have a situation like that, it's easy to think it's the entire reason for loss of traffic, when it may not be.

TheBigK - msg:4535660 - 1:43 pm on Jan 13, 2013 (gmt 0)

@netmeg: It looks like that is the reason, because I've been looking for other reasons and haven't been able to find any. It's only been a few days since the error count touched zero. Maybe I should wait a few days more.

Awarn - msg:4535680 - 3:28 pm on Jan 13, 2013 (gmt 0)

I see this in a different way. I see Google treating us as if we fell down a flight of stairs. They allow you to come back up, but only a couple of stairs at a time, and then you have to wait and rest. Then you can go up a couple more. I bet your graph looks that way too. Your 'pages not selected' graph is probably still fluctuating and gradually getting back to where it was months ago.

TheBigK - msg:4535683 - 3:40 pm on Jan 13, 2013 (gmt 0)

The 'Not Selected' pages in our case 'dropped' significantly, which I believe is a good sign. I think I'll just wait and watch.

backdraft7 - msg:4535694 - 3:52 pm on Jan 13, 2013 (gmt 0)

They allow you to come back up but only a couple stairs at a time and then you have wait and rest. Then you can go up a couple more.


Unless of course you have a big pile of VC $$$! Then G will THROW you UP the stairs!

TheBigK - msg:4535699 - 4:10 pm on Jan 13, 2013 (gmt 0)

One thing that bothers me is - why do Google (at least the Googlers who are active on the forums) say that broken links and 404s don't affect rankings and crawling? There is *plenty* of evidence showing that they are related! I've had several webmasters tell me that they noticed the correlation between broken links and reduced crawling & traffic.

I think I'll have to wait at least 2 more months before I can expect some bounce in the traffic.

Robert Charlton - msg:4535789 - 1:22 am on Jan 14, 2013 (gmt 0)

Why do Google (at least the Googlers who are active on the forums) say that broken links, 404s don't affect rankings and crawling?

I wouldn't assume that this is exactly what Google is saying. If you're somehow creating the broken links and other 404s on your site, Google isn't going to like that, because you're creating a situation that wastes Google's resources and may ultimately reduce the number of useful pages on your site that can get spidered.

Analogous, but not exactly the same: soft 404s (error pages which should be returning 404s but are in fact returning 200s) can hurt you. That's potentially an infinite number of URLs returning "200 OK" responses. I think a large number of canonical errors (returning 200s) can also eventually hurt you.
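A quick way to reason about the soft-404 problem: request a URL you know cannot exist and look at the status code the server returns. A minimal sketch (the function name and categories are mine, purely for illustration):

```python
def classify_missing_page(status_code: int) -> str:
    """Classify the response status for a URL that is KNOWN not to exist.

    A 200 here is a "soft 404": an error page served with a success
    status, which search engines treat as a real (thin) page.
    """
    if status_code in (404, 410):
        return "correct: page reported as gone"
    if 300 <= status_code < 400:
        return "redirect: check where it leads"
    if status_code == 200:
        return "soft 404: error page served with 200 OK"
    return "other: status %d" % status_code
```

In practice you would fetch something like `/this-page-cannot-exist-12345` on your own site and run the status through a check like this; handling timeouts and connection errors is left out of the sketch.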

What Google is saying you shouldn't worry about are 404s that are not related to you... e.g., from fragmentary URLs that Google sees on other domains and is crawling. Google reports those in WMT as "errors" so you know about them and can check whether there's anything broken on your site.

For the above reasons, you should be returning 404s (or 410s) when called for... and if something in your site setup is somehow generating a great many 404s, you need to look at that. That doesn't necessarily generalize, though, to 404s from all sources hurting you, or even to 404s from a lot of URLs you might naturally drop (as with classified ads) hurting you. (I think you can figure that if Google was crawling the pages in the first place, it will crawl the 404ed URLs as well.)

If you are able to implement 410s rather than 404s, return 410s for urls that you know are going to stay out of the index. Take a look at this discussion...

Best way to tell Googlebot a page doesn't exist anymore
http://www.webmasterworld.com/google/4261092.htm [webmasterworld.com]
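Returning a 410 for known-gone URLs is straightforward with Apache's mod_alias; the paths below are invented examples, not anything from this thread:

```apache
# Hypothetical: mark URLs you know are permanently gone as 410 Gone
# rather than 404. A single path:
Redirect gone /old-section/expired-page.html
# ...or a whole pattern of retired URLs:
RedirectMatch gone ^/old-section/.*$
```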

TheMadScientist - msg:4535816 - 4:11 am on Jan 14, 2013 (gmt 0)

What Robert said (and I think I've said something to this effect previously in one of your threads) ... In this one, my advice is:

Leave It Alone!
Google's Rank Modifying Patent for Spam Detection [webmasterworld.com]

Don't try to fix anything that's not broken. Go back to managing your site and focusing on improving it for visitors from where you are now. Do Not make changes for Google or based on rankings ... Go focus on your visitors, their satisfaction, social interaction and things along those lines, but don't F*ck with anything you've fixed or anything you don't see as broken to try and influence rankings ... Let rankings 'happen' like you likely did initially.

While you're doing that, forget about Google and focus on visitors, because if you start 'reacting to Google' after you've fixed things, their algo will just re-tank you. So if you know everything's fixed, forget about Google and rankings, and focus on current visitors and on finding new ones from other sources.

[edited by: TheMadScientist at 4:31 am (utc) on Jan 14, 2013]

TheBigK - msg:4535817 - 4:22 am on Jan 14, 2013 (gmt 0)

@Robert: I understand what you're saying. I explained the situation to them through the GWT forums, and they say that the broken links won't hurt our rankings at all. But there's no point in debating it, I think. It's Google and they're the BOSS.

That said, I'm going ahead with no more fixes. I'll just observe what happens to the site & the visitors while continuing to build the site the way we did.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved