Forum Moderators: Robert Charlton & goodroi


Site went through many changes - how to regain Google trust?


apauto

7:40 pm on Mar 27, 2011 (gmt 0)

10+ Year Member



Hi guys,

We have a site that has gone through a lot of changes in the past 6 months. Due to this, WMT has a ton of 404s.

I've been letting the site "marinate" so Google returns the site to its good performance in the SERPs, but when I do a site: search, the new, good content pages don't show until deep in the results; all that shows are old pages that should be 404'd.

I was thinking of blocking the 404'd pages in robots.txt. Suggestions?

Has anyone regained Google trust on a site that went through a lot of changes like this, and then ranked low for long tail? If so, what did you do?

I like the domain name, and want to build it based on the new idea I have, but need to get Google to trust me again.

I also tried a reinclusion request a few days ago.

Thanks

tedster

8:24 pm on Mar 27, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The first thing is to make 100% sure that you are not linking to those 404 URLs yourself - not on the site and not in the sitemap. The next thing is to consider if 301 redirects are appropriate in each case. And if you are sure a URL should be gone, serve a 410 response to speed things along.
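For anyone who wants that decision spelled out, here's a minimal sketch in Python of the 301-vs-410 logic tedster describes. The URL lists are made-up examples, not from any real site:

```python
# Old URLs that have a real replacement get a 301 to the new location.
# (These mappings are hypothetical examples.)
REDIRECTS = {
    "/old-widgets.html": "/widgets/",
    "/2010/sale.html": "/specials/",
}

# Old URLs that are gone for good get a 410 (Gone), which tells
# Google more decisively than a 404 that the page won't return.
GONE = {
    "/temp-landing.html",
    "/duplicate-category.html",
}

def respond(path):
    """Return an (HTTP status, Location header or None) pair for a request."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    if path in GONE:
        return 410, None
    return 200, None
```

In practice the same mapping would usually live in server configuration rather than application code (for example, Apache's mod_rewrite has a [G] flag that serves a 410).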

I would not disallow the 404 URLs in robots.txt - that's like saying "these URLs exist but don't request them." Instead, nail down any and all technical and canonical issues so they are totally airtight, and then wait. This thread may be a big help on things to look out for: Site Relaunch Checklist [webmasterworld.com]

I recently took a large site through exactly this process (it had three times as many "kill pages" as "keep pages") and within two weeks its search traffic had increased beyond what it was before the relaunch.

walkman

9:07 pm on Mar 27, 2011 (gmt 0)



You can also ping Google blog search with the deleted pages, so they come and see that the pages don't exist. And submit a sitemap with a list of the 404 pages.

tedster

9:17 pm on Mar 27, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



walkman - have you actually done that with a good result? I mean, submitted a Sitemap with your 404 URLs and seen it help removal? I ask because I always felt that a perfectly "clean" XML sitemap was the best way to go (all the 200 OK URLs, with no redirects, 404s, or canonical issues).

walkman

9:24 pm on Mar 27, 2011 (gmt 0)



Tedster, it works like magic. Eventually I remove the 404 sitemaps and leave the good ones. The idea is that you WANT Google to visit /deleted-page.html and see that it's gone, even if it takes a ping or a sitemap. After a few visits they realize that it's really gone; about two weeks for me, using a mixture of noindex and 404. I even use it for pages removed via Webmaster Central.
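A minimal sketch of that temporary "deleted pages" sitemap: list the removed URLs so Googlebot revisits them and sees the 404/410 for itself. The domain and paths below are made up for illustration:

```python
from xml.etree import ElementTree as ET

def deleted_pages_sitemap(urls):
    """Build sitemap XML listing the given (now-removed) URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        # Each removed URL still gets a standard <url><loc> entry.
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

xml = deleted_pages_sitemap([
    "https://www.example.com/deleted-page.html",
    "https://www.example.com/old-section/gone.html",
])
```

Submit the file in Webmaster Tools alongside the regular sitemap, then (as walkman says) remove it once the URLs drop out of the index.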

crobb305

10:27 pm on Mar 27, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I suppose you have too many URLs to 410, so that may be impractical. Like Ted, I tend toward a perfect Sitemap.xml, but my sites are small enough to submit individual 410s to the removal tool. When I remove a page, I do a code search across my whole site to ensure NOTHING links to the dead URL. I believe this is important, because I have seen Google fetch 404s for a year or longer as long as there are inbound links.
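That whole-site code search can be scripted. Here's a rough sketch that scans a local copy of the site's source files for any remaining reference to a dead URL; the directory layout, file extensions, and URL are hypothetical:

```python
import os

def find_references(root_dir, dead_url, exts=(".html", ".php", ".xml")):
    """Return paths of files under root_dir that still mention dead_url."""
    hits = []
    for dirpath, _dirs, files in os.walk(root_dir):
        for name in files:
            if not name.endswith(exts):
                continue
            path = os.path.join(dirpath, name)
            # Read leniently: template files may use odd encodings.
            with open(path, encoding="utf-8", errors="ignore") as f:
                if dead_url in f.read():
                    hits.append(path)
    return hits
```

Run it before removing a page; an empty result means nothing on the site still links to the URL you are about to kill.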