Forum Moderators: Robert Charlton & goodroi
Hello,
This is my first post. I've been reading for a while, and it's time to jump in. Looking forward to contributing!
I've never seen this before. Perhaps for a few hours or a day, but it has been this way since the beginning of May. (Yes, I know I'm paranoid! Google made me so!)
I made a stack of content changes to the site during the last month, mainly new and fresh content, and I changed a few meta description tags.
Several weeks ago I added a Sitemap: directive to my robots.txt and used the URL removal tool to eliminate several outdated, blocked internal URLs. The removal tool accepted the requests, but now the Not Found and/or URLs Restricted by robots.txt web crawl errors are frozen in time (dating as far back as April 11).
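For anyone unfamiliar with the directive, the Sitemap: line in robots.txt takes a full absolute URL and sits alongside the usual crawl rules. A minimal sketch (the domain and paths here are placeholders, not my actual site):

```
# robots.txt - placeholder example, not the real site
User-agent: *
Disallow: /old-section/

# Sitemap takes an absolute URL and can appear anywhere in the file
Sitemap: https://www.example.com/sitemap.xml
```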
Before I made these changes, Google was updating the Web Crawl errors every week or so. My homepage is being cached about every 3 days, and my robots.txt returns a 200 every day. I've also noticed that the Query stats haven't changed in this time, while my Sitemap is being downloaded normally. Does anyone have ideas on how to correct this, or any ideas in general?
And finally, could this cause my site to become 'frozen' or stuck in the SERPs?
[edited by: tedster at 4:36 pm (utc) on May 8, 2007]