Forum Moderators: Robert Charlton & goodroi


Index page and Googlebot missing

Sudden disappearance from Google


SEOQuestions

7:15 pm on Dec 19, 2007 (gmt 0)

10+ Year Member



I run a site that has been in existence since 2002. For the last 5 years it has ranked in the top 2-3 spots for all keywords. There is a large, active forum associated with the site. Suddenly in December the site is gone from Google.

A few things that might (?) help figure out what is going on:

If I google the site name without the .com part, I rank third (behind even Alexa's site info).

If I google the site name with the .com, it pulls up a couple of random internal pages - not the index page.

I notice that the Googlebot went from around 20,000 hits in Oct. and Nov. to 9 in December.

I haven't made any significant changes to the site. The forum is active, so there is a lot of new info there, and I make minor text/photo modifications, but that is it.

I have never used any questionable SEO techniques. I don't buy links, etc. I use CSS formatting on my home page but not the internal pages.

Any thoughts? Is there some way to make an appeal to Google?

Thanks everyone!

[edited by: tedster at 7:17 pm (utc) on Dec. 19, 2007]

tedster

7:57 pm on Dec 19, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The sudden change to a low level of googlebot spidering is a definite worry. This sure has the feel of a penalty. Is this the same site you asked about in September [webmasterworld.com]? If so, and if it was hacked, then you may have some bad-news links inserted somewhere that you are not aware of.

Do you have a Webmaster Tools account? You can often pick up clues in there - and the "Reconsideration request" is also available from the Dashboard page in that account.

SEOQuestions

8:14 pm on Dec 19, 2007 (gmt 0)

10+ Year Member



I just signed up for the Google Dashboard and did a web crawl and got the following message for most of my pages including my index page:

robots.txt not reachable

When I clicked on the explanation I got this:

Before we crawled the pages of your site, we tried to check your robots.txt file to ensure we didn't crawl any pages that you had roboted out. However, your robots.txt file was unreachable. To make sure we didn't crawl any pages listed in that file, we postponed our crawl. When this happens, we return to your site later and crawl it once we can reach your robots.txt file. Note that this is different from a 404 response when looking for a robots.txt file. If we receive a 404, we assume that a robots.txt file does not exist and we continue the crawl.
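The distinction Google is drawing in that message - a 404 means "no robots.txt, crawl freely," while any other failure means "postpone the crawl" - can be sketched roughly like this (a simplified model of the described behavior, not Google's actual code):

```python
def crawl_decision(status):
    """Rough model of the robots.txt handling described above:
    200 -> obey the rules in the file,
    404 -> assume no robots.txt exists and crawl everything,
    anything else (timeout, 5xx) -> postpone the crawl and retry later."""
    if status == 200:
        return "crawl per robots.txt rules"
    if status == 404:
        return "crawl everything (no robots.txt assumed)"
    return "postpone crawl until robots.txt is reachable"

print(crawl_decision(404))  # crawl everything (no robots.txt assumed)
print(crawl_decision(503))  # postpone crawl until robots.txt is reachable
```

So a server that times out or returns a 5xx error on /robots.txt can silently halt crawling of the whole site, even though no robots.txt file exists at all.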

I never uploaded a robots.txt file on my site. Where should I look to see if one exists? (I checked the first level).

Thanks so much for your help!

tedster

8:38 pm on Dec 19, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A robots.txt file would be in the domain root. Just type www.example.com/robots.txt into your browser - if there is such a file, your browser will display it. Alternatively, log in to your server with FTP and look for the file.

If there isn't such a file, you might simply place an empty file there, called robots.txt. Or you can include this basic rule, which tells all spiders they're welcome to crawl everything:

User-agent: *
Disallow:
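If you want to double-check that those two lines really do allow everything, Python's standard-library robots.txt parser will confirm it (a quick sanity check, not something you need on the server):

```python
from urllib import robotparser

# Feed the two-line rule above directly to the parser.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])

# An empty Disallow value means "nothing is disallowed".
print(rp.can_fetch("Googlebot", "/"))        # True
print(rp.can_fetch("Googlebot", "/forum/"))  # True
```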

Now you do have an explanation for the stopped crawl, at least.

SEOQuestions

8:52 pm on Dec 19, 2007 (gmt 0)

10+ Year Member



Ok - new robots.txt has been uploaded. We'll see what Google does now. Thanks again!

SEOQuestions

10:12 pm on Dec 20, 2007 (gmt 0)

10+ Year Member



Update -

Google tools are now indicating that they are successfully accessing my robots.txt file but there is an error message that they cannot "currently access your home page because of a timeout". My hope is that the site hasn't been crawled yet but we shall see . . . (No change in rankings - still MIA).

tedster

1:18 am on Dec 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In a situation like this, I'd suggest watching the server logs very closely.

SEOQuestions

2:01 am on Dec 21, 2007 (gmt 0)

10+ Year Member



What would you be looking for? I have been checking for the googlebot (the last visit was on the 18th). Is there anything else you would keep an eye out for?

Thanks again.

tedster

2:24 am on Dec 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Look at which URLs Googlebot asks for and what your server's response is. See how the Webmaster Tools reports line up. If your server shows an error similar to what GWT reports, then investigate why the server bugged out - you may be able to pinpoint a major spidering issue there. If your server shows a 200 OK and GWT says there is a problem, then at least you've got some sense that it's a temporary googlebot problem.
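A minimal sketch of that log check, assuming the common NCSA combined log format (field positions may differ on your server, and the `googlebot_requests` helper is just an illustration):

```python
import re

# Matches NCSA combined log format: host, identities, [timestamp],
# "METHOD url PROTOCOL", status code. Adjust if your format differs.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3})')

def googlebot_requests(lines):
    """Yield (url, status) for every log line whose user-agent
    string mentions Googlebot."""
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LOG_RE.match(line)
        if m:
            yield m.group(3), int(m.group(4))

sample = ('66.249.66.1 - - [18/Dec/2007:10:00:00 +0000] '
          '"GET /robots.txt HTTP/1.1" 503 216 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1)"')

for url, status in googlebot_requests([sample]):
    # Anything other than 200 on /robots.txt or key pages deserves a look.
    print(url, status)
```

Running that over the full access log (e.g. `googlebot_requests(open("access.log"))`) gives exactly the URL/status pairs to line up against the Webmaster Tools reports.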