

Google thinks my site is "Unreachable" and isn't listing it - any ideas?

6:12 pm on Jun 28, 2010 (gmt 0)



domain.com
www.domain.com

domain.com has a 301 redirect to www.domain.com
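For reference, a non-www to www 301 of that kind is commonly done with mod_rewrite in an Apache .htaccess - a sketch only, since the poster's actual server and config are unknown:

```apache
# Hypothetical example: 301-redirect domain.com to www.domain.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

A misordered or overly broad rule in the same file is one place a crawler-only problem can hide, which is worth keeping in mind for the replies below.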

The server is very much online and hasn't been down for a long time now. If I test domain.com, Google reports that it has a redirect to www.domain.com.

However, when I test "www.domain.com" it reports "Unreachable", and because of this the forum hasn't been indexed at all. It's also unindexed in Yahoo and the other popular search engines.

Does anyone know what could be causing this? Another site running from the same server works fine (indexed). Neither site is low traffic (a couple of thousand uniques a day), so it's not a matter of being unworthy of indexing.

Is it possible the site has been flagged as spam? If so, would Google report it as "Unreachable"?
8:24 pm on Jun 28, 2010 (gmt 0)




Have you tried the "fetch as googlebot" functionality? Maybe there are some IP-specific blocks in place on your server.
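One way to probe for user-agent-specific blocks from outside Webmaster Tools is to request the page with a Googlebot user-agent string via curl - a sketch, with www.domain.com standing in for the real host:

```shell
# Fetch headers only, identifying as Googlebot; compare the status
# line with what a normal browser user-agent receives.
curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
     http://www.domain.com/
```

Note the caveat: if the block is IP-based rather than user-agent-based, this test from your own machine won't reproduce it, which is exactly why "fetch as googlebot" (which requests from Google's own infrastructure) is the more reliable check.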
8:44 am on Jun 29, 2010 (gmt 0)



My guess is that because you can get in but no bots can, you're either inadvertently blocking the bots in your .htaccess file or in your denial-of-service protection code (if you have one), or your host is blocking them at the firewall level. You may also have a bad robots.txt file that disallows the whole site from indexing.
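To make the first and last possibilities concrete, here are hypothetical examples of the kind of rules that lock every crawler out while leaving normal browsers unaffected - illustrations only, not the poster's actual config:

```apache
# 1. An .htaccess user-agent deny (Apache 2.2 syntax) that catches
#    Googlebot, Slurp, and friends via a broad pattern:
SetEnvIfNoCase User-Agent "bot|crawler|spider|slurp" block_bots
Order Allow,Deny
Allow from all
Deny from env=block_bots

# 2. A robots.txt that disallows the whole site from indexing:
#    User-agent: *
#    Disallow: /
```

Either one would produce exactly the symptom described: humans get in fine, search engines never index a page.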
12:43 pm on Jun 29, 2010 (gmt 0)

tedster (WebmasterWorld Senior Member)



Usually a robots.txt Disallow is reported as just that. For a 403, googlebot had to query your server and actually get that status FROM the server. With a robots.txt Disallow, googlebot normally doesn't even make the request.
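That distinction - a Disallow verdict needs no page request at all - can be seen with Python's standard-library robots.txt parser; a small sketch, with www.domain.com standing in for the real site:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt that disallows the whole site. The verdict comes
# from the rules alone -- no HTTP request to the blocked page is made.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

print(rp.can_fetch("Googlebot", "http://www.domain.com/forum/"))
# A 403, by contrast, only exists after the server answers a request.
```

So if Google reports "Unreachable" rather than a robots.txt block, the problem is more likely at the server or firewall level than in robots.txt.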