Forum Moderators: Robert Charlton & goodroi
Does the word "network" refer to my website or the physical server network? In other words, am I doing something that's blocking googlebot or is my server blocking it?
I know of two cases where this message just cleared up on its own.
Out of SERPs? I'm not really sure an unreachable robots.txt can do that, but unreachable pages CAN.
Check that your .htaccess isn't blocking Google, and check the headers returned when you try to download robots.txt with one of the free external tools available online.
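If you'd rather not rely on an online tool, a quick sketch of the same header check in Python, sending a Googlebot-style User-Agent (the hostname below is a placeholder, so substitute your own domain):

```python
import http.client

GOOGLEBOT_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"

def fetch_robots_headers(host, port=80, timeout=10):
    """Return (status_code, headers_dict) for GET /robots.txt on host."""
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("GET", "/robots.txt",
                     headers={"User-Agent": GOOGLEBOT_UA})
        resp = conn.getresponse()
        return resp.status, dict(resp.getheaders())
    finally:
        conn.close()
```

Call it as `fetch_robots_headers("www.example.com")` and look at what comes back: a 200 with a text/plain Content-Type is healthy, while a 403, a 5xx, or a timeout would line up with the "robots.txt unreachable" message.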
{edit} I wrote eternal instead of external. Really tired and really late!
[edited by: TheSeoDude at 11:38 pm (utc) on Aug. 28, 2007]
RewriteEngine on
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php\ HTTP/
RewriteRule ^index\.php$ http://www.example.com/ [R=301,L]
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
A few more questions to help me understand things better...
I wonder... is there a way to specifically state in my .htaccess that I allow googlebot to access my site?
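As far as I know there's no .htaccess directive that positively invites Googlebot in; access is open unless some rule denies it. What you can do is make sure any User-Agent blocking explicitly exempts Googlebot. A sketch along those lines ("BadBot" is a made-up example agent):

```apache
# Sketch only: .htaccess has no "allow Googlebot" switch.
# If you block bots by User-Agent, exempt Googlebot explicitly.
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteCond %{HTTP_USER_AGENT} !Googlebot [NC]
RewriteRule .* - [F,L]
```

The second RewriteCond is the important part: without the `!Googlebot` exemption, a careless bot-blocking pattern could return 403s to Google and produce exactly the "unreachable" symptoms described above.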
I've been getting Unreachable URLs for about 3 weeks (maybe longer) now in my sitemaps report. Details for all of them state "robots.txt unreachable". Am I in deep trouble?
I have not updated my site for about a month because I wanted to solve this problem first. Then I became afraid that adding more content would make things worse.
Could Google have interpreted my lapse in adding new content as my site being dead?
I admit that I don't really have "quality" links going into my site (I'm finding it hard to cope with my home biz and managing my sites). Could Google have flagged my site as "not important", causing this error?