Forum Moderators: open
Any ideas on what could have gone wrong? The robots.txt file passes the validation test, so I don't think that's it (besides, it's never been changed).
Here's a snippet from my log.
64.68.87.43 - - [22/Aug/2003:00:54:17 -0500] "GET /robots.txt HTTP/1.0" 200 820 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
64.68.87.43 - - [22/Aug/2003:00:54:17 -0500] "GET /images/mint/dot.gif HTTP/1.0" 200 67 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
64.68.82.7 - - [22/Aug/2003:02:03:34 -0500] "GET /robots.txt HTTP/1.0" 200 820 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
64.68.82.7 - - [22/Aug/2003:02:03:40 -0500] "GET / HTTP/1.0" 200 34036 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
64.68.86.79 - - [22/Aug/2003:02:11:47 -0500] "GET /robots.txt HTTP/1.0" 200 820 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
64.68.86.79 - - [22/Aug/2003:02:11:47 -0500] "GET /images/mint/dot.gif HTTP/1.0" 200 67 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
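If you want to total up what Googlebot is actually pulling from a log like that, a quick script can do it. This is just a sketch that assumes Apache's combined log format (the sample lines and the `googlebot_bytes` helper name are my own, not from any tool):

```python
import re

# Two sample lines in Apache combined log format, as posted above.
LOG = """64.68.87.43 - - [22/Aug/2003:00:54:17 -0500] "GET /robots.txt HTTP/1.0" 200 820 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
64.68.82.7 - - [22/Aug/2003:02:03:40 -0500] "GET / HTTP/1.0" 200 34036 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"
"""

# Groups: 1 host, 2 timestamp, 3 request, 4 status, 5 bytes, 6 referer, 7 user-agent
PATTERN = re.compile(
    r'(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)" (\d{3}) (\d+|-) "([^"]*)" "([^"]*)"'
)

def googlebot_bytes(log_text):
    """Sum the bytes served on requests whose user-agent mentions Googlebot."""
    total = 0
    for line in log_text.splitlines():
        m = PATTERN.match(line)
        if m and "Googlebot" in m.group(7):
            size = m.group(5)
            total += int(size) if size != "-" else 0  # "-" means no body sent
    return total

print(googlebot_bytes(LOG))  # 820 + 34036 = 34856
```

Run it against a day's worth of log and you'll see exactly how much transfer the bot is burning.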
I don't think you have anything wrong. I have several sites with 25,000+ pages indexed... they don't change very often, but Googlebot keeps coming back for them.
It grabs about 200 new pages per day from each site while visiting, so I don't have any real reason to complain. But I'd prefer it grabbed only the new pages and quit burning data transfer on the ones that stay the same.
It costs me about $0.80 per day in data transfer... and makes me about $1,800 per day in free SERPs. As a perfectionist I'd prefer it got it right, but I'm not exactly going to complain about the minor errors it makes right now :)