However, last month Googlebot took about half a gig of bandwidth, and so far this month it's on course for almost twice that.
I put a robots.txt in place about a week ago. I checked it carefully, validated it, and waited. No joy. It's still fetching posting, log-on and search pages I've tried to disallow. I checked Google's help pages, and they seem to say that the robots.txt should get read every 24 hours - so it should have kicked in by now.
Any ideas? Any way I can force the Googlebot to read my robots.txt?
This is what I'm using. I want to restrict all bots to reading only the index, forum and topic pages, so I disallowed everything else.
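The file from the original post isn't reproduced here, so the following is only a guess at the kind of rules that approach implies, assuming a phpBB-style board; the script names are illustrative, not the poster's actual paths:

# Hypothetical example only - not the actual file from the post.
# Everything not disallowed (index.php, viewforum.php, viewtopic.php)
# stays fetchable, since robots.txt only blocks what is listed.
User-agent: *
Disallow: /posting.php
Disallow: /login.php
Disallow: /search.php
Disallow: /profile.php
Disallow: /privmsg.php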
No, you were - it was I that missed that bit :)
I had a similar problem with images: I was still getting requests from the Google image search. It seems to be less and less now. You have to wait for the index to update, I would guess, so that all the old data has been replaced by new. With all the update mania I tend to wait and see how things progress. I know that is no real help for you right now, but I am sure it will work in the end.
Ahh, do you see requests for the robots.txt in your logs? If you do then bingo - you know it is being read, and you will just have to wait it out.
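One way to check is to scan the raw access log for robots.txt fetches. Here is a minimal sketch in Python, assuming an Apache-style access log at /var/log/apache2/access.log; the path and log format are assumptions, so adjust them for your server:

# Minimal sketch: list Googlebot's fetches of robots.txt from an access log.
# The log path and combined log format are assumptions - adjust for your setup.
with open("/var/log/apache2/access.log") as log:
    for line in log:
        if "robots.txt" in line and "Googlebot" in line:
            print(line.rstrip())

If those lines show 200 responses, the file is being fetched and the wait is for Google's index to catch up; if nothing shows up at all, the problem is on the serving side.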
May I suggest:
The robots.txt validator -- like most other validators -- indicates that the 'code' is valid, not that it will do what you intend it to do.
Disallowing /xyz.php will also disallow /xyz.php?anything - the rules match on the path prefix, not on exact filenames.
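One way to see both points is to test the rules against real URLs instead of just validating the syntax. Here is a minimal sketch using Python's standard urllib.robotparser; the rules and URLs below are made up for illustration, not taken from the poster's site:

from urllib.robotparser import RobotFileParser

# Hypothetical rules for illustration only - not the file from the post.
rules = """\
User-agent: *
Disallow: /posting.php
Disallow: /login.php
Disallow: /search.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Disallow lines are prefix matches, so the query-string variants of
# posting.php are blocked along with the bare script name.
print(rp.can_fetch("Googlebot", "http://example.com/posting.php"))             # False
print(rp.can_fetch("Googlebot", "http://example.com/posting.php?mode=reply"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/viewtopic.php?t=1"))       # True

Testing this way shows what the rules actually block, which is the part a syntax validator can't tell you.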