robots.txt not working: Disallows are still being indexed

jchance:
About 5 months ago I added a line to my robots.txt file to stop Google from indexing a page. However, the page still appears in Google's index, and Googlebot comes and grabs it every day. Any ideas what I'm doing wrong? Below is my robots.txt file:
Shouldn't you have a forward slash in front of the disallowed page?

Apart from that, I'm at a loss :(
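For what it's worth, since the original file isn't quoted in the thread, here's a hypothetical robots.txt illustrating the point: Disallow values are matched against the URL path, which always begins with a slash, so a rule without one matches nothing.

```
User-agent: *
# Blocks https://example.com/private.html
Disallow: /private.html
# Matches nothing: no URL path starts with "private.html"
Disallow: private.html
```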
Or maybe you put the robots.txt file in a directory other than your web root. Crawlers only look for it at the top level (e.g. example.com/robots.txt), so a copy in a subdirectory is never read.
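Either way, the rules can be sanity-checked with Python's standard library: urllib.robotparser applies robots.txt matching, so feeding it the file's contents (a hypothetical rule set below, since the original file isn't shown) reveals whether a given URL is actually blocked.

```python
from urllib import robotparser

# Hypothetical robots.txt contents -- the thread doesn't show the real file.
# Note the leading slash on the Disallow path.
rules = """User-agent: *
Disallow: /private.html
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The disallowed path is blocked; everything else is allowed by default.
print(rp.can_fetch("Googlebot", "https://example.com/private.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/other.html"))    # True
```

If `can_fetch` returns True for the page you meant to block, the rule itself is wrong; if it returns False yet Googlebot still fetches the page, the file is probably not being served from the web root.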