Why does Google ignore changes to my robots.txt?
The file gets crawled successfully every single day, yet Webmaster Tools (WMT) still reports certain files as "restricted by robots.txt" even though they haven't appeared in robots.txt for many months. They were once disallowed, but those rules were removed long ago, and there is no pattern matching in the current file that could be blocking them. I've also run my entire robots.txt through WMT's testing tool, and these files are never denied in that simulation.
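To rule out a quirk in WMT's simulator, this is roughly the kind of sanity check I've been running locally. It's only a sketch: the domain and paths below are placeholders, not my real site, and it uses Python's standard urllib.robotparser against the live robots.txt:

```python
# Sketch of a local robots.txt cross-check -- URLs below are placeholders.
from urllib.robotparser import RobotFileParser

robots_url = "https://example.com/robots.txt"  # placeholder domain
parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetches and parses the live robots.txt

# Ask the same question Googlebot would: is this path allowed?
for path in ["https://example.com/old-blocked-file.html",
             "https://example.com/still-flagged/page"]:
    verdict = "allowed" if parser.can_fetch("Googlebot", path) else "disallowed"
    print(path, "->", verdict)
```

One caveat: urllib.robotparser follows the original robots.txt conventions, not all of Google's wildcard extensions, so it's only a rough second opinion alongside WMT's own tester.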
Has anyone else run into this? Is there a remedy?