Forum Moderators: Robert Charlton & goodroi
Yes, it would have been a hacker. And yes, it is Google's fault: they served up a URL that was disallowed in robots.txt and always has been.
"Pages can take months to drop out of the SERPs" — I have always had a robots.txt with my cgi-bin disallowed, since day one.
On the other hand, Yahoo will attempt to build a fully indexed SERP listing by taking anchor text pointing at the URL and using it as the title in the SERP (unless that anchor text is "click here" or similar, in which case they will ignore it).
robots.txt does not give any instructions about which URLs can be listed in a search engine's index. It only controls whether bots may fetch pages, not the listings they compile. If Googlebot finds a link to a URL it's banned from fetching, that doesn't prevent Google from listing the URL in its index, even though Google doesn't know what's at the other end.
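That distinction is easy to see with Python's standard-library robots.txt parser. This is just a sketch, using a hypothetical robots.txt like the cgi-bin disallow described above: the protocol answers only "may this bot fetch this URL?" — nothing in it speaks to whether the URL can appear in an index.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt matching the cgi-bin disallow discussed above.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may not fetch anything under /cgi-bin/ ...
print(parser.can_fetch("Googlebot", "https://example.com/cgi-bin/script.cgi"))  # False

# ... but may fetch the rest of the site.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

Note that the only question the parser can answer is fetch permission. A search engine that sees a link to `/cgi-bin/script.cgi` from elsewhere can still list that bare URL, which is exactly the behavior described above.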