Forum Moderators: goodroi
At 3:30pm Googlebot was still requesting those pages.
How long until G takes another look at my robots.txt? Or do I need to prompt it somehow? Hey G-bot, please take a look at my robots.txt, I changed it... please stop crawling...
I could put some cloaking rules in my .htaccess, but that seems heavy-handed IMHO.
Anyone have experiences to share?
The robots.txt disallow only tells crawlers not to spider the pages. The URLs can still appear as URL-only entries in the SERPs.
If you don't want anything to appear in the search results, not even a URL-only entry, then you are better off using the meta robots noindex tag instead. That takes effect as soon as the page is next spidered.
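To illustrate the difference described above, here's a sketch of the two approaches (the path `/private/` is just a placeholder for whatever you're blocking). Note the catch: for the noindex tag to be seen, the page must NOT be disallowed in robots.txt, otherwise the bot never fetches the page and never reads the tag.

```
# robots.txt — stops crawling, but URL-only entries can still show up
User-agent: *
Disallow: /private/
```

```html
<!-- In each page's <head> — removes the page from results once re-spidered -->
<meta name="robots" content="noindex">
```

So if the goal is de-indexing, remove (or don't add) the Disallow rule and let the bot crawl the pages to see the noindex.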