Forum Moderators: goodroi
In robots.txt I placed "Disallow: /abcfolder/"
and thought that would stop Google from spidering all of the files/pages within /abcfolder/,
but it doesn't seem to have worked, as they are still showing up in my site's AWStats.
I have a file (a PHP global page template with errors in it) that I don't want spidered.
It is essentially /abcfolder/xyx.php
What syntax do I use to stop it being spidered and/or indexed?
Thanks :)
[edited by: Gemini23 at 8:21 pm (utc) on Dec. 5, 2008]
To block just Googlebot from the folder (which also covers every file inside it, including /abcfolder/xyx.php):

User-agent: Googlebot
Disallow: /abcfolder/
And if you want to block all bots, use the wildcard:
User-agent: *
Disallow: /abcfolder/
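If you want to sanity-check your rules before relying on them, Python's standard library can parse a robots.txt and tell you whether a given URL is blocked. This is just a quick local check (example.com is a placeholder domain):

```python
from urllib.robotparser import RobotFileParser

# Parse the rules directly as text -- no network fetch needed
rules = """User-agent: *
Disallow: /abcfolder/
"""
rp = RobotFileParser()
rp.parse(rules.splitlines())

# The folder rule blocks every path under /abcfolder/ for all bots
print(rp.can_fetch("*", "https://example.com/abcfolder/xyx.php"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Note that this only tells you what a *compliant* crawler will do; misbehaved bots ignore robots.txt entirely, and those can still show up in AWStats.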
If you do decide to use the meta robots tag instead, add this to the page's <head>:

<meta name="robots" content="noindex">

...and do not block the file in robots.txt. Bots have to be able to crawl the page in order to see and obey the noindex.
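For completeness, this is where the tag sits in the HTML that the PHP template outputs (the title and body here are just placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- Tells compliant crawlers not to index this page.
       The page must remain crawlable for this to be seen. -->
  <meta name="robots" content="noindex">
</head>
<body>
  ...
</body>
</html>
```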