Forum Moderators: phranque


Why does robots.txt not work?


toplisek

12:00 pm on May 14, 2010 (gmt 0)




Why does robots.txt not work?

I have put:
User-agent: *
Disallow: /

All of my subdomains are still showing up in Google even though I put this in place.
Could the issue be that anyone who can see the actual website can simply submit it to Google's index to start it being crawled?

jdMorgan

12:51 pm on May 14, 2010 (gmt 0)




If the subdomains each have their own filespace on your server, then each of them needs its own robots.txt file, for example:

www.example.com/robots.txt
sub1.example.com/robots.txt
sub2.example.com/robots.txt
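The effect of the "Disallow: /" rule the original poster used can be checked with Python's standard urllib.robotparser. This is just a sketch; the hostnames are the example subdomains from this thread, and each subdomain would need to serve these same rules from its own robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Parse the exact rules the poster placed in robots.txt.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# With "Disallow: /", no crawler that honors robots.txt may fetch any URL
# on the host that served this file.
print(rp.can_fetch("Googlebot", "http://sub1.example.com/page.html"))
print(rp.can_fetch("*", "http://www.example.com/"))
```

Both calls print False, which confirms the rules themselves are valid; if a subdomain still gets crawled, that subdomain is most likely not serving this file at its own /robots.txt URL.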

And note that the filename must be exactly "robots.txt": not "robot.txt", not "Robots.txt", and not any other case or spelling variation.

Do be sure that both robots.txt and any custom 403 error document you may have defined are accessible to any user-agent from any IP address, and not subject to any "access control" or "IP bans." If these files are not fully accessible, various potentially serious problems will ensue.

Also make sure that you created your robots.txt file with a plain-text (ASCII) editor such as Windows Notepad.
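One way to confirm the file really is plain ASCII, with no byte-order mark left behind by an editor, is a quick check like this (the function name and file path are illustrative, not from the thread):

```python
def is_plain_ascii(path):
    """Return True if the file decodes as ASCII and carries no UTF-8 BOM."""
    with open(path, "rb") as f:
        data = f.read()
    # Some editors prepend a UTF-8 byte-order mark, which crawlers may choke on.
    if data.startswith(b"\xef\xbb\xbf"):
        return False
    try:
        data.decode("ascii")
        return True
    except UnicodeDecodeError:
        return False
```

A robots.txt saved as plain text passes this check; one saved with a BOM or smart quotes from a word processor does not.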

Jim