What is the correct code for robots.txt?

toplisek

9:02 am on Nov 28, 2008 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



I would like to place the correct code to block search engines from seeing this content:
User-agent: *
Disallow: /subdomain.mydomain.com/
Disallow: /upload/mydomain/

Is this code 100% correct, or is there a way to validate robots.txt?
I need help with this simple question. Thank you.

g1smd

1:44 pm on Nov 28, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



No one could possibly know if the code was correct, because we don't know the structure of your site, nor do we know all the URLs that you want to allow and all the URLs that you want to block.

Google Webmaster Tools can give you some reports that might be helpful, but it is down to you to interpret what they say.

However, you must put the file in the root of the domain that it needs to apply to. The Disallow rules must contain paths and filenames, and must NOT include domain names (unless that's actually the name of a folder on your site).
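
As a rough sketch only (assuming www.mydomain.com and subdomain.mydomain.com are separate hosts, each with its own document root, and that /upload/mydomain/ really is a folder path on the main site), the two files might look something like this:

http://www.mydomain.com/robots.txt
User-agent: *
Disallow: /upload/mydomain/

http://subdomain.mydomain.com/robots.txt
User-agent: *
Disallow: /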

You might not want to end the disallow statement with a trailing slash, because having the trailing slash in the disallow means that /folder itself (without the slash) isn't blocked. That may, or may not, be a problem.
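
To illustrate with the /upload folder as an example:

Disallow: /upload/
blocks /upload/anything.html but not /upload itself.

Disallow: /upload
blocks /upload, /upload/anything.html, and any other URL starting with /upload, such as /uploads.html.

Robots.txt rules are simple prefix matches, so decide which of those behaviours you actually want.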

toplisek

1:50 pm on Nov 28, 2008 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Hi,
if I replace the subdomain value and create folders instead, will it work?

phranque

11:07 am on Nov 29, 2008 (gmt 0)

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



If you have a Google Webmaster Tools account, you can test your robots.txt file [google.com] there.
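
If you also want to sanity-check the rules locally before uploading the file, Python's standard urllib.robotparser module applies the same prefix-matching rules. A minimal sketch (Python 3; www.mydomain.com is just a placeholder host):

from urllib.robotparser import RobotFileParser

# Paste the proposed rules in directly so they can be tested before the file goes live.
rules = """
User-agent: *
Disallow: /upload/mydomain/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# can_fetch() returns False for URLs the rules block.
print(parser.can_fetch("*", "http://www.mydomain.com/upload/mydomain/file.html"))  # False - blocked
print(parser.can_fetch("*", "http://www.mydomain.com/upload/mydomain"))            # True - no trailing slash, so not blocked
print(parser.can_fetch("*", "http://www.mydomain.com/index.html"))                 # True - allowed

That only checks the original prefix-matching rules; the Webmaster Tools tester remains the better guide to how Googlebot itself will read the file.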