
Forum Moderators: goodroi


Robots.txt Content

     
10:44 pm on Feb 21, 2010 (gmt 0)

5+ Year Member



I created a robots.txt file and added this content to it:

sitemap: http://example.com/sitemap/sitemap.xml
User-agent: *
Disallow: /enable-cookies
Disallow: /privacy-policy

Does this seem correct? I'm trying to show the search engines where my sitemap is and to block those 2 pages from being crawled.

The 2 pages I want to exclude are included in the sitemap (it is automatically generated by my CMS).

Thanks.
11:22 am on Feb 22, 2010 (gmt 0)

WebmasterWorld Senior Member penders is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



sitemap: http://example.com/sitemap/sitemap.xml
User-agent: *
Disallow: /enable-cookies
Disallow: /privacy-policy


I have only seen 'Sitemap:' (with a capital 'S'), although I'm not sure whether it is case-sensitive or not. (All other directives have a capital first letter.)

I would also put the Sitemap: directive last (and separated by a blank line - used to delimit records in robots.txt). I have read that not all robots support the Sitemap: directive, so in order to prevent these bots from prematurely aborting processing of the robots.txt file it should appear last.
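Putting those suggestions together (capital 'S', Sitemap directive last and separated by a blank line, and assuming the second page is meant to be /privacy-policy), the file would look something like this:

```
User-agent: *
Disallow: /enable-cookies
Disallow: /privacy-policy

Sitemap: http://example.com/sitemap/sitemap.xml
```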

The 2 pages I want to exclude are included in the sitemap (it is automatically generated by my CMS).


IMHO, if they are disallowed in robots.txt then the search engine should be prevented from accessing them, regardless of whether they are linked to from elsewhere, or included in your sitemap - but I don't know for sure; just my opinion. Ideally they should not be in your sitemap.
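If you want to sanity-check the Disallow rules without waiting on a crawler, Python's standard-library robots.txt parser can evaluate them locally. A sketch, using the example.com placeholders from this thread:

```python
# Check which URLs the robots.txt rules from this thread would block,
# using Python's standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /enable-cookies
Disallow: /privacy-policy
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The two disallowed pages are blocked for all user agents:
print(rp.can_fetch("*", "http://example.com/privacy-policy"))  # False
# Everything else remains crawlable:
print(rp.can_fetch("*", "http://example.com/index.html"))      # True
```

Note that this only tells you what a rule-respecting bot should do; whether a given search engine also drops sitemap-listed URLs that are disallowed is up to that engine.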
1:54 pm on Feb 23, 2010 (gmt 0)



Why don't you use Google Webmaster Tools (www.google.com/webmasters/tools/) to submit the sitemap? It works more effectively and faster than listing it in the robots.txt file.

Hyder
<snip>

[edited by: goodroi at 2:21 pm (utc) on Feb 23, 2010]
[edit reason] Please no signature links [/edit]

2:03 pm on Feb 23, 2010 (gmt 0)

5+ Year Member



Because that will only submit it to Google. Robots.txt allows the other search engines to find it as well.
11:50 am on Feb 25, 2010 (gmt 0)

WebmasterWorld Senior Member penders is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



Just a thought... with the Sitemap: directive in robots.txt, do you still need to resubmit the sitemap to the search engines (via an HTTP request, for example) when it changes?

Whereas if you use Google Webmaster Tools to submit the sitemap initially, you still need to resubmit it whenever it changes.
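The "HTTP request" style of resubmission is the sitemap "ping": a plain GET with the sitemap URL as a query parameter. A sketch of building one (the Google endpoint shown is illustrative; it has since been retired, so treat it as an assumption rather than a current API):

```python
# Build a sitemap "ping" URL: the sitemap location is percent-encoded
# and passed as a query parameter to the search engine's ping endpoint.
from urllib.parse import quote
from urllib.request import urlopen

sitemap_url = "http://example.com/sitemap/sitemap.xml"
ping_url = "http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

print(ping_url)
# urlopen(ping_url)  # uncomment to actually send the ping
```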
11:56 am on Feb 26, 2010 (gmt 0)

5+ Year Member



My CMS generates and submits the sitemap automatically to Google and Yahoo.
1:33 am on Apr 9, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am missing a robots.txt file on my site.

Can you please tell me if it is necessary and why? I'm confused.

thank you
8:47 am on Apr 9, 2010 (gmt 0)

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



by default the search engine bots will crawl your site unless excluded by some technical method.
a missing or empty robots.txt file is equivalent to permission to crawl.
an empty robots.txt is preferable since the frequent requests for that file will return a 200 OK status code response instead of a 404 Not Found.
if you wish to exclude some or all bots from crawling a part of your site, the robots.txt file is one of the methods available.
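The "missing or empty robots.txt equals permission to crawl" point can be seen with Python's standard-library parser: with no rules at all, every fetch is allowed. A minimal sketch:

```python
# An empty robots.txt contains no Disallow rules, so a rule-respecting
# bot is permitted to crawl everything.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([])  # empty file: no records, no rules

print(rp.can_fetch("Googlebot", "http://example.com/any/page.html"))  # True
```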
11:12 am on Apr 9, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



ok, I got it, thank you
 
