|Forum: Sitemaps, Meta Data, and robots.txt|
|Displaying Topics 1 - 9 (9 total) Sorted by: Thread-Subject, Direction: forward|
|1:|| Add Sitemap To robots.txt For Autodiscovery|
"...you can now find your sitemaps in a uniform way across all participating engines. To do this, simply add the following line to your robots.txt..."
|Apr 13, 2007|
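The autodiscovery line announced by the participating engines takes the form below; the URL shown is a placeholder for your own sitemap location.

```
# Add to robots.txt — the Sitemap directive is independent of any User-agent block
Sitemap: http://www.example.com/sitemap.xml
```

The directive can appear anywhere in the file, and a robots.txt may list more than one Sitemap line.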
|2:|| Cloaking Your robots.txt||Oct 28, 2003|
|3:|| Google's Current Specifications for Robots Directives|
A very cool reference: Google's exact technical specification for handling robots.txt, robots meta tags, and X-Robots-Tag directives.
|Nov 24, 2010|
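For orientation, the three mechanisms the specification covers look like this; the paths and directive values here are illustrative, not taken from Google's document.

```
# robots.txt — controls crawling, served at the site root
User-agent: Googlebot
Disallow: /private/

<!-- robots meta tag — controls indexing, placed in a page's <head> -->
<meta name="robots" content="noindex, nofollow">

# X-Robots-Tag — same indexing directives, sent as an HTTP response header
# (useful for non-HTML files such as PDFs)
X-Robots-Tag: noindex
```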
|4:|| In the era of increasing numbers of bad bots is robots.txt irrelevant?|
"I just read through a number of posts in an attempt to understand what, if anything, can be done about bad bots."
|Feb 20, 2006|
|5:|| Proposal for robots.txt To Have Greater Flexibility|
"The desire for greater control over how search engines index and display Web sites is driving an effort by leading news organizations and other publishers to revise a 13-year-old technology for restricting access."
|Nov 29, 2007|
|6:|| SonicWALL Firewall Blocks Spiders|
"Wondering why your website isn't being spidered? Your firewall may now be part of the issue."
|May 3, 2005|
|7:|| When Robots.txt And Meta Robots Collide|
"WebmasterWorld members clarify the priority and use of meta name="robots" content="ALL" & robots.txt"
|Mar 8, 2007|
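The usual resolution of the collision: if robots.txt disallows a URL, the crawler never fetches the page, so any meta robots tag on it is never read. A minimal sketch with hypothetical paths:

```
# robots.txt — blocks crawling of /archive/, so pages there are never fetched
User-agent: *
Disallow: /archive/

<!-- meta tag on /archive/page.html — unreachable by a compliant crawler,
     so content="ALL" has no effect here -->
<meta name="robots" content="ALL">
```

To have a meta robots tag honored, the page must be crawlable in robots.txt.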
|8:|| Why Should I Have A Robots.txt File?||Dec 31, 2006|
|9:|| Yahoo! Slurp Now Supports Wildcards in robots.txt|
You can now use '*' in robots directives for Yahoo! Slurp to match any sequence of characters in a URL. The symbol can appear in any part of the URL string you provide in the directive.
|Nov 6, 2006|
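A sketch of wildcard usage under this announcement; the paths and query string are hypothetical examples.

```
User-agent: Slurp
# '*' matches any sequence of characters, anywhere in the URL string
Disallow: /*?sessionid=
Disallow: /print/*.pdf
```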