Brett_Tabke - 4:40 pm on Jun 3, 2008 (gmt 0) [edited by: tedster at 6:58 pm (utc) on June 3, 2008]
From the Live Search webmaster Blog [blogs.msdn.com]
The de facto standard for managing this is the Robots Exclusion Protocol (REP), introduced back in the early 1990s. Over the years, the REP has evolved to support more than "exclusion" directives; it now supports directives controlling what content gets included, how the content is displayed, and how frequently the content is crawled.
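To illustrate the kinds of directives the blog is talking about, here is a rough robots.txt sketch (the paths and values are made up for the example; display control such as NOSNIPPET or NOCACHE actually lives in page-level META tags rather than in robots.txt):

    User-agent: msnbot
    Disallow: /private/              # exclusion: keep this directory out
    Allow: /private/overview.html    # inclusion: carve out one page as an exception
    Crawl-delay: 5                   # crawl frequency: wait this many seconds between fetches
    Sitemap: http://example.com/sitemap.xml

Not every engine honors every directive the same way, so it's worth checking each engine's own documentation before relying on any of these.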