
Yahoo! Search Support for X-Robots-Tag Directive to Simplify Control

     
7:58 pm on Dec 7, 2007 (gmt 0)

engine
Administrator from GB

joined:May 9, 2000
posts:22305
votes: 239


Specifically, we're extending our support of page-level exclusion tags -- NOINDEX, NOARCHIVE, NOSNIPPET, NOFOLLOW -- to provide additional control over the archiving and summarization of ANY file type. Previously, these page-level tags could only be expressed within HTML pages through the META directive (e.g. <META NAME="Slurp" CONTENT="NOARCHIVE">), but based on feedback from our webmasters, Yahoo! now allows these tags to be expressed through the X-Robots-Tag directive in the HTTP header. This gives webmasters the flexibility to apply exclusions to PDF, Word, PowerPoint, video, and other file types, including HTML files, and to increase their coverage through a simplified process. Additionally, webmasters no longer need access to HTML templates in order to express exclusions for HTML files.

Webmasters can use this control by adding the page-level tags to the X-Robots-Tag directive in the HTTP header.

Yahoo! Search Support for X-Robots-Tag Directive to Simplify Control [ysearchblog.com]
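
For anyone who wants to experiment, here is a rough sketch of sending the header from a server you control. It is only an illustration using Python's standard library, not anything from the Yahoo! announcement; the port and the "noindex, noarchive" value are placeholders.

    # Illustrative sketch only: a tiny file server that attaches an
    # X-Robots-Tag header to every response, so non-HTML files such as
    # PDFs or Word documents carry the exclusion without editing any
    # HTML template.
    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

    class XRobotsHandler(SimpleHTTPRequestHandler):
        def end_headers(self):
            # "noindex, noarchive" is just an example value; use whichever
            # of NOINDEX, NOARCHIVE, NOSNIPPET, NOFOLLOW you need.
            self.send_header("X-Robots-Tag", "noindex, noarchive")
            super().end_headers()

    if __name__ == "__main__":
        ThreadingHTTPServer(("", 8000), XRobotsHandler).serve_forever()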

3:38 pm on Dec 9, 2007 (gmt 0)

Full Member

10+ Year Member

joined:May 18, 2002
posts:210
votes: 0


Now, any site can place a NOFOLLOW directive in the HTTP header.

Anyone trading or purchasing links will have to check the HTTP headers to see whether any link juice is actually being passed.
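
A quick way to do that is to request only the headers and look for X-Robots-Tag. Here is a minimal sketch using Python's standard library; the URL is made up for the example.

    # Fetch just the HTTP headers of a URL and report the X-Robots-Tag
    # value, if any, so you can see which directives are being sent.
    from urllib.request import Request, urlopen

    def x_robots_tag(url):
        req = Request(url, method="HEAD")
        with urlopen(req) as resp:
            return resp.headers.get("X-Robots-Tag")

    # Hypothetical URL, purely for illustration.
    print(x_robots_tag("http://www.example.com/whitepaper.pdf"))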