Msg#: 4191424 posted 2:35 pm on Aug 24, 2010 (gmt 0)
I recently read (though for the life of me I can't remember where) that the meta name="robots" tag is first-come, first-served.
The example given was something like: <meta name="robots" content="noindex, nofollow"> <meta name="googlebot" content="index, follow">
What was pointed out was that since the "robots" tag came before the "googlebot" tag, googlebot ignored the second set of instructions and neither indexed the content nor followed the links on the page, as per the "robots" instruction.
Has this been confirmed? I have no reason to doubt it, but it is relevant for my real question.
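To make the claimed behavior concrete, here is a minimal sketch in Python of "first match wins" meta-robots handling, as described above. This models the claim from the post, not confirmed crawler behavior; the function name and defaults are illustrative assumptions.

```python
# Hypothetical model of the "first-come, first-served" claim: the first
# meta tag whose name applies to the bot wins, later tags are ignored.
def effective_directives(meta_tags, bot="googlebot"):
    """meta_tags: list of (name, content) pairs in document order.
    Returns the directive set from the first applicable tag."""
    for name, content in meta_tags:
        if name.lower() in ("robots", bot):
            return {d.strip().lower() for d in content.split(",")}
    # Assumed crawler defaults when no applicable tag is present
    return {"index", "follow"}

tags = [("robots", "noindex, nofollow"),
        ("googlebot", "index, follow")]
print(effective_directives(tags))  # under this model, the first tag wins
```

Under this model the example from the post yields {"noindex", "nofollow"}: the bot-specific tag never gets a chance to apply.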
A competitor has the following code on his page:

<head>
...
<meta name="robots" content="index, follow">
...
</head>
<body>
... boring yet still relevant content with <a rel="nofollow" href="/directory/page/linked-text.html">linked text</a> and other text about widgets. ...
</body>
How would googlebot handle the linked text in the example above? There are conflicting directives: FOLLOW from the page-level meta tag and NOFOLLOW from the link's rel attribute.
I make it a rule never to use the follow directive because it 1) seems stupid to tell a robot to do its job and 2) seems to cause issues like this. However, I am trying to understand how this works.
Msg#: 4191424 posted 7:37 am on Aug 25, 2010 (gmt 0)
The default is follow. Nofollow overrides the default.
However, that doesn't answer the question. If I have two conflicting robots-meta tags in the head of my document, it's first come, first served. If I force a follow in the head, will that override the nofollow on an individual link in the page?
Msg#: 4191424 posted 3:13 pm on Aug 25, 2010 (gmt 0)
Exactly so, suratmedia, you've got it right. The more granular directive (rel="nofollow" on the link) overrides the more general directive (meta robots for the entire page). So it does parallel the CSS specificity rules - except it's simpler ;)
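The precedence rule described above can be sketched as a tiny decision function. This is an illustrative model of the "most granular wins" rule from this thread, not an authoritative description of any crawler; the names are made up.

```python
# Sketch of the precedence rule: a per-link rel="nofollow" overrides a
# page-wide meta robots "follow", which in turn is the default anyway.
def link_is_followed(page_meta_content, link_rel=None):
    """Return True if the link may be followed under the rule above.

    page_meta_content: content of the page's meta robots tag, e.g. "index, follow"
    link_rel: the link's rel attribute value, if any
    """
    if link_rel and "nofollow" in link_rel.lower():
        return False  # granular, link-level directive wins
    # Otherwise fall back to the page-level directive (follow is the default)
    return "nofollow" not in page_meta_content.lower()

print(link_is_followed("index, follow", link_rel="nofollow"))  # False
print(link_is_followed("index, follow"))                       # True
```

So in the competitor's example, the link with rel="nofollow" is not followed even though the page-level meta tag says "follow".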