| 6:31 pm on Mar 9, 2007 (gmt 0)|
There's no direct way to do it, but you could use PHP or SSI to omit the content when the user-agent/IP address indicates a request from Googlebot, or even use mod_rewrite or ISAPI Rewrite to serve a different page to the 'bot.
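For illustration, a minimal PHP sketch of the user-agent approach. The is_googlebot() helper and the bare substring check are only assumptions for the example; a production version should also verify the requesting IP (e.g. via reverse DNS), since user-agent strings are trivially spoofed.

<?php
// Minimal sketch: hide a block from requests that claim to be Googlebot.
// The function name and the simple substring check are illustrative only;
// a real check should also verify the requesting IP via reverse DNS.
function is_googlebot() {
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    return stripos($ua, 'Googlebot') !== false;
}
?>
<!-- ...main page content... -->
<?php if (!is_googlebot()): ?>
<div id="comments">
    <!-- comments markup, served only to regular visitors -->
</div>
<?php endif; ?>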
This describes a technical solution. Take care, however: the search engines might consider what you intend to do to be "cloaking with intent to deceive your visitors," and you would risk being expelled from their indexes. Only your review of their Webmaster guidelines, in the context of your page content and intent, can determine whether this is the case.
| 6:36 pm on Mar 9, 2007 (gmt 0)|
You can also consider adding the comments via a borderless iframe which is either excluded via robots.txt or contains a robots noindex meta element.
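A rough sketch of that approach (the comments-frame.html file name is only an example). The framed document carries its own robots meta element, so its text stays out of the index while visitors still see it inline:

<!-- In the parent page: a borderless iframe pulling in the comments -->
<iframe src="/comments-frame.html" frameborder="0" width="100%" height="400"></iframe>

<!-- In the <head> of comments-frame.html: keep the framed url out of the index -->
<meta name="robots" content="noindex">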
| 6:39 pm on Mar 9, 2007 (gmt 0)|
But in this case I suspect you're overthinking the issue. It seems likely to me that the comments would tend to add a variety of related keywords that could make your site friendlier to long-tail searches.
| 6:56 pm on Mar 9, 2007 (gmt 0)|
I was wondering about this same issue for a disclaimer that is a paragraph long and on every single one of our news pages.
| 7:06 pm on Mar 9, 2007 (gmt 0)|
Couldn't you do it as an iframe? I believe iframes are considered a secondary page.
| 7:10 pm on Mar 9, 2007 (gmt 0)|
eddytom, that is what has been called "boilerplate repetition" by Google's Adam Lasnik -- see this thread:
If you do have a long paragraph of boilerplate on many pages, you probably can help yourself by addressing it in some way. Yes, Google indexes URLs, not pages, and an iframe holds a different URL. I've also seen the boilerplate text turned into an image to address this challenge.
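If you take the iframe route with robots.txt rather than the meta element, the exclusion is a one-line directive. A sketch, assuming (purely as an example) that the framed boilerplate pages live under a /boilerplate/ directory:

# robots.txt -- keep crawlers away from the framed boilerplate pages
# (the /boilerplate/ path is an example; use whatever directory holds them)
User-agent: *
Disallow: /boilerplate/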
| 9:39 pm on Mar 9, 2007 (gmt 0)|
Thanks for the great feedback.
I think I will simply leave the comments in for now and see how things progress.
If it helps with long-tail searches, great. If not, I will move the comments section of the page into a noindexed iframe.
Again, greatly appreciate the help.