How to Have Google Not Consider One Part of a Page

Is there a way to exclude part of a page from being used in indexing?

     
5:52 pm on Mar 9, 2007 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 6, 2002
posts:180
votes: 2



I added a comments section to my articles and some of the comments are probably not helpful for SEO purposes.

I include the comments at the bottom of the page.

Is there a way to communicate to Google that they should not use the comments on the page as part of their indexing exercise?

In other words, I want them to use the text of the article to determine where the page ends up in SERPs, but not the text from the comments.

Thanks.

6:31 pm on Mar 9, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 31, 2002
posts:25430
votes: 0


There's no direct way to do it, but you could use PHP or SSI to omit the content if the user-agent/IP address indicates a request from Googlebot, or even use mod_rewrite or ISAPI Rewrite to serve a different page to the 'bot.
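As a rough sketch of the mod_rewrite idea on Apache — the directory names and the user-agent match here are invented for illustration, not something from this thread — you could route Googlebot to a comment-free copy of each article:

```apache
# Hypothetical sketch: send Googlebot to comment-free copies of the
# articles. Path names and the UA pattern are assumptions -- adjust
# to your own setup, and mind the cloaking caveat below.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^articles/(.*)$ /articles-nocomments/$1 [L]
```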

This describes a technical solution. Be careful, however: the search engines might consider what you intend to do "cloaking with intent to deceive your visitors," and you would risk being expelled from their indexes. Only your own review of their Webmaster guidelines, in the context of your page content and intent, can determine whether that is the case.

Jim

6:36 pm on Mar 9, 2007 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member encyclo is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Aug 31, 2003
posts:9063
votes: 2


You can also consider adding the comments via a borderless iframe which is either excluded via robots.txt or contains a robots noindex meta element.

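A minimal sketch of that setup, with the file names invented for illustration: the article page embeds the comments from a separate URL, and that separate URL opts itself out of indexing.

```html
<!-- In the article page: pull the comments in from their own URL. -->
<iframe src="/comments/article-123.html"
        style="width:100%;border:none"></iframe>

<!-- In the head of /comments/article-123.html itself: -->
<meta name="robots" content="noindex">
```

Alternatively (or additionally), a `Disallow: /comments/` line in robots.txt would keep crawlers away from the comment URLs entirely.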
6:39 pm on Mar 9, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 6, 2002
posts:4768
votes: 0


If it were critical, you could also write out the comments using JavaScript.

But in this case I suspect you're overthinking the issue. It seems likely to me that the comments would tend to add a variety of related keywords that could make your site friendlier to long-tail searches.
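The JavaScript approach mentioned above could look something like the following sketch. The container id `comments` and the global `pageComments` array are invented names for illustration; the point is that a crawler which does not execute scripts never sees the comment text in the page source.

```javascript
// Hypothetical sketch: build the comment markup in JavaScript so that
// non-script-executing crawlers never see the comment text.
function renderComments(comments) {
  // Produce a simple unordered list, one item per comment string.
  return '<ul class="comments">' +
    comments.map(function (c) { return '<li>' + c + '</li>'; }).join('') +
    '</ul>';
}

// In the browser, inject the markup after the page loads. The guard
// keeps the snippet harmless outside a DOM environment.
if (typeof document !== 'undefined') {
  var target = document.getElementById('comments'); // assumed container id
  if (target) {
    target.innerHTML = renderComments(window.pageComments || []);
  }
}
```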

6:56 pm on Mar 9, 2007 (gmt 0)

New User

10+ Year Member

joined:May 5, 2005
posts:40
votes: 0


I was wondering about this same issue for a disclaimer that is a paragraph long and on every single one of our news pages.

7:06 pm on Mar 9, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:July 26, 2006
posts:1619
votes: 0


Couldn't you do it as an iframe? I believe they are considered a secondary page.

7:10 pm on Mar 9, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


eddytom, that is what has been called "boilerplate repetition" by Google's Adam Lasnik -- see this thread:

[webmasterworld.com...]

If you do have a long paragraph of boilerplate on many pages, you can probably help yourself by addressing it in some way. Yes, Google indexes URLs, not pages, and an iframe holds a different URL. I've also seen the boilerplate text turned into an image to address this challenge.

9:39 pm on Mar 9, 2007 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 6, 2002
posts:180
votes: 2


Thanks for the great feedback.

I think I will simply leave the comments in for now and see how things progress.

If it helps with long-tail searches, great. If not, I will move the comments section of the page into an iframe whose page carries a noindex meta element.

Again, greatly appreciate the help.