
Forum Moderators: phranque


Anyone Know a Good Way to Prevent Google from Seeing Specific Content?

Looking for a way to hide specific text without noindexing the whole page.

10:37 pm on Oct 17, 2011 (gmt 0)

New User

5+ Year Member

joined:Sept 20, 2011
posts: 2
votes: 0

Hey All,

I have a site that aggregates user ratings to determine the best products. To improve the user experience, a few months ago I wrote a script that scraped the most helpful Amazon reviews, so my visitors could see not only the best-rated products but also the most helpful reviews of those products.

Bad idea from an SEO standpoint. I got hit with a sitewide -950 penalty/filter for having scraped/duplicate content. I fixed it and added original content, but after waiting several months the filter was never lifted, so I switched domains. The 301 redirect to the new domain has been successful: the filter has been lifted and rankings are slowly increasing.

I'd still like to find a way to present the most helpful reviews from other websites without that content appearing in the page source. Does anyone know the best way to do this? I was thinking some sort of pop-up might work, but I'm not sure.

Thanks in advance.


[edited by: engine at 11:31 am (utc) on Oct 18, 2011]

9:51 pm on Oct 20, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:May 31, 2005
votes: 0

I would think Google would consider this black-hat SEO (cloaking), and you could get a penalty applied if they picked up on it.
But if you want to go ahead anyway, the cloaking forum [webmasterworld.com...] might be your best bet.
A better option would be to scrape just a snippet and link to the source site for the full review, since it sounds like you don't actually have the rights to the material in the first place.
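The snippet-plus-attribution approach suggested above could be sketched roughly like this (a minimal Python illustration; the function name, sample review text, and URL are all hypothetical, not from any real site or API):

```python
# Sketch: show only a short excerpt of a third-party review and
# link back to the source for the full text. All data below is
# made-up example data, not a real review or URL.

def snippet_with_attribution(review_text, source_url, max_words=25):
    """Truncate a review to max_words and append a link to the source."""
    words = review_text.split()
    excerpt = " ".join(words[:max_words])
    if len(words) > max_words:
        excerpt += "..."
    return f'"{excerpt}" — read the full review at {source_url}'

review = ("This blender handled everything I threw at it, from frozen fruit "
          "to nut butters, and it is still going strong after a year of use.")
print(snippet_with_attribution(review, "https://example.com/review/123",
                               max_words=12))
```

A short excerpt with a visible link back is generally treated as quotation rather than duplicate content, and it sends visitors (and link equity) to the original source instead of hiding it.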
