
Forum Moderators: phranque


Anyone Know a Good Way to Prevent Google from Seeing Specific Content?

Looking for a way to hide specific text without noindexing the whole page.


apaternite

10:37 pm on Oct 17, 2011 (gmt 0)



Hey All,

I have a site where I aggregate user ratings to determine the best products. To improve the user experience, a few months ago I wrote a script that scraped the most helpful Amazon reviews so my visitors could see not only the best-rated products but also the most helpful reviews of those products.

Bad idea from an SEO standpoint. I got hit with a -950 sitewide penalty/filter for having scraped/duplicate content. I fixed it and added original content, but after waiting several months the filter was never lifted, so I switched domains. The 301 redirect to my new domain has been successful: the filter has been lifted and rankings are slowly increasing.

I still would like to find a way to provide the most helpful reviews from other websites without that content appearing in the source code. Does anyone know the best way to do this? I was thinking some sort of pop-up might work, but I'm not sure.

Thanks in advance.

Tony

[edited by: engine at 11:31 am (utc) on Oct 18, 2011]

Dijkgraaf

9:51 pm on Oct 20, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I would think Google would consider this black-hat SEO (cloaking), and you could get a penalty applied if they picked up on it.
But if you want to go ahead anyway, the cloaking forum [webmasterworld.com...] might be your best bet.
A better option would be to scrape just a snippet and then point to the site you got it from for the full review, since it sounds like you don't actually have the rights to the material in the first place.
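A minimal sketch of that snippet-plus-attribution approach (the `buildReviewSnippet` function and the review object shape are hypothetical, not from any real API; the source URL and text are made-up example data):

```javascript
// Build a short excerpt of a review plus a link back to the source,
// rather than republishing the full review text.
// `review` is a hypothetical object: { text, sourceUrl, sourceName }.
function buildReviewSnippet(review, maxLength) {
  var text = review.text;
  if (text.length > maxLength) {
    // Cut at the last space before the limit so we don't split a word.
    var cut = text.lastIndexOf(' ', maxLength);
    text = text.slice(0, cut > 0 ? cut : maxLength) + '…';
  }
  // Link out for the full review; nofollow since it's quoted material.
  return text +
    ' <a href="' + review.sourceUrl + '" rel="nofollow">' +
    'Read the full review at ' + review.sourceName + '</a>';
}

// Example with made-up data:
var snippet = buildReviewSnippet({
  text: 'This blender is sturdy, quiet, and easy to clean after daily use.',
  sourceUrl: 'https://example.com/review/123',
  sourceName: 'example.com'
}, 40);
```

Because only a short excerpt appears in your markup and the full text lives on the original site, there is nothing to hide from the crawler in the first place.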
