danijelzi - 2:35 pm on Apr 20, 2011 (gmt 0)
stop complaining and get busy stopping the scrapers.
incrediBILL, I'm not actually complaining that someone scrapes my site. All my complaints are about the fact that after Panda, for some keywords in my niche, the SERPs look like:
#1 scraper, ads all over the page
#2 scraper, ads all over the page
#3 scraper, ads all over the page + malware
#20 relevant site, the original source.
I don't want Google to investigate who wrote an article first, I just want them to give searchers at least a normal user experience, not loads of totally irrelevant sites with pages full of irrelevant ads and the actual text 1,000 pixels below the fold.
Regarding the scrapers, I've tried a couple of things:
- blocking their IPs is effective only against those who take content from RSS feeds.
- an RSS delay doesn't help; it just delays the scraping and the spammers' high rankings.
- filing spam reports didn't help; maybe it will later.
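To illustrate the IP-blocking point, here's a rough Python sketch of denying known scraper IPs at request time. The addresses are made-up examples (documentation ranges), not real scraper IPs, and in practice this is usually done in the web server or firewall rather than in application code:

```python
# Minimal sketch: reject requests from known scraper IP ranges.
# The ranges below are placeholder documentation addresses, not real scrapers.
import ipaddress

BLOCKED = [ipaddress.ip_network(n) for n in ("203.0.113.0/24", "198.51.100.7/32")]

def is_blocked(remote_addr: str) -> bool:
    """Return True if the client IP falls inside any blocked range."""
    ip = ipaddress.ip_address(remote_addr)
    return any(ip in net for net in BLOCKED)

print(is_blocked("203.0.113.45"))  # → True (inside the blocked /24)
print(is_blocked("192.0.2.1"))     # → False
```

As noted above, this only stops scrapers with fixed addresses pulling the RSS feed directly; anyone rotating IPs walks right past it.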
However, my last article wasn't scraped by anyone even after 18 hours, and that was after I had made some changes:
- an immediate ping to PubSubHubbub
- added a rel=canonical tag to my pages.
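For anyone wanting to do the same, the PubSubHubbub publish ping is just a form-encoded POST with `hub.mode=publish` and `hub.url` set to the updated feed. Here's a small Python sketch; the hub and feed URLs are example values, not my real ones:

```python
# Sketch of a PubSubHubbub publish ping: a form-encoded POST telling the
# hub that a feed has new content. URLs below are placeholders.
from urllib.parse import urlencode
from urllib.request import Request, urlopen

def build_ping(hub_url: str, feed_url: str) -> Request:
    """Build the publish notification: hub.mode=publish, hub.url=<feed>."""
    body = urlencode({"hub.mode": "publish", "hub.url": feed_url}).encode()
    return Request(hub_url, data=body,
                   headers={"Content-Type": "application/x-www-form-urlencoded"})

def ping(hub_url: str, feed_url: str) -> int:
    # A hub replies 204 No Content when it accepts the ping.
    with urlopen(build_ping(hub_url, feed_url)) as resp:
        return resp.status

req = build_ping("https://pubsubhubbub.appspot.com/",
                 "https://example.com/feed.xml")
print(req.data.decode())
```

Fire that the moment an article is published and subscribers (including the fast-indexing pipelines) get the content before the scrapers poll the feed, which I suspect is why the combination with rel=canonical seems to be working.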
If it stays that way, I'd say the scraping problem is solved. I'll report back after more analysis on whether it helped me in the SERPs.