@mattcutts Matt Cutts
Scrapers getting you down? Tell us about blog scrapers you see: [goo.gl...] We need datapoints for testing.
Google is testing algorithmic changes for scraper sites (especially blog scrapers). We are asking for examples, and may use data you submit to test and improve our algorithms.
This form is not a spam report or a notice of copyright infringement. Use [google.com...] to report spam, or [google.com...] to file copyright complaints.
Exact query that shows a scraping problem, such as a scraper outranking the original page:
[edited by: Brett_Tabke at 1:11 pm (utc) on Aug 27, 2011]
Many newspapers print the same stories because they have syndication agreements with the news agencies.
This is quite worrying. Google doesn't have the manpower to deal effectively with scraping, so now it is, in effect, socialising the problem by getting the public to submit the details of scrapers.
It is a positive development in that it will solve a percentage of the problem. However, until Google manages to automate the process of detection, analysis, and removal, it is still going to have a massive problem.
Does Google take everyone down, even those like me who have permission?
The guy was just spinning other people's content and his whole network got nailed.
Scraping is nonsense. Google IS the biggest scraper going: every day it sends out its bot and collects extracts of web pages without permission. It's been doing this for years, and nobody cares, because they get free exposure.
Maybe Google should use the DMCAs submitted to them as data for the algo. DMCAs are reviewed manually by Google staff, so they should be trustworthy data.
That webmasters are being given a simple form, when the algorithm should have addressed content ownership long ago, is rather deflating.