We certainly have it applied to us - dropping us from 50k daily uniques to .... 2k overnight after the Feb 2 update.
Most of the "duplicate content" appears to be from sites that are actually posting info and snips from .... our site!
Google keeps telling us "you have no penalty," and I finally realized that they don't consider a filter a penalty. Pretty much the same effect, though.
Marval - any idea whether anybody has had it removed? I do not think the filter is a manual process, so removal would have to come from changes to the site itself.
Now that AdSense is so lucrative, I think that in the past year or so *hundreds of thousands* of large and small scraper sites have come online. Some are in normal formats (such as directories that post others' content), and automation is never going to detect them as "spam" - in fact, it's subjective. Our big site has a combination of original content, database material, public domain, etc. Few large sites are 100% original, so it must be very hard for the algo.
That said, they should at least come up with a better site review process - the current emails we get, telling us to just wait for things to shake out, are painfully unhelpful.
The <filter> parameter causes Google to filter out some of the results for a given search. This is done to enhance the user experience on Google.com, but for your application, you may prefer to turn filtering off in order to get the full set of search results.
When enabled, filtering takes the following actions:
* Near-Duplicate Content Filter = If multiple search results contain identical titles and snippets, then only one of the documents is returned.
* Host Crowding = If multiple results come from the same Web host, then only the first two are returned.
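The two filters above are straightforward to picture in code. Here is a minimal Python sketch of that behavior; the result structure and field names (`url`, `title`, `snippet`) are assumptions for illustration, not Google's actual data types, and the real implementation is of course not public.

```python
# Sketch of the two documented filter behaviors:
#  1. Near-Duplicate Content Filter: identical title + snippet -> keep one.
#  2. Host Crowding: only the first two results per Web host are kept.
# Result dicts with 'url', 'title', 'snippet' keys are assumed for illustration.

from urllib.parse import urlparse

def filter_results(results, max_per_host=2):
    seen_pairs = set()   # (title, snippet) pairs already returned
    host_counts = {}     # how many results we have kept per host
    kept = []
    for r in results:
        key = (r["title"], r["snippet"])
        if key in seen_pairs:
            continue  # near-duplicate: identical title and snippet
        host = urlparse(r["url"]).netloc
        if host_counts.get(host, 0) >= max_per_host:
            continue  # host crowding: this host already has two results
        seen_pairs.add(key)
        host_counts[host] = host_counts.get(host, 0) + 1
        kept.append(r)
    return kept
```

For example, a third result from the same host, or a scraper page that reproduces another site's title and snippet verbatim, would both be dropped - which matches the complaint above that original sites can get filtered when scrapers carry identical snippets.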