More and more I see smart, aggressive webmasters generating duplicate sites or near-duplicates to try to attain multiple positions within the SERPs of the various engines.
I don't see the engines doing anything about this.
In a variety of topics, including one I compete in, it is significantly effective to load up several sites representing the same business with 75-95% duplicate content and different URLs.
Within various categories it makes for effective marketing.
From the customer's perspective, in some cases it reinforces the business. In some cases it could turn a customer off... but I'll bet it works dramatically well with most prospective customers.
While the engines say this is inappropriate, they don't police it at all. I've reported it a variety of times myself, with no response or impact.
I'm seeing it more and more, and my reaction, finally, is to do the same.
As a webmaster and business person, I'm very familiar with the benefits. The risks appear to be a big fat zero.
I can't help but think that the more webmasters do this, the more junk ends up on the web.
But I believe it's the engines' responsibility to enforce this... and to date I just don't see a response.
Just wondering what you see or experience in this realm.