But other search engine experts say that occupying multiple slots in search rankings may simply be smart marketing. Greg Boser [webmasterworld.com], the founder of WebGuerrilla.com [webguerrilla.com], a search-engine marketing consultancy, likened the Gift Services sites to GMC and Chevrolet. "They have different logos and different TV commercials," he said, "but a Chevy truck is exactly the same thing as a GMC truck."
I recently learned that, on a per-page basis, pages with duplicate content tend to rank low, but I certainly don't think the whole site would be penalized (Google ranks by page anyway) unless it was blatant robbery, which would probably have to be determined by a human.
If it makes sense for your customers, do it. :D
Just show one copy: the highest-ranking one. Just like position 1, only one site can be there for the keyword, and it'll change regularly.
Visitors don't benefit from finding the same story 100 times; even if it's Reuters, that has nothing to do with it.
Only one copy should be shown. No need to "ban" and all that crap.
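Roughly like this - just a minimal sketch in Python, and the fingerprint scheme is my own invention, not anything Google has published:

```python
# Minimal sketch: collapse duplicate results, keeping only the
# highest-ranked copy of each page. The fingerprint here is a simple
# hash of normalized text; a real engine would use something fuzzier
# like shingling or simhash to catch near-duplicates.
import hashlib

def fingerprint(text: str) -> str:
    """Hash of the page text with case and whitespace normalized."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

def collapse_duplicates(results):
    """results: list of (score, url, text) tuples.
    Returns the list, highest score first, with all but the
    top-ranked copy of each duplicate group removed."""
    seen = set()
    kept = []
    for score, url, text in sorted(results, key=lambda r: -r[0]):
        fp = fingerprint(text)
        if fp not in seen:
            seen.add(fp)
            kept.append((score, url, text))
    return kept
```

An exact hash only catches identical copies, but the idea is the same either way: one copy survives, the rest get folded away.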
SN
Just show one copy: the highest-ranking one. Just like position 1, only one site can be there for the keyword, and it'll change regularly.
Interesting idea, but wouldn't that be like allowing only one news channel? Why list only one? The 'best' result will be on top anyway, so why not show the whole list?
I may have similar content to someone else, but maybe people like my layout better, or maybe they like the tools I offer, or my joke of the day. Bottom line is, maybe they would rather read the Reuters story on MY site than on the other guy's site.
It's a package deal, everybody tries to gain a competitive advantage somehow.
Who's to say what value the visitor will derive? No sense censoring relevant search results; you may be robbing the visitor of a really good site that they might like.
An important caveat here is duplicate content that is clearly intended to artificially inflate rankings, or is purely deceptive. Of course, Google then has to judge intent, and that will probably be done manually :)
We should have a Google Jurisprudence and Economic Policy category. ;)
1. Spend two months complaining to Google about how unfair it is, or
2. Spend two weeks getting a solid understanding of why those techniques work and then develop a strategy to get back in the game.
If I were a local gift basket company, I would have spent my time contacting all the other local independent gift basket companies and asking them if they would be interested in starting an online association.
If I were a local gift basket company, I would have spent my time contacting all the other local independent gift basket companies and asking them if they would be interested in starting an online association.
If mine was one of the companies suffering because of duplicate sites, I'd have been more inclined to DHL the one responsible a horse's head.
Creating duplicate sites in order to dominate the SERPs and steal business from others also robs users of their right to choose. Alternatively, you could say that you are stealing a part of every user's life, because they have to spend longer looking for choices than they otherwise would. People who create duplicate domains do so to line their own pockets at the expense of others. Just like email spammers, these people are scumbags and vermin.
In sport, in business, and in life there are rules that have to be obeyed. This is not an issue of freedom of expression (or any other constitutional right, in the US or elsewhere). Does anyone here believe the internet would be better for users if everyone created ten identical versions of their websites? No, of course not. So why do some individuals believe that they have the right to do so? And why do some members of WW believe that this practice is acceptable?
Duplicating the odd page of information (with permission) because you think your visitors will be interested, amused, or whatever is fine, provided the motivation is not to improve your SERPs. One way to prove that motivation is to disallow search engines from those pages. Using other people's original work simply to attract visitors through search engine manipulation is not fine, in my opinion.
Kaled.
and steal business from others
I beg to differ with this statement. It's not stealing business from another; it's businesses in competition. The businesses that win call it healthy competition. The ones that lose complain that it's stealing, whine about it, and report it to Google.
I believe the actual solution is to eliminate the concept of "being high in the SERPs" entirely. Why not show a different mix of results for each search? Why does the search engine assume that the visitors want the exact same result every time they search on a term? It would be far more valuable to me to get a variation in results each time.
Well, over the last week or so G seems to be moving closer to that. :)
They could easily have an 'SEO-proof' search engine if they wanted to: just change the algo randomly every month, week, whatever.
It ultimately just depends on how much stability the searcher wants in their results.
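As a rough sketch of what "change the algo randomly" could mean in practice - pure speculation on my part, and the jitter scheme and period length are made up:

```python
# Minimal sketch: perturb ranking scores with random noise whose seed
# rotates once per period, so the ordering reshuffles every month (or
# week) without being predictable in advance. Purely illustrative.
import random
import time

def period_seed(period_seconds: int = 30 * 24 * 3600) -> int:
    """Seed that changes once per period (default: roughly monthly)."""
    return int(time.time()) // period_seconds

def rerank(results, jitter: float = 0.1):
    """results: list of (score, url) tuples. Adds seeded noise of up
    to +/- jitter * score to each score, then re-sorts descending.
    The same period always produces the same ordering."""
    rng = random.Random(period_seed())
    noisy = [(score * (1 + rng.uniform(-jitter, jitter)), url)
             for score, url in results]
    return sorted(noisy, key=lambda r: -r[0])
```

Because the seed only rotates once per period, results stay stable within a month but reshuffle between months - which is exactly the stability trade-off mentioned above.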
Using multiple sites to dominate keyword x, y or z is no different than using the budget of your multinational to muscle out the small business local competition.
Yup, you can argue there as well that it's "just business" - and I'm pretty sure the Monopolies and Mergers Commission would listen to you for all of, oh, 2 minutes before they slapped a fine on you.
The only reason that dominating the SERPs is an accepted practice right now is that it's an emerging market and, frankly, there aren't enough people with power (read: power and knowledge) to be able to even begin to regulate it. The inherent freedom of speech issues, lack of geographical boundaries, lack of international agreement, and associated privacy concerns make it a minefield for any government to tackle.
IMO, the search engines should take steps to regulate themselves before governments impose restrictions.
In the UK and US we are seeing a lot of anti-spam legislation hitting the public now - it won't be too long before SE issues are tackled.
Scott
Let's say the SERPs are like this:
1. Site A
2. Site B
3. Site C
And I own Site A... I buy Site B and Site C... should they now disappear from the SERPs? Especially when they were similar in product, look, and SEO before I bought them - when, I think we all agree, it was perfectly legit.
SN
Why not show a different mix of results for each search? Why does the search engine assume that the visitors want the exact same result every time they search on a term? It would be far more valuable to me to get a variation in results each time.
I suspect that many users would find such behavior annoying. (I know I would.) Also, it defeats the whole purpose of a search engine, which is to deliver the most relevant search results--not a random selection of somewhat relevant search results.
Cheating the system is one thing, but having three identical results (different domains) in the top 10 is hardly beneficial to users. In the case discussed in the NYT article, apparently there were ten or more near-identical sites in the top SERPs. That cannot possibly be beneficial to users.
If Google wish to produce the best search results, they must eliminate the duplicates. I would also like to see a reduction in single-site duplicates. I have Google set up to show 30 results per page - why Google believe that I want to see more (near) duplicates than if I had selected 10 results per page baffles me. Absolutely crazy!
Off-topic: Google really should look again at the bias given to keywords in URLs. It's getting very silly. Eventually, they will have to reduce that bias to no more than the weight given to title/body text, and it can't come a moment too soon for me (both as a user and as a webmaster). If Google want webmasters to play fair and create good content, they should not open spamming doorways like excess bias to URLs. And they should close down other doorways, like duplicate domains.
Kaled.