No, what is being said is that while it is useful to mow the lawn, or do any number of things that have "utility," that isn't the mission of the Google search engine.
While some people might like seeing 100 duplicate product pages just to find the one that charges a nickel less for shipping, that's not the goal of the engine, and few would want it to be. That is simply not Google's business.
|Comparison is what search is all about. |
Who on earth told you that? Have you ever looked at Google's mission statement [google.com]?
|To organize the world's information and make it universally accessible and useful. |
I don't see anything even remotely similar to "organize the world's product information for the purpose of price comparison".
Everybody wants to put words into my mouth, I'm sick of it. If you can't read, don't reply.
Comparison was the essence of search before Google or the internet existed. Google didn't invent search. Search was invented before language. Google is a minor and fleeting player in search. Search will entail comparison when Google is gone. Google has no right to define search.
But Google understands this, my antagonists do not.
Oh BTW, don't take my first statement out of context now without considering the intervening dialog, I am accustomed to being attacked unfairly...
|My point was (and still is) that there is great utility in providing similar, nearly duplicate, entries in the search results. |
Then we totally disagree and my guess is that Google also disagrees with you.
I hope you don't misconstrue my statement as an unfair attack!
Oh ... and BTW, this thread is about duplicate content on Google and Google has every right to define the parameters they choose to employ for their search results!
froogle = comparison shopping
google = general string/topic matching search
so you wouldn't expect 100 similar results for blue widgets, nor would you want them.
conversely, if someone searched for "blue round widgets price location buy" then you would expect 100 fairly similar pages for the purpose of price comparison...
you guys are both right ;-)
LOL ... yes, except in this thread, we are referring to Google and not Froogle! :)
Yes, a generic search (Google), and a specialized search (Froogle) have essentially the same goal but one is better prepared to deliver specific results.
A good search strategy is a middle ground between too much information and too little because each extreme is useless.
Identical entries are of course useless, but NEARLY identical entries are not. Indeed, it is often the subtle differences that count the most.
We are indeed talking about Google (with a G) here and every entry is competing with the others on some criterion--that's what this discussion is about.
europeforvisitors posed this question:
|Why should those thousands of mostly similar pages clutter up the search results? |
And I replied "because they are competing." I used price and delivery as examples, which turned out to be a mistake because this was seized upon as some kind of proof that I was talking about a "price comparison engine," when no--I am talking about all the various things upon which web sites compete for attention. Price is indeed among these, but price is not always measured in money.
Often poor organization or inferior writing is the price a site pays, and the criterion upon which one web site is chosen over another.
Google's inability to distinguish among the quality sites and the spammy ones is the very point of this discussion.
I am so sorry if I confused you all by mentioning "price." My comments were not a discussion about Froogle, except to the extent that some mistakenly thought they were.
Search as it relates to duplicate/near-duplicate content is not a unified concept in my view. The specificity of the search should play a role in the degree of similarity that is appropriate: the more general the search, the stricter the filter on duplicate content should be.
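The idea above--that how aggressively near-duplicates get filtered could depend on a tunable similarity threshold--can be sketched with a classic shingling/Jaccard comparison. This is purely illustrative (the function names and thresholds are made up, and it is of course not Google's actual algorithm):

```python
# Minimal sketch of near-duplicate detection via word shingles and
# Jaccard similarity. A broad query might use a low (strict) threshold,
# a very specific shopping query a higher (looser) one.

def shingles(text, k=3):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def is_near_duplicate(doc1, doc2, threshold=0.8):
    """Flag two documents as near-duplicates above a similarity threshold."""
    return jaccard(shingles(doc1), shingles(doc2)) >= threshold

a = "Blue widgets for sale at the lowest price with free shipping today"
b = "Blue widgets for sale at the lowest price with free shipping now"
print(is_near_duplicate(a, b, threshold=0.5))                        # True
print(is_near_duplicate(a, "A history of widget manufacturing", 0.5))  # False
```

Raising or lowering `threshold` is the knob: a stricter filter (lower threshold) collapses more boilerplate pages into one result, which is what you might want on a general query.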
|Identical entries are of course useless, but NEARLY identical entries are not. Indeed it is often the subtle differences that are the differences that count the most. |
You still haven't explained why Google users would want to dig through hundreds or even thousands of identical (or nearly identical) pages in search of "subtle differences."
In any case, Google has the right to decide what it displays or doesn't display in its search results--just as businesses have the right to decide whether they want to develop original content or take their chances with boilerplate text. It's that simple.
|You still haven't explained... |
I have; the answer has evidently escaped you. I'll try explaining it again in another way, though you may once again ignore this (my patience is wearing thin).
There are very few who will actually sift (or think of sifting) through thousands of results. But if the answer is in that haystack it is *Google's* job to find it. The searcher's skill in choosing search terms also comes into play here, but that is a side issue. When there are thousands of similar results Google does present them, sometimes the result count is in the millions. It is the order of presentation that is at issue here and Google's ability to put them in the proper order seems to be what all the shouting is about. If Google doesn't present all the thousands or millions of *differing* results with all their subtleties it is to the detriment of search quality, the searcher's willingness or ability to peruse them all notwithstanding.
I don't know why you are obsessing on Google's "right" to do this or that, I never questioned their rights. Please read more carefully. I may express opinions about or preferences for various modes of Google behavior but I've never questioned their right to do anything.
|If Google doesn't present all the thousands or millions of *differing* results with all their subtleties it is to the detriment of search quality, the searcher's willingness or ability to peruse them all notwithstanding. |
OK, let's say I'm John Doe and I'm searching on a broad keyphrase like "Widgetco WC-1 digital camera." The possible results include:
- The manufacturer's page(s) for that camera
- 100 reviews of the Widgetco WC-1
- 1,000 retailers' pages, including 800 that use boilerplate descriptive copy
Now, if my purpose in searching was to sift through 800 boilerplate product descriptions, I might be happy to find all of those descriptions in Google's SERPs. But if my goal was to find the manufacturer's page or read reviews of the Widgetco WC-1, or even helpful commercial copy that went beyond boilerplate descriptions, I'd be satisfied with having 800 boilerplate product descriptions in the SERPs only if they were shown after everything else. Why? Because, as a user, I wouldn't want to dig through 800 boilerplate pages to find different kinds of information, analyses, points of view, etc. hidden in the clutter. Users aren't looking for a data dump; they want the search engine to filter and organize information for them.
Maybe listing duplicate content would be acceptable if:
- Users had chosen the option of including duplicate content in their search results; or...
- Pages with significant amounts of duplicate content were displayed after everything else.
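The second option above--showing pages with significant duplicate content after everything else--amounts to a sort key that pushes flagged duplicates to the bottom while preserving relevance order within each group. A minimal sketch (the URLs, scores, and `duplicate` flag are hypothetical):

```python
# Illustrative only: rank non-duplicate pages first (by relevance score,
# descending), then flagged duplicate-content pages after everything else.
results = [
    {"url": "shop1.example/wc-1", "score": 0.95, "duplicate": True},
    {"url": "reviewsite.example/wc-1", "score": 0.90, "duplicate": False},
    {"url": "widgetco.example/wc-1", "score": 0.85, "duplicate": False},
]

# False sorts before True, so non-duplicates come first;
# -score gives descending relevance within each group.
ranked = sorted(results, key=lambda r: (r["duplicate"], -r["score"]))
print([r["url"] for r in ranked])
# → ['reviewsite.example/wc-1', 'widgetco.example/wc-1', 'shop1.example/wc-1']
```

Note the boilerplate page is still *in* the results--it just can't crowd out the manufacturer's page and the reviews.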
Still, it's up to Google to decide how it wants to handle duplicate content, and we need to remember that the interests of Google and its users aren't necessarily identical with our own business agendas.
"You still haven't explained why Google users would want to dig through hundreds or even thousands of identical (or nearly identical) pages in search of "subtle differences." "
I'll try to be brief.
People want to be able to choose. According to you, we don't need so many brick & mortar malls; it's enough to have one store of each kind in a city... but that is not how this business model works. The same product can be sold in many places; when you shop, you want the ability to choose where and how.
As a test, try to write unique content for 2" nails (just 100 different descriptions). If you can accomplish that, I will be glad to hire you with a fat salary.
And if you are referring to non-commercial content, does that mean we need only one book about World War 2... eh?
Granted, you can't invent 100 distinctly different ways of describing 2" nails. To rank in Google, you need to write some short "tips" about the history of nails, some good ways to organize nails to keep them separated from the bolts, pros and cons of using a hammer versus a nail gun, etc.
It's not clear how much of this non-duplicative material you need on the page, and of course it needs to go somewhere where it won't distract serious buyers, but a bit of creativity may solve the duplicate content problem, make the site more entertaining for users, and help impress the Google algorithm.
|...we need to remember that the interests of Google and its users aren't necessarily identical with our own business agendas. |
There you go again. It must be a guilty conscience about your own business agenda; I don't think I've brought mine into the discussion. And I've never argued for the inclusion of duplicate content either.
As it happens I routinely do many searches for widgets in differing colors, sizes and styles. If I have a bias it is in favor of the beleaguered searcher faced with hundreds of phony directories.
Well, anyway you do seem to be agreeing with me that it is the result order that matters more than the inclusion of almost identical entries.
asher02's contribution is welcome and on point. :)
And the award for best polemics in a thread goes to...
uuuummm. haven't decided yet.
|People want to be able to choose, according to you, we don't need so many brick & mortar malls its enough to have one store of a kind in a city... |
Nope, I said nothing of the kind.
|but that is not how this business model work. The same product can be sold in many places, when you shop you want the ability to choose where and how. |
Sure, and to encourage choice, maybe the Yellow Pages should include boilerplate descriptions of every product sold by every store. :-)
It's often the case that the webmaster is unable to dictate content. Take for example an auction website, where people list their stuff. I am sure that many people advertise the same things, but in different locations and for different prices.
BUT the content of their ads may be similar, and Google still lists it. Then again, info sites can have similar content, as is seen by entering a search for certain information.
There are always two sides to every story, and this current forum should appreciate contrasting views.
The SEs are struggling to remain afloat in a sea of low quality pseudo websites that are filled with duplicative and/or regurgitated and/or simulated (nonsensical) content. The duplicate content filters, and the "sandbox" effects are both directly related to this battle.
In Google's efforts to prevent this junk from filling its SERPs, it is trying various ideas and methods; unfortunately, most of these seem to have side effects, including inadvertently burying good quality sites too far down the SERPs to be seen by most users.
Well established authority sites that ride high in the SERPs are unlikely to be hurt by these side effects; newer, less established sites tend to be adversely affected, and thus the tradeoffs seem more significant.
I would like to know what the difference is between a site that takes a couple of different (freely available) sources, reorganises the content and republishes it, AND some MFA sites that just "regurgitate" the content.
For example, take a site like zillow, who is having a lot of buzz around it right now...
Are they just regurgitating info, or bringing something useful to the user, even if the data was available before on separate sites....
My point is sometimes the line seems difficult to draw...
|I don't know why you are obsessing on Google's "right" to do this or that, I never questioned their rights. Please read more carefully. I may express opinions about or preferences for various modes of Google behavior but I've never questioned their right to do anything. |
Andrea99 also said:
|Google didn't invent search. Search was invented before language. Google is a minor and fleeting player in search. Search will entail comparison when Google is gone. Google has no right to define search. |
Am I reading something differently than everyone else? "Google has no right to define search."
Is the word "right" in there anywhere?
[edited by: lawman at 12:23 am (utc) on Mar. 6, 2006]
[edit reason] fixed coding [/edit]
Right on. :-)
"Granted that you can't invent 100 distinctly different ways of describing 2" nails, to rank in Google, you need to write some short "tips" about the history of nails, some good ways to organize nails to keep them separated from the bolts, pros and cons of using a hammer versus a nail gun, etc."
But I thought Google said to build your site for surfers, not for search engines. You know as well as I do that this nail history stuff does not belong there...
If I need to buy nails, am I going to read about the history of nails? And if I also need some pins... hey, what the heck, let's read some pin history... I guess the conversion rate will be high...
Now seriously, my point is that there are times when duplicate content can't be avoided. So search engines will need to figure out how to solve this without eliminating entire industries.
Liane, that's just an absurd comparison. efv was referring to Google's right to do things with its own business, not its right to play god with the language.
This is so weak that I'm not really going to take you seriously any more. It appears to be a petty group dynamic at work here. It's been amusing to expose you. :)
Natural language is too slippery to resolve along lines of logic but it shouldn't resolve so blatantly along emotional lines. I suspect you won't grasp or admit what I'm saying here.
I'm seeing all of our pages back in the main index when doing a site: search. Anyone else seeing this who was affected by the supplemental problem?
What is the IP of the data center in which you are seeing this?
On that data center, I still see supplementals for the sites I am watching.
I'm seeing supplementals for our site on 220.127.116.11
It seems that Google has decided that in case of doubt (duplicate or not) it is better to show the original content first.
The original can be deduced relatively easily, e.g. by the date of the file's creation and the site signature required for Sitemaps.
Look at the Supplemental club- Big Daddy coming thread that is growing in parallel with this.
Many directories now have just one main page at the top.
It makes sense for searches.
A widget directory is good for comparing widgets and for searching for widgets.
However, if the searcher is looking for a specific sort of widget, it's probably better to show him the widget producer first, i.e. first-hand info, and not the directory's widget page.
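The "show the original first" idea above--deducing the original by the earliest known date--could be sketched as follows. This is purely illustrative: the URLs and fields are hypothetical, and nothing here is Google's actual method.

```python
# Illustrative sketch: among a cluster of near-duplicate pages, treat
# the one with the earliest known date (e.g. first crawl or file
# creation date) as the original, and rank it first.
from datetime import date

duplicate_cluster = [
    {"url": "http://directory.example/widgets", "first_seen": date(2005, 8, 1)},
    {"url": "http://widgetmaker.example/widgets", "first_seen": date(2004, 3, 15)},
]

# The earliest-seen page is treated as the original source.
original = min(duplicate_cluster, key=lambda p: p["first_seen"])
print(original["url"])  # → http://widgetmaker.example/widgets
```

In this sketch the producer's page, seen first, would outrank the directory page that copied its content--which matches the intuition above, though in practice dates can be spoofed or wrong, so this alone would be a fragile signal.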
|I need to buy nails am I going to read about the history of nails? and if also need some pins..hey what the heck lets read some pins history....I guess the conversion rate will be high... |
A site that sells nails doesn't have to write content about the history of nails; all it needs to do is include some original sell copy.
The argument that a duplicate-content filter will "eliminate whole industries" simply isn't true. Businesses have choices: They can pursue organic search traffic by investing in product copy, they can buy advertising (as businesses do in the offline world), they can create affiliate programs, and so on. They just can't expect search engines to waste users' time with their borrowed boilerplate clutter.
I'm not sure if this statement is completely true, because on the site I am watching we have at least 200 pages that have completely unique content, whereas the other pages (the other 10,000 or whatever) are product pages. So, given what you guys are saying, what would I have to do to get those 200 totally unique pages included in this new update?