Google SEO News and Discussion Forum

Matt Cutts on duplicate
ownerrim
WebmasterWorld Senior Member 10+ Year Member
Msg#: 33336 posted 9:57 pm on Mar 1, 2006 (gmt 0)

Found this in an article, but not entirely sure what Cutts' point was. Anyone have a clue?

"Honest site owners often worry about duplicate content when they don't really have to," Google's Cutts said. "There are also people that are a little less conscientious." He also noted that different top level domains, like x.com, x.ca, are not a concern."

 

webdoctor
WebmasterWorld Senior Member 10+ Year Member
Msg#: 33336 posted 4:44 pm on Mar 2, 2006 (gmt 0)

"Honest site owners often worry about duplicate content when they don't really have to,"

If you happen to put two pages with substantially duplicate content up on a site, in my experience, Google notices this and simply ignores one of them. This is not a "penalty" as such - you just don't get two listings in the SERPs.

Seems fair to me :-)

xbase234
10+ Year Member
Msg#: 33336 posted 5:46 pm on Mar 2, 2006 (gmt 0)

I attended the Duplicate Content panel at SES that featured Matt Cutts, and the one thing certain about the issue is that no one is certain about the best approach for a webmaster.

It is outstanding of Google and Matt to meet with webmasters in this manner. But the advice on the panel was so contradictory that I found it hard to walk away with any actionable information.

For example, Matt said that many honest webmasters shouldn't be worried, which I would like to agree with, and I have generally taken this approach in the past. But many of my honest clients might have their sites unknowingly penalized for otherwise legitimate uses of duplicate content, uses that benefit the user experience. Even when the penalties are subtle, sites will not perform as well as they should (Matt has referred to it as having your "site catch a cold").

But it was also said that in other cases where duplicate content was hosted on an otherwise honest site, it would be a good idea to remove the content, which implied that a penalty might still be assessed.

The penalties for egregious dupe content abuse are obvious (how about an outright ban), but it's this subtle penalty without a trial that bothers the hell out of me. It is a major flaw in the algorithm when a quality, honest site gets dinged in favor of less relevant results due to incorrectly perceived duplicate content penalties.

If these penalties could happen to your website on an automated basis, then honest webmasters should be very concerned. And I've got to tell you, it was an overflow crowd for this panel; perhaps the most attended of any panel I've ever seen in the 7-8 SES conferences I've attended. The engines should take this as a sign that webmasters are very confused about this concept, and want to do the right thing, but don't know how.

Matt took a poll and asked the audience if they would like to exclude robots from portions of the text on a page (for example, a header paragraph that is repeated on multiple pages), and I raised my hand along with many others.

This would be great to help Google better determine where dupe content resides, but it would also help me sleep better at night knowing that I was using a defined recommended practice, rather than just guessing if I'm being honest, or wondering if my site caught a cold. I really hope this is implemented so it helps honest webmasters out, rather than keeping everyone guessing.

It is incredible of Matt and Google to address site owners head on. I would just like to see more clarity in the definitions of duplicate content, and the definitions of penalties.

King of all Sales
5+ Year Member
Msg#: 33336 posted 6:10 pm on Mar 2, 2006 (gmt 0)

We have an ecomm site and a couple of our items have 50 or more variations. The items are exactly the same in every respect except for the name of the item and the color. That means that the product description and details are the same for every one of them.

There is no way to change the description, etc. for each page without making the site look like a cobbled-together amateurish mess.

Google needs to stop screwing around and get with the program. This is how things work and they should figure out how to adapt - that is if they are serious about their own advice to webmasters to just make the site the best it can be for the end user.

ap_Rhys
10+ Year Member
Msg#: 33336 posted 6:12 pm on Mar 2, 2006 (gmt 0)

That's an excellent and very useful post, xbase234. I read the same article as ownerrim did (referenced on Matt Cutts' blog) and I found it confusing and contradictory. But it seems that the article was an accurate transcript of what was said.

Difficult to know what to do about (one's own) duplicate content on two or more international domains when the Search Engine 'experts' seem confused themselves.

tomapple
10+ Year Member
Msg#: 33336 posted 6:45 pm on Mar 2, 2006 (gmt 0)

King of All Sales...

To carry this further with regard to product pages, this can also happen across different web sites.

What I mean is, if there are several thousand online resellers of brand X's products, there are going to be an awful lot of pretty similar pages. You can rewrite the manufacturer's copy, but there are only so many ways to describe the #XYZ blue widget and what it does.

Pico_Train
5+ Year Member
Msg#: 33336 posted 6:49 pm on Mar 2, 2006 (gmt 0)

Don't forget your good old highlight, Ctrl+C, Ctrl+V webmasters out there. These guys are killing loads of us, and the only bit of knowledge we have is that the older site will get the credit.

Yeah, that's good. Get an old domain, find a newer site with good content, copy, paste. Boom, bye-bye competitor.

Great stuff.

europeforvisitors
Msg#: 33336 posted 6:55 pm on Mar 2, 2006 (gmt 0)

What I mean is, if there are several thousand online resellers of brand X's products, there are going to be an awful lot of pretty similar pages. You can rewrite the manufacturer's copy, but there are only so many ways to describe the #XYZ blue widget and what it does.

OK, but try to look at it from a user's point of view: Why should those thousands of mostly similar pages clutter up the search results?

USCountytrader
10+ Year Member
Msg#: 33336 posted 6:57 pm on Mar 2, 2006 (gmt 0)

About 2 weeks ago I watched my indexed pages drop from 350k to 18k. After looking, I discovered that because I was tracking clicks into my detail pages (so that when they hit return it would go back to the summary, no matter how deep they were), I was creating a huge dup content problem within my own site. Dumb mistake for sure. I fixed it so that if a click-tracking URL is displayed, it now uses the following meta tag.

<META NAME="Googlebot" CONTENT="nofollow">

I also realized I had a problem with www and non-www, so I put a 301 in for this. By the way, the information here is by far the best.
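
For reference, a minimal sketch of that kind of www-to-non-www 301, assuming Apache with mod_rewrite enabled and example.com standing in for the real domain (not the poster's actual setup):

# .htaccess: permanently redirect any www request to the non-www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]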

Well, today I'm down to 619 pages from 350k, when there should be around 90k, all original content. That's a loss of 80% of my traffic. But I am still rank 5, and according to rustybrick, I should be a 6 soon.

The question is, how long will it take for Google to realize that it has been fixed and reindex me? I see both Google bots crawling around 3-4k pages a day, but I'm still sinking fast. 350k pages down to 619 pages in 2 weeks. Ouch.

Or should I just rename my detail pages? Would that be quicker? Or should I just stick it out? Any information would be greatly appreciated.

Pico_Train
5+ Year Member
Msg#: 33336 posted 6:57 pm on Mar 2, 2006 (gmt 0)

I forgot to add: good post, XBASE, right on the money.

EFV, you always have a good point of view. Wise man.

Phil_S
10+ Year Member
Msg#: 33336 posted 8:24 pm on Mar 2, 2006 (gmt 0)

Next question:

Say your site has "caught a cold."

Is there any chance of getting better, or will this "cold" last forever?

andrea99
Msg#: 33336 posted 10:41 pm on Mar 2, 2006 (gmt 0)

Why should those thousands of mostly similar pages clutter up the search results?

Because they compete on price, service, delivery, etc. Giving one site a monopoly is an invitation to gouge. Stifling competition hurts everybody (except the monopolist).

texasville
WebmasterWorld Senior Member 5+ Year Member
Msg#: 33336 posted 11:10 pm on Mar 2, 2006 (gmt 0)

It's unbelievable that Google's algo is so sensitive that you would have to block it with robots.txt just because of a duplicate header, etc., on each page. What happened to uniformity in web design? Cr@p... pretty soon you will have to design unique pages and make your visitor check his address bar just to see if he's clicked out of your site each time he changes pages.

europeforvisitors
Msg#: 33336 posted 6:44 am on Mar 3, 2006 (gmt 0)

Because they compete on price, service, delivery, etc.

Maybe they need to compete with what's on their pages, too, if they want to do well in Google. For better or worse, the main Google index has a narrow mission: providing relevant, clean, easily accessible results for keyword or keyphrase searches.

phochief
5+ Year Member
Msg#: 33336 posted 1:35 am on Mar 4, 2006 (gmt 0)

Google: "If you want to rank well on our search engine, you'd better follow our rules explicitly!"

New Webmaster: "Your Greatness, what are the rules?"

Google: "We're not telling you."

<;^)

ownerrim
WebmasterWorld Senior Member 10+ Year Member
Msg#: 33336 posted 1:34 am on Mar 5, 2006 (gmt 0)

I was really wondering what Cutts meant by this part:

"He also noted that different top level domains, like x.com, x.ca, are not a concern."

walkman
Msg#: 33336 posted 1:45 am on Mar 5, 2006 (gmt 0)

>> I was really wondering what Cutts meant by this part:
"He also noted that different top level domains, like x.com, x.ca, are not a concern."

The way I see it is that you can have me.com, me.net, me.org, all pointing at the same folder and Google will solve it.

However, I would NOT try this at home :). Even IF Google's algo solves this perfectly for all, MSN, Ask and Y! might not.

dodger
10+ Year Member
Msg#: 33336 posted 2:09 am on Mar 5, 2006 (gmt 0)

Do you mean you can have me.com, me.net and me.org all exactly the same, as mirrors, and Google says it's OK?

ownerrim
WebmasterWorld Senior Member 10+ Year Member
Msg#: 33336 posted 2:45 am on Mar 5, 2006 (gmt 0)

"the way I see is that you can have me.com, me.net, me.org, all pointing at the same folder and Google will solve it."

One site, howstuffworks (aka howthingswork), seems to do this with no problems.

King of all Sales
5+ Year Member
Msg#: 33336 posted 3:06 am on Mar 5, 2006 (gmt 0)

europeforvisitors-

Google index has a narrow mission: providing relevant, clean, easily accessible results for keyword or keyphrase searches.

Sounds like you lifted that right out of the Google employee manual.

You miss the point, however. There is no correlation between quality and rank. From where I stand, the poorest-quality sites have found their way to the top of Google's SERPs.

dodger
10+ Year Member
Msg#: 33336 posted 3:07 am on Mar 5, 2006 (gmt 0)

Well, yippee, I'll just put my site on 15 domains and away she goes, all indexed, and the traffic goes through the roof. Are you serious? Can you do this?

andrea99
Msg#: 33336 posted 3:42 am on Mar 5, 2006 (gmt 0)

europeforvisitors wrote:
...narrow mission: providing relevant, clean, easily accessible results...

When one is shopping price, service, delivery, etc., competing sites are not just relevant, they are essential.

europeforvisitors
Msg#: 33336 posted 4:10 am on Mar 5, 2006 (gmt 0)

When one is shopping price, service, delivery, etc., competing sites are not just relevant, they are essential.

Sure, but Google Search doesn't index those attributes. It doesn't even pretend to index them. Why expect Google Search to be something that it isn't and has never pretended to be?

Google Search indexes text. That's its job. If you want to be sure of having your pages indexed in Google, there's a simple solution: Hire a writer. (Why is it that businesses will spend time and money on programming and SEO but think they can get by with boilerplate product copy?)

andrea99
Msg#: 33336 posted 5:15 am on Mar 5, 2006 (gmt 0)

Sure, but Google Search doesn't index those attributes.

Which makes it a rather dimwitted place for comparison shopping. And "those attributes," BTW, are given in text, making your argument ridiculous.

Whether it indexes to my preferences is not the issue. The issue is whether Google is providing a good service for searchers...
According to you, it shouldn't.

canthavejust1
10+ Year Member
Msg#: 33336 posted 5:36 am on Mar 5, 2006 (gmt 0)

Another issue to consider:

MySite.com
MySite.net
MySight.com
MySight.net
MyCite.com
MyCite.net

One points to the server; the other 5 are forwarded to the first.
Only the main URL would be submitted to G, M and Y!

The intent would not be to spam, the intent would be to catch the traffic from those who can't remember the name or spelling of your site. A lot of traffic has been lost by companies that didn't make those provisions for misspellings or typos.

dodger
10+ Year Member
Msg#: 33336 posted 6:17 am on Mar 5, 2006 (gmt 0)

All but the main URL should be permanent redirects; then only the main URL is indexed.
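
A minimal sketch of that setup for Apache, assuming the alternate names come from the hypothetical list above (MySite.net, MySight.com, etc.), they are served from the same box, and MySite.com is the main domain; Redirect is the standard mod_alias directive and carries the requested path along:

# Catch-all vhost for the alternate/typo domains: 301 every request to the main domain
<VirtualHost *:80>
    ServerName mysite.net
    ServerAlias mysight.com mysight.net mycite.com mycite.net
    Redirect permanent / http://mysite.com/
</VirtualHost>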

europeforvisitors
Msg#: 33336 posted 6:28 am on Mar 5, 2006 (gmt 0)

Which makes it a rather dimwitted place for comparison shopping.

Since when has Google Search pretended to be a comparison-shopping engine?

andrea99
Msg#: 33336 posted 6:42 am on Mar 5, 2006 (gmt 0)

Since when has Google Search pretended to be a comparison-shopping engine?

You will rest on silly technicalities like this while denying that Google's mission is to be useful.

I don't expect you to get it....

steveb
WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 33336 posted 7:26 am on Mar 5, 2006 (gmt 0)

Google is friendly to users, but expecting it to bake bread, or mow the lawn is silly. It's a search engine, not a mind reading tool.

andrea99
Msg#: 33336 posted 8:19 am on Mar 5, 2006 (gmt 0)

Google is friendly to users, but expecting it to bake bread, or mow the lawn is silly. It's a search engine, not a mind reading tool.

That isn't even remotely what I was saying. Intelligent conversation is impossible if you don't read carefully.

My point was (and still is) that there is great utility in providing similar, nearly duplicate, entries in the search results.

You are contending that Google is too stupid to do this properly. You may be right, but I don't think that's your point. Search engines are still quite primitive, and the more I examine them the more I realize that the promise is very far from reality.

Comparison shopping cuts across all lines; you can "shop" for informational sites, indeed "shop" for anything. Comparison is what search is all about. A complaint about repetitive listings in the SERPs is just a complaint that search engines aren't smart enough to read your mind. The sarcasm about baking bread is an insult and betrays a lack of understanding of this...
