Forum Moderators: Robert Charlton & goodroi


Duplicate Description and Title Notices in WMT - do they affect ranking?

         

helpnow

4:55 pm on Jul 5, 2008 (gmt 0)

10+ Year Member



GWT now contains Content Analysis. This is a great new tool. For large sites, some of the information there may previously have been difficult or impossible to discover through any other means.

For example, duplicate meta descriptions and duplicate title tags.

Is there a definite answer yet on this?

Do duplicate meta descriptions affect your site's overall ranking?

Do duplicate title tags affect your site's overall ranking?

The Google help topic is inconclusive - it says duplication will not affect whether you're indexed, but it does suggest it may affect your ranking.

We're fixing our site anyway, but I would like to know everyone else's experiences with this...

How important are these titles and meta descriptions for ranking?
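(For anyone who wants to audit this outside of GWT's Content Analysis report, the check is straightforward to script. Below is a minimal, illustrative sketch - the function names and the stored-pages structure are hypothetical, and it assumes you already have the HTML for each URL. It groups URLs that share an identical title or meta description, which is essentially what the GWT report surfaces.)

```python
from collections import defaultdict
from html.parser import HTMLParser

class TagExtractor(HTMLParser):
    """Pulls the <title> text and meta description out of one HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "").strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicates(pages):
    """pages: dict of url -> HTML source. Returns (dup_titles, dup_descs),
    each mapping a duplicated string to the list of URLs sharing it."""
    titles, descs = defaultdict(list), defaultdict(list)
    for url, html in pages.items():
        parser = TagExtractor()
        parser.feed(html)
        titles[parser.title.strip()].append(url)
        descs[parser.description].append(url)
    dup = lambda d: {k: v for k, v in d.items() if k and len(v) > 1}
    return dup(titles), dup(descs)
```

Feeding this a dict of crawled pages returns only the titles and descriptions that appear on more than one URL, each with the list of offending URLs.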

Receptional Andy

5:20 pm on Jul 5, 2008 (gmt 0)



Is there a definite answer yet on this?

The definitive answer is that it might affect the ranking of both individual pages and whole websites. Whether, and to what degree, depends on the individual site and the level of duplication.

Duplicate titles are a definite no-no. A title should be the definitive "makes sense out of context" guide to the page's content. If you don't have a unique, and well-crafted title on a page, its ranking will definitely be affected. If a page is not sufficiently unique to warrant a unique title, exclude it from search results.

Duplicate meta descriptions are less of a problem, but writing a snappy description should be a part of your content creation process. Think of the hours spent crafting text for paid search listings - you get even more text in a snippet, and a bigger audience.

I don't think it's an exaggeration to say that the title and meta description are the two most significant individual tags for on-page search engine optimisation.

Whether a bunch of duplicates can harm an entire site is less clear: it seems to harm some sites a whole lot, and other sites not at all. But if you have a site with a whole lot of pages that have slim to no chance of performing well in search results, a fix is probably overdue ;)

youfoundjake

5:22 pm on Jul 5, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Having unique meta descriptions and title tags across a site is one of the basic design rules I try to follow now.
The meta description tag has no bearing on how high the page ranks, but it does greatly increase the click-through from the SERPs.
Throughout the WebmasterWorld forums, there have been discussions about the Title structure.

Here is the best one I found:
[webmasterworld.com...]

As far as meta descriptions, check out what the gurus said here:
[webmasterworld.com...]

And of course if you haven't seen it yet, Brett's infamous post:
[webmasterworld.com...]

[edited by: Robert_Charlton at 6:02 pm (utc) on July 5, 2008]
[edit reason] fixed url [/edit]

Robert Charlton

6:32 pm on Jul 5, 2008 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Here's another on the title element... part II of the Building the Perfect Page series (part I is cited above)...

Building the Perfect Page - Part II - The Basics
Developing an effective <title> element.
[webmasterworld.com...]

Whitey

1:41 am on Jul 6, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



For example, duplicate meta descriptions and duplicate title tags.

Is there a definite answer yet on this?

Do duplicate meta descriptions affect your site's overall ranking?

Do duplicate title tags affect your site's overall ranking?

Duplicate content will affect your rankings.

However, technically it is not the same thing as a penalty. But I suspect that Google may use it as part of a penalty calculation.

The effect of multiple pages with the same content is to dilute the value of the site's pages overall, and perhaps drag it below a threshold of "trust" tolerance. My suspicion is that this may be a contributor to the various minus penalty levels.

Total duplication may contribute to -950; partial duplication may contribute to -30/-40/-50 etc., or indeed be the sole element in some cases.

Higher PR sites may escape the problem for a while until their "trust" is reviewed. This might occur when a site is recalculated for "trust" purposes.

Duplicate pages may be caused by a number of factors and, depending on the size of the site, may become quite complex to manage.

Duplicate content can be horizontal [between linked sites] and vertical [between internal pages]. From this, terms like "thin affiliates" have come up, yet internal duplication may be even more damaging.

A review of Hot Topics with regard to duplicate content may help in understanding the various facets [webmasterworld.com...]

In my view, the management of content and architecture is one of the hardest things for a site owner to be confronted with, as there is insufficient expertise and tooling consolidated in one place to handle it.

WMT is a good start.

However, I do not think it is clear enough, and duplicate content seems to be underlying and at the core of a lot of discussions, both here and elsewhere.

I do wish Google would assist by improving the notifications and reports associated with this, as I believe it would help a lot. 99.99% of site owners probably don't read forums, and Google needs good content.

Improving WMT to the point where it becomes a definitive QA interface, validating site owners [so abuse is contained], is something that can only be good IMO.

[edited by: Whitey at 1:42 am (utc) on July 6, 2008]

helpnow

2:01 am on Jul 6, 2008 (gmt 0)

10+ Year Member



Hi Whitey!

I know duplicate CONTENT is an issue. Within a site, or between sites.

But how much of an issue are duplicate title tags or meta descriptions? Not really content per se, as they don't display on the page... Can these sorts of duplication cause a penalty in ranking?

"Higher PR sites may escape the problem for a while until their "trust" is reviewed. This might occur when a site is recalculated for "trust" purposes." -> Forgive me, but is this fact, or conjecture? If fact, how often might a site be reviewed for trust?

And with respect to -950, or -40 etc., are these site-wide? If I had 100,000 pages in the index, and then I get hit with one of these penalties, does that then mean that EVERY page of mine will be -950, for example? Or just the errant pages?

g1smd

2:11 am on Jul 6, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There is plenty of evidence that duplicate titles and meta descriptions are not a good idea. They are still a part of the page content.

There are plenty of clues about that in a simple site:domain.com search. Run 100 results per page to be more clear.

helpnow

2:31 am on Jul 6, 2008 (gmt 0)

10+ Year Member



Hi g1smd! It's been some time since we last interacted...

What you say about the site:domain.com search, set to 100 per page, is an enlightening view for this discussion. I never looked at that search that way before. It does become apparent very quickly how duplicated tags become useless fast.

And that is what Google says too - tag duplicates are not great for the user... But Google never comes right out and says, "and if you have duplicate tags, we'll lower your rankings too!"

I think the message from everyone here is that your rankings will be hurt with duplicate tags.

I was confused on this point because I have not touched my tags for years. And I have been stable and doing well for the past 12 months.

I only became acutely aware of just how bad my tags were, and how similar they were to each other, when I saw the Content Analysis in GWT... And by coincidence, the same day I saw it, June 4, my rankings suffered across the board.

Hence, given how long I successfully got away with duplicate tags, I'm wondering whether they are the likely cause of my current ranking decreases, or whether something else is going on with Google right now that has nothing to do with me, even though it is affecting me...

Over here...
[webmasterworld.com...]
... confuscius said "Google suffered a major data loss and traffic loss is returning as pages become re-indexed and link relationship calculations go round the loop. 17 days to some recovery which matches my previous experiments on time stamped non supplemental pages of Google's indexing cycle."

I feel that the SERPs are shifting a lot right now, so I am still in the middle of fixing all of my duplicate tags (and I will keep going until they are all gone) in the hope that this is my problem and I am in control of my destiny ; ). But meanwhile, I cannot help wondering if my current ranking loss is actually just a Google thing that will be fixed independent of whatever positive things I may do... ; )

g1smd

2:48 am on Jul 6, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The biggest clue in the site search is the "In order to show you the most relevant results, we have omitted some entries very similar to the 47 already displayed. If you like, you can repeat the search with the omitted results included." message, and how the results count changes after you click it.

This thread is also illuminating... [threadwatch.org...] in comment #comment-43362 onwards.

Moderator note: While we generally don't allow links to outside discussions, Matt Cutts' comments on this referenced Threadwatch thread can be considered authoritative. Matt confirms an observation made by g1smd on the thread that Google did at the time collapse site:domain results because of duplicate meta description tags. Matt seems to make it clear that he is talking about site: operator searches only.

[edited by: Robert_Charlton at 8:12 am (utc) on July 7, 2008]

helpnow

2:51 am on Jul 6, 2008 (gmt 0)

10+ Year Member



Hmm... I have 358,000 pages indexed. When I get to page 10 of 100 per page, there is no page 11, and there is no omitted results message. <LOL> Should I be nervous!? ; )

Robert Charlton

6:22 am on Jul 6, 2008 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



But how much of an issue are duplicate title tags or meta descriptions?

When Google labelled Supplementals, I saw pages with duplicate titles from the same domain put in the Supplemental index, even though those pages had vastly different content. A client had all of its archived press releases titled "CompanyName - Press Release". They were all Supplemental, and they essentially didn't rank for anything. As soon as they changed the page titles to the press release headlines, the pages became extremely productive.

There's been some disagreement in WebmasterWorld discussions about meta descriptions. A great many here who run large sites with templated content that doesn't vary much from page to page feel that individualized meta descriptions can keep the pages in the index... and that no meta description is preferable to an identical meta description on every page.

On pages with highly differentiated content and titles, I've found that identical meta descriptions don't seem to affect indexing or ranking. That said, when I can (which is on smaller sites), I tune meta descriptions carefully to bring up snippets that will help with click throughs, tuning for phrases for which each page is most likely to rank.

On large dynamic sites, I've preferred to go with Google's snippets rather than with lame dynamic descriptions that clients often come up with. It very much depends on the type of content that's likely to be found on the page.

Whitey

7:13 am on Jul 6, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



But how much of an issue are duplicate title tags or meta descriptions? Not really content per se, as they don't display on the page... Can these sorts of duplication cause a penalty in ranking?

All I can do is echo what's above.

How much? Maybe -950+, -30/40/50/60+, sitewide - I don't know for sure, but I'm highly suspicious it is a contributor.

Again, managing duplicate content is one of the major tasks of web site technical administration.

helpnow

1:28 pm on Jul 6, 2008 (gmt 0)

10+ Year Member



Robert - "There's been some disagreement in WebmasterWorld discussions about meta descriptions" <LOL> Yes, there sure has been! This is what makes it so difficult to pin down the importance, and the effect... And Google is coy on the topic too, which I understand - to completely disclose would maybe reveal too much about the algo. Curious what you say about using NO tags - just leaving it to Google to provide one via a snippet - I simply do not have the nerve to put THAT fix into place and see what happens. ; ) I wouldn't test it, but I can see how it might be better than a duplicate tag... And I see your point on small vs large sites - this probably accounts for some of the disagreements - a comparison of apples to oranges in some cases. (It'd be cool if we could define ourselves a bit in our account, and then in the left column, it would say manages a big site, a small site, geo-location, etc. - it might put our posts into perspective for others...)

Much to think about...

Robert Charlton

7:59 am on Jul 7, 2008 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I've added some moderator comments in g1smd's post above about the Matt Cutts comments on the TW thread, confirming g1smd's observation about the site:domain operator at the time.

(The link on that thread to another post by Matt gives a 404, another reason we try to keep our discussions self-contained here... so they'll be useful over time.)

Note that the site:domain operator is a reporting function. I think Matt made that pretty clear in this comment...

I'll ask someone if they can change the default behavior for site: queries

If it were a ranking function, he wouldn't have been asking for that kind of change. The thread context suggests that the site:domain operator function behavior was subsequently changed (and that Matt had commented on that in the thread that gives the 404. Maybe g1smd can fill us in on what he said).

Curious what you say about using NO tags - and just leave it to Google to provide one via a snippet - I simply do not have the nerve to put THAT fix into place and see what happens. ; ) I wouldn't test it, but I can see how it might be beter than a duplicate tag...

You can check and see that Matt doesn't have any meta descriptions on his blog. I've heard him say elsewhere that for large sites he feels that the meta description tag is 'not worth obsessing about'... that 'you've got to prioritize'... and he's often very happy with the Google snippet. I've never heard him comment on unique descriptions vs duplicate descriptions, though. It's fairly clear that he thinks no description is OK.

[edited by: Robert_Charlton at 8:19 am (utc) on July 7, 2008]

g1smd

11:34 am on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You can find the article referenced at the very end of that TW thread if you take that link, cut the /archives/2006/10/06/ part out of the URL, and then replace the final slash with ".html" instead.

"Cool URIs don't change". Grrrrrrr.

Marcia

2:45 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it's been made pretty clear that meta descriptions (more than likely) don't affect rankings, but I haven't seen that kind of clarity expressed about the page title element, even with regard to duplication. Have I missed it, possibly?

My understanding is that snippet generation is a different "team" and process, with different algos, and is a function that's pretty much separate from scoring, though there's ample evidence in papers and patents that "filtering" can take place at various stages of processing.

I don't have any doubt at all that duplication across page titles DOES have an effect on rankings, and some recent/current experience with cleaning up what's probably one of the worst cases imaginable of MASSIVE title and description duplication, not on one but across two domains, is reinforcing my belief that duplicate titles are a factor that can't be overlooked.

g1smd

2:56 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The page title affects ranking for that page, that much is very clear.

The meta description shows in the snippet, but doesn't have a great bearing (if any) on ranking.

Where duplication of these elements is found, it affects results clustering in the SERPs, and the generation of the "In order to show you the most relevant results, we have omitted some entries very similar to the 85 already displayed. If you like, you can repeat the search with the omitted results included." message.

I like how sometimes you can have some number of results listed, followed by that message, and then, when you click the "omitted results" link the number of results goes down. It is sometimes illuminating to see which pages disappeared.

Quadrille

4:55 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree with most of what's been offered here; there's no doubt that duplicate titles are Very Bad News, and no doubt that duplicate meta descriptions can lead to pages being (the equivalent of ) supplemental.

We need a shorthand term for G1's "In order to show you the most relevant results, We Have Omitted Some Entries very similar to the XX already displayed. If you like, you can repeat the search with the omitted results included." message. Maybe "WHOSE Omitted Pages"? :)

In my experience, the problem self-resolves once the meta descriptions are sorted, suggesting this is a filter, and NOT a penalty. Also, while one of my sites had this problem extensively (I know, I know - pure laziness!), at no time did site stats suggest that other pages were significantly affected, if at all.

I believe this has become a major issue over the past year or so, as my pages got whacked suddenly a little over a year ago (I think).

Marcia

5:47 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>We need a shorthand term

Quadrille, I call it simply the "similarity filter" since I can never remember all the text of that whole long sentence.

I believe that's just filtering for the query results/snippet algos, to prevent duplicates from annoying users and skewing results quality.

But: I'm still convinced that the TITLE part can negatively affect pages that should rank for the keyword phrase in question, simply because duplications in titles send mixed signals to the scoring algos as to which page on the site is actually the best and most relevant for the phrase.

Added:

Incidentally, it isn't only the title that'll create confusion & duplication woes and cause pages on a site to *not* rank, but I don't think we'll ever see mention of the others in WMT. Snippet issues they'll tell us about, because those impact the perceived quality of their search results and don't reveal much more; but scoring factors they'll never tell, beyond cryptic, between_the_lines hints you have to translate from Matt_Speak.

In fact, I believe I'm seeing what's actually a penalty - a drop of about 30 or so places down to #70 for a page on its most desired search term - because of duplication in another on-page factor and dilution in yet another. I know what's causing it, but the site owner will balk at changing it.

But I digress, those other on-page & sitewide factors are a subject for a whole other thread.

[edited by: Marcia at 6:03 pm (utc) on July 7, 2008]

g1smd

5:57 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



*** we need a shorthand term ***

Here it is: "Iotsytmrr,whosevsttXXad.Iyl,ycrtswtori."

Errrr, maybe not. :-)

.

*** I can never remember all the text ***

I just go do a search that brings it up and copy/paste it in.

If I am feeling lazy and/or I know the knowledge level of the reader is high, I just call them "omitted results".

helpnow

6:18 pm on Jul 7, 2008 (gmt 0)

10+ Year Member



Marcia - Please, ; ), feel free to digress - do you have another thread started for those "other on-page & sitewide factors"?

Quadrille

7:43 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Similarity Filter"

Perfect. Why didn't I think of that? ;)

tedster

8:35 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"those other on-page & sitewide factors are a subject for a whole other thread"

Many threads, in fact. See the Hot Topics [webmasterworld.com], which is always pinned to the top of this forum's index page. Also many more threads are available with our Site Search [webmasterworld.com].

Here's how I understand the issue. It particularly affects low PR pages. As the web grows, Google has taken steps to keep their main index fast by introducing some database partitions. The first and most infamous database partition was the "supplemental index", but there certainly may be more than one today. (discussion of the partition patent [webmasterworld.com])

URLs stored in these partitions are not as fully "tagged" for search as URLs in the main index. Unique text strings from the content area may not (I'd say do not) return that URL in the search results. But the meta description and the title element are two primary areas that DO still get tagged.

So if the title and/or description are duplicate or near duplicate, you've just thrown away the best chance for still showing up in the search results. The URL as it sits in the database partition now looks no different from other, accidentally duplicated URLs.

I've seen URLs jump back into the regular index even without any increase in link juice, just from giving them a full, unique title and description. In fact, I now even give paginated articles unique descriptions and titles - and I see more relevant traffic coming to deep article pages because of this.

[edited by: tedster at 9:05 pm (utc) on July 7, 2008]
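(Tedster's practice of giving paginated articles unique titles and descriptions can be sketched in a few lines. This is an illustrative pattern only - the function names, the "Page N of M" wording, and the site name are all made up, not anything the posters prescribe.)

```python
def paginated_title(article_title, page, total, site_name="Example Site"):
    """Build a unique <title> for one page of a multi-page article.
    Page 1 keeps the plain headline; later pages get a page marker,
    so no two pages of the article share an identical title."""
    if page == 1:
        return f"{article_title} - {site_name}"
    return f"{article_title} (Page {page} of {total}) - {site_name}"

def paginated_description(summary, page, total):
    """Vary the meta description the same way, rather than repeating
    the article summary verbatim on every page."""
    if page == 1:
        return summary
    return f"Part {page} of {total}: {summary}"
```

So page 2 of a three-page "Blue Widgets" article would get the title "Blue Widgets (Page 2 of 3) - Example Site" rather than the same title as page 1.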

jackson992

8:45 pm on Jul 7, 2008 (gmt 0)

10+ Year Member



The problem is that sometimes the title cannot be different for every page, especially on a dynamic site that is selling a certain kind of blue widget, say.

tedster

8:49 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You really have only two options, as I see it. Use a unique title and description, or get a ton of PR in the page.

As an alternative approach, you might be better off working to get only one of the URLs into the main Google index, circulating your available link juice to fewer URLs. Use some form of "PageRank sculpting", such as the rel="nofollow" attribute in links.

At any rate, I don't think the WMT information means anything like a penalty, but it is a heads up about how you could do better.

[edited by: tedster at 8:50 pm (utc) on July 7, 2008]

helpnow

8:50 pm on Jul 7, 2008 (gmt 0)

10+ Year Member



jackson992 - It HAS to be different, or you will run into problems. There must be something about each blue widget that makes it different enough that it deserves its own page in the first place - material used, color, size, something... And if there isn't, or you can't extract or qualify what is different, then use meta tags to disallow Google from seeing all of the different blue widgets - just show one main, primary blue widget. You have to do something, or your site will get hurt.

(Sorry tedster, I posted at the same time as you - didn't mean to interfere... ; ) )
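(The "show Google only one primary widget" advice above usually comes down to a robots meta tag on the near-duplicate variant pages. A minimal sketch of the template logic, with hypothetical variant identifiers - the noindex/follow combination keeps the page out of the index while still letting its links pass value:)

```python
def robots_meta(variant_id, primary_variant_id):
    """Return the robots meta tag for a product-variant page: only the
    primary variant is indexable; near-duplicate variants get noindex
    (with follow, so internal links are still crawled)."""
    if variant_id == primary_variant_id:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'
```

A template would call this per page, so "blue-widget-large" (the primary) is indexed while "blue-widget-small" and friends are not.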

Quadrille

9:34 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's probably not the content that's identical - the problem is the system's inability to provide unique titles and descriptions for the sheer number of URLs that get thrown up.

The solution for dynamic sites is often to use robots.txt or noindex to stop duplicate URLs (with duplicate titles etc.) from appearing at all.

Most CMS systems are remarkably search engine unfriendly, and this kind of filtering really catches them out.
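(The robots.txt half of that solution can be checked with Python's standard-library robots parser. The rules and URLs below are made-up examples of the parameter-driven duplicate views a CMS tends to generate - /search and /print/ paths are assumptions, not anything from the thread:)

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block duplicate views while leaving
# the canonical product pages crawlable.
RULES = """\
User-agent: *
Disallow: /search
Disallow: /print/
""".splitlines()

def is_crawlable(url):
    """True if the example robots.txt rules allow any bot to fetch url."""
    rp = RobotFileParser()
    rp.parse(RULES)
    return rp.can_fetch("*", url)
```

With these rules, the canonical page (e.g. /widgets/blue) stays crawlable, while the /print/ and /search duplicates - each of which would otherwise carry a duplicate title - are blocked.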

tedster

9:39 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Agreed. I've worked with many clients where an early step was modifying their CMS to give them the kinds of control that are really required.

jimbeetle

9:44 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So Tedster, I can't quote good enough pieces from your 20758 post above, but now you've got me thinking that there we're actually talking about two duplicate content filters, one on the indexing end and one on the SERP delivery end. I think you pulled apart the discussions of both description and snippet and it elegantly makes a lot of sense.

We have a fairly good handle on how the dupe filter works on the delivery end: dupe title and dupe snippet and you're off to the omitted results. In general this is used to sniff out dupes or near dupes among different sites.

And for the indexing end the filter would apply to pages on one site and works off dupe titles and description. Google is basically saying that at first sniff those pages appear to be dupes so why waste processing power on them.

Dupe title plus dupe description among pages on the same site is an indexing filter mechanism.

Dupe title plus dupe snippet among pages on different sites is the delivery filter mechanism.

Logical?

tedster

10:09 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's a good way to sum it up, Jim. The database partition is not truly a "filter" but the end result is the same - no rankee!