Forum Moderators: Robert Charlton & goodroi

Duplicate meta descriptions - now a major problem on Google?

ichthyous

8:12 pm on Sep 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My site started losing traffic in May, and by July it had lost 40% of its traffic. In July, WMT also started reporting large numbers of dupe titles and descriptions. I was able to clean up most of the dupe titles - from 1,200 down to 13 now - and I have seen a return of about 20-25% of the lost traffic in the last two weeks. I still have quite a few duplicate descriptions (730), though, and almost all of them will have to be hand edited. Are dupe meta descriptions as heavily penalized as dupe page titles?

tedster

9:29 pm on Sep 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been hearing a lot of this kind of comment and I'm still not sure what exactly is going on. Yes, Google does seem to care a lot about meta descriptions these days, whether too short or duplicate. See this recent discussion:
[webmasterworld.com...]

But I'm not at all sure that this factor can CAUSE traffic drops when duplicate meta descriptions are introduced - or, even more, that descriptions which have always been there can suddenly become a ranking problem that never existed before.

As far as I know, the traffic would never have been there in the first place if meta description problems gave the page low-value indexing (what was known as the supplemental index).

travelin cat

9:49 pm on Sep 8, 2008 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I think it depends on what is in the content of the pages with duplicate titles or descriptions.

We rank well with Google, yet GWMT shows we have over 9,000 duplicate title tags and descriptions. These dupes are generated by our forum: when someone asks "What are Widgets?" and there are 30 replies, the title and description tags are the same for each reply - thus 30 dupes.

CainIV

2:04 am on Sep 9, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As well, a factor to consider is inbound links, which seem to be the trump card even when there are heavy amounts of duplicate meta descriptions on the site.

g1smd

9:31 am on Sep 9, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This has been a factor for at least two or three years, but is getting more attention in recent months.

Matt Cutts clarified some stuff two years ago: [threadwatch.org...]

[edited by: tedster at 10:08 am (utc) on Sep. 9, 2008]
[edit reason] make url clickable [/edit]

JoeSinkwitz

10:17 pm on Sep 10, 2008 (gmt 0)

10+ Year Member



I've been seeing a lot more sites with dupe meta descriptions get nailed and sent back 5-7 pages for every query; IMHO the filter is a bit tight, but that's summertime tweaking for you.

ziajunu

12:13 am on Sep 11, 2008 (gmt 0)

10+ Year Member



What if a site doesn't have a meta description - only a title?

g1smd

6:31 am on Sep 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google uses text from the page in that case. That's often far from optimum.

Have a look at the snippets when you do a site:domain.com search, for example.

htdawg

7:59 am on Sep 11, 2008 (gmt 0)

10+ Year Member



What about sites that have almost the same meta descriptions & titles? For example, a site <covering different locations> that uses the same titles & descriptions, with only the location changing - is that considered duplicate content?

Also, each location would have different content on the page.

<snip>

Would I have to create unique meta tags for each location?

And after that, would you have to submit a reinclusion request, or just wait for Google to spider the site again?

[edited by: Receptional_Andy at 8:09 am (utc) on Sep. 11, 2008]
[edit reason] Removed specifics as per charter [/edit]

Receptional Andy

8:16 am on Sep 11, 2008 (gmt 0)



I'd say that the presence of only minor changes (e.g. one word) to a title and description is borderline - you'll likely get away with it if the site is strong enough, but you'd be well advised to have something much more unique for each page.

Incidentally, the only time a reinclusion (now called "reconsideration") request is necessary is if you can confirm that your site has been penalised - which is not the case with the potential issues created by dupe/similar content.

htdawg

8:27 am on Sep 11, 2008 (gmt 0)

10+ Year Member



Sorry about being too specific with the examples.

Also, would duplicate descriptions & titles be one of the causes of a loss of PR for a site, or for pages on that site?

Receptional Andy

8:31 am on Sep 11, 2008 (gmt 0)



PageRank is entirely based on external links to a site.

Toolbar PageRank is a slightly different animal, and Google seem to play with that in order to toy with webmasters. You might see the grey bar phenomenon [webmasterworld.com] on pages that are duplicated or overly similar.

Generally speaking, though, PR is unrelated to content.

potentialgeek

12:14 pm on Sep 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> Are dupe meta descriptions [as] heavily penalized as dupe page titles?

I don't see sites with duplicate titles (same titles for many pages) penalized, but I have seen sites or pages that have the same title and meta description penalized or troubled.

Duplication of this kind was targeted by the phrase-based spam algo because it's a signal of overoptimization.

You have to either delete the Description or change it. If it's a big site and you can't change all identical Descriptions quickly, delete them (except for the main landing pages, including the home page).

I've had sites recover by doing little to nothing more than deleting Descriptions.

When the Title and Description are the same, the pages look to Google like autogenerated pages - created by populating the text of the description tag with the text of the title tag. Lazy or rushed webmasters have also been known to copy and paste the title tag into the Description tag; the same problems result.
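The check described here - flagging pages whose meta description is just the title pasted in - is easy to automate. Below is a minimal sketch using Python's standard-library HTML parser; the sample markup is made up for illustration, and a real crawl would need error handling this sketch omits:

```python
from html.parser import HTMLParser

class TitleDescriptionParser(HTMLParser):
    """Collect the <title> text and the meta description of one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def description_repeats_title(html: str) -> bool:
    """True when the meta description merely repeats the title."""
    p = TitleDescriptionParser()
    p.feed(html)
    # Chained comparison: the two must match AND be non-empty.
    return p.title.strip().lower() == p.description.strip().lower() != ""

page = ('<html><head><title>Blue Widgets</title>'
        '<meta name="description" content="Blue Widgets"></head></html>')
print(description_repeats_title(page))  # → True
```

Run against a folder of saved pages, a check like this surfaces the copy-and-paste cases before WMT does.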

p/g

ichthyous

2:48 pm on Sep 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'd say that the presence of only minor changes (e.g. one word) to a title and description is borderline - you'll likely get away with it if the site is strong enough, but you'd be well advised to have something much more unique for each page.

I found that simply adding a page number to the end of the title was enough to differentiate it from other titles and pull it out of dupe status. I'm not sure that will work with much longer descriptions, though. In the end I think it's worth the effort to hand edit them, and Google makes it much easier now that it identifies the paired dupes for you in WMT.
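The page-number trick is trivial to template. A minimal sketch (the title and suffix format here are hypothetical, assuming a paginated series where every page currently shares one title):

```python
def deduplicate_titles(base_title: str, page_count: int) -> list[str]:
    """Append a page number to every page after the first, so no two
    pages in a paginated series share an identical <title>."""
    titles = [base_title]
    for n in range(2, page_count + 1):
        titles.append(f"{base_title} - Page {n}")
    return titles

print(deduplicate_titles("Blue Widget Reviews", 3))
# → ['Blue Widget Reviews', 'Blue Widget Reviews - Page 2', 'Blue Widget Reviews - Page 3']
```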

I'm not sure if Google was tweaking the algo over the summer, but something sure killed my traffic from May through August. I assumed the return of most of that lost traffic was due to cleaning up dupes...but perhaps the tweaking is now working in my favor? I did gain many new links...but at a low level, i.e. social bookmark links on pages with no rank or nofollowed links.

g1smd

3:51 pm on Sep 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



WMT shows only some of the issues. The threshold to generate a warning is higher in WMT than it is to cause a problem in the real SERPs.

So, having been alerted to problems in WMT, fix those - and then use site:domain.com searches to find the others that didn't quite get flagged in WMT but are nonetheless still problematic.
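A site-wide sweep like this can also be done offline. A minimal sketch, assuming you have already extracted each page's URL and meta description (the crawl data below is made up for illustration):

```python
from collections import defaultdict

def find_duplicate_descriptions(pages: dict[str, str]) -> dict[str, list[str]]:
    """Group URLs by meta description, keeping only groups where two
    or more pages share the same description."""
    groups = defaultdict(list)
    for url, description in pages.items():
        groups[description.strip().lower()].append(url)
    return {desc: urls for desc, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl output: URL -> meta description.
pages = {
    "/widgets/blue": "All about widgets.",
    "/widgets/red": "All about widgets.",
    "/about": "Who we are.",
}
dupes = find_duplicate_descriptions(pages)
print(dupes)  # → {'all about widgets.': ['/widgets/blue', '/widgets/red']}
```

Because it sees every page, a sweep like this catches duplicates below whatever threshold WMT uses to raise a warning.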

ichthyous

4:10 pm on Sep 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Now that the supplemental tags are gone how do you identify problematic pages using site:domain.com?

g1smd

5:14 pm on Sep 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The "click for omitted results" (Google) option is helpful. Other than that, just knowing the site inside out and having an eye for detail.

Xenu's Link Sleuth can also be useful.