Forum Moderators: Robert Charlton & goodroi
But I'm not at all sure this factor can CAUSE a traffic drop when duplicate meta descriptions are introduced - and even less so if they've always been there and are only now, suddenly, a ranking problem that never existed before.
As far as I know, the traffic would never have been there in the first place if meta description problems had relegated the page to low-value indexing (what was once known as the supplemental index).
We rank well with Google, yet GWMT shows we have over 9,000 duplicate title tags and descriptions. These dupes are being generated from our forum: when someone asks "What are Widgets?" and there are 30 replies, the title and description tags are the same for each reply page - thus 30 dupes.
Matt Cutts clarified some stuff two years ago: [threadwatch.org...]
[edited by: tedster at 10:08 am (utc) on Sep. 9, 2008]
[edit reason] make url clickable [/edit]
Also, each location would have different content on the page.
<snip>
Would I have to create unique meta tags for each location?
And after that, would you have to submit a reinclusion request, or just wait for Google to spider the site again?
[edited by: Receptional_Andy at 8:09 am (utc) on Sep. 11, 2008]
[edit reason] Removed specifics as per charter [/edit]
Incidentally, the only time a reinclusion (now called "reconsideration") request is necessary is if you can confirm that your site has been penalised - which is not the case with the potential issues created by dupe/similar content.
Toolbar PageRank is a slightly different animal, and Google seem to play with that in order to toy with webmasters. You might see the grey bar phenomenon [webmasterworld.com] on pages that are duplicated or overly similar.
Generally speaking, though, PR is unrelated to content.
I don't see sites penalized for duplicate titles (the same title across many pages), but I have seen sites or pages penalized, or otherwise troubled, where the title and the meta description are identical to each other.
Duplication of this kind was targeted by the phrase-based spam algo because it's a signal of overoptimization.
You have to either delete the Description or change it. If it's a big site and you can't change all identical Descriptions quickly, delete them (except for the main landing pages, including the home page).
I've had sites recover by doing little to nothing more than deleting Descriptions.
When the Title and Description are the same, the pages look to Google like autogenerated pages, created by populating the text of the description tag with the text of the title tag. Lazy or rushed webmasters have also been known to copy and paste the title tag into the Description tag; the same problems result.
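If you want to audit your own pages for this, something like the following rough Python sketch would flag a page whose title and meta description match. The class and function names here are just illustrative, not from any particular tool:

```python
from html.parser import HTMLParser


class TitleDescriptionParser(HTMLParser):
    """Collect the <title> text and the meta description from one HTML page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = (a.get("content") or "").strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data.strip()


def title_equals_description(html):
    """True when a page's title and meta description are identical
    (case-insensitive) - the pattern described above."""
    p = TitleDescriptionParser()
    p.feed(html)
    return p.title != "" and p.title.lower() == p.description.lower()
```

You'd run this across an export of your pages and fix (or delete) the descriptions on anything it flags.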
p/g
I'd say that only minor changes (e.g. a single word) between one page's title and description and the next's is borderline - you'll likely get away with it if the site is strong enough, but you'd be well advised to have something genuinely unique for each page.
I found that simply adding a page number to the end of the title was enough to differentiate it from the other titles and pull it out of dupe status. I'm not sure that will work with much longer descriptions, though. In the end, I think it's worth the effort to hand edit them, and Google makes it much easier now that it identifies the paired dupes for you in WMT.
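The page-number trick is easy to bake into a template. A minimal sketch (the helper name is my own invention, not from any CMS):

```python
def paginated_title(base_title, page, total_pages):
    """Append a page marker so paginated listings don't all share one title.

    Page 1 keeps the base title; later pages get ' - Page N of M' appended,
    which is enough to pull them out of duplicate-title status.
    """
    if page <= 1:
        return base_title
    return f"{base_title} - Page {page} of {total_pages}"
```

So a thirty-reply thread split across three pages gets "What are Widgets?", "What are Widgets? - Page 2 of 3", and "What are Widgets? - Page 3 of 3" instead of the same title three times.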
I'm not sure if Google was tweaking the algo over the summer, but something sure killed my traffic from May through August. I assumed the return of most of that lost traffic was due to cleaning up dupes...but perhaps the tweaking is now working in my favor? I did gain many new links...but at a low level, i.e. social bookmark links on pages with no rank or nofollowed links.
So, having been alerted to problems in WMT, fix those - and then go looking with site:domain.com searches to find the other ones that didn't quite get flagged in WMT but are nonetheless still problematic.
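You can't script the site: search itself, but if you already have a crawl or sitemap export of your own pages, grouping them by title/description pair will surface the dupes WMT missed. A rough sketch, assuming you've collected each URL's title and description into a dict:

```python
from collections import defaultdict


def find_duplicate_pairs(pages):
    """pages: dict mapping URL -> (title, description).

    Returns a dict mapping each (title, description) pair - normalised to
    lowercase - to the list of URLs that share it, keeping only pairs that
    appear on two or more URLs.
    """
    groups = defaultdict(list)
    for url, (title, description) in pages.items():
        key = (title.strip().lower(), description.strip().lower())
        groups[key].append(url)
    return {pair: urls for pair, urls in groups.items() if len(urls) > 1}
```

Every group it returns is a cluster of pages that needs hand-edited (or deleted) titles and descriptions.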