Duplicate meta descriptions happen a lot more often than you would think. One reason for this is that people install multiple SEO plugins into their CMS. I recently had to clean up a WordPress blog that had three meta descriptions from three different plugins, added over time and never cleaned up.
Best case scenario is that Google just ignores the extra metadata, but even then it still hurts you. How? You have wasted a good amount of time working on something that isn't helping your site rank higher. In extreme cases it can even slow down sites (think multiple big, redundant descriptions on a mobile site).
Sorry, I should have explained myself better. I am referring to the same meta description on multiple pages (like what is reported in Google Webmaster Tools).
For example, on user-generated pages like forum posts I just use the site's primary meta description, since the user is not asked to enter one when they create a forum topic. The forum title could be used, but that is what I use for the page title. I could also try to create one automatically by using the first x number of characters from the first forum post on the page, but the text itself may not make any sense.
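For what it's worth, the "first x characters" idea can at least be made to break cleanly at a word boundary instead of mid-word. A minimal Python sketch of that approach -- the function name, the tag-stripping regex, and the 155-character limit are my own assumptions, not anything a forum package provides:

```python
import html
import re

def excerpt_description(post_text: str, max_len: int = 155) -> str:
    """Build a candidate meta description from the first post's text.

    Strips HTML tags, collapses whitespace, and truncates at a word
    boundary so the snippet does not end mid-word.
    """
    # Remove tags and decode entities (naive regex; fine for a sketch)
    text = re.sub(r"<[^>]+>", " ", post_text)
    text = html.unescape(text)
    text = re.sub(r"\s+", " ", text).strip()
    if len(text) <= max_len:
        return text
    # Cut at the last space before the limit, then add an ellipsis
    cut = text.rfind(" ", 0, max_len)
    if cut == -1:
        cut = max_len
    return text[:cut].rstrip(",.;:") + "..."

print(excerpt_description("<p>Has anyone tried the new widget API? I ran into a strange error when...</p>"))
```

Of course, this only fixes the mechanics; as noted above, a machine-cut excerpt still may not read as a sensible description.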
Does anyone think it is worth it (i.e. spending the time to recode things) to try to make all pages have unique meta descriptions? Or is it enough that 75% of the pages have a unique meta description while the other 25% or so share one?
Dupe titles and descriptions were the first two things Google used to look at when sniffing out possible duplicate pages, back when the supplementary index was a concern.
Don't know if they still test these as an initial dupe sniff test, but I'd stay away from them if at all possible. Best practice a few years ago was that it was better to do without a description rather than have dupes.
If it's a small percentage of the overall content, it probably doesn't matter. Anything less than 10 or 15%, I wouldn't worry about.
|Don't know if they still test these as an initial dupe sniff test, but I'd stay away from them if at all possible. Best practice a few years ago was that it was better to do without a description rather than have dupes. |
Really... I didn't think of not having a meta description.
How about a topic that lists items? My topics each have their own meta descriptions, but if a topic spans multiple pages (which most do), each of those pages shares the same meta description. Should the meta description for page 3 of the topic then be something like:
Some topic meta description here - Page 3
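That pattern is easy to generate mechanically. A minimal Python sketch (the function name and the 155-character limit are my own assumptions, not part of any forum software):

```python
MAX_LEN = 155  # assumed display limit; search engines vary

def paginated_description(base: str, page: int, max_len: int = MAX_LEN) -> str:
    """Give each page of a multi-page topic a distinct description
    by appending a page marker; page 1 keeps the original text."""
    if page <= 1:
        return base[:max_len]
    suffix = f" - Page {page}"
    # Trim the base so the suffix always fits inside the limit
    return base[:max_len - len(suffix)].rstrip() + suffix

print(paginated_description("Some topic meta description here", 3))
```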
|For example on user generated pages like forum posts I just use the sites primary meta description since the user is not asked to enter one when they create a forum topic. |
I'd leave it off personally.
It's not an independent scoring factor as far as I've seen or heard in testing recently. Plus, if your "primary" description doesn't "fit" the topic of the results the page is returned for -- which I'm guessing it won't in most cases -- then one of the following happens:
(best case) be changed by Google anyway.
(okay case) not be changed by Google and not generate a "more compelling reason to click" -- not hurting but not helping.
(worst case) not be changed by Google, not match the title in the SERPs or the query, and generate fewer clicks than if Google had used some of the text from the page -- which could hurt if you rely on Google and the algo doesn't do what you need it to for more clicks.
Treat it the same way as "crawl errors". If you look and your reaction is "Yes, yes, what's your point?" then ignore it. But if you look and say "Whoops!" then the robot has given you information. The last time I personally got a "duplicate meta" notice, it was on a page I'd created by copying an existing page and changing all its content. I forgot to change the meta description, so there you are. Thank you, robot; I wouldn't have noticed that.
Now, if only they'd stop ragging about short metas. Look, search engine, sometimes "about widgets" is all the human user needs to know.
Thanks for the replies.
@TheOptimizationIdiot - For the forum posts, at least, Google always changes the meta description as far as I can tell. But to be on the safe side I have just turned off using the primary meta description when none is given for the page. My rankings have been suffering, so I might as well give this a go and see if it helps, even though I expect lucy24 is right and it will probably not make much of a difference.
The answer is "YES" (with respect to my experience). I recently had the same issue with my site. All the pages on the site's blog had the same meta description, which, I think, Google minded. As a result, the SERPs were all over the place and I lost my top positions. But as soon as I fixed the description issue, Google placed my keywords back on top. I am not an SEO expert, but I assume it was the meta description hurting the rankings in the SERPs.
@Faizan_Khan - Well here's hoping it helps then. I will report back if I notice any changes.
|Does having duplicate meta descriptions hurt rankings? |
I have the same issue on multiple pages, generally paginated pages. I do, however, noindex all of my paginated pages. I still get the errors being reported, but I haven't seen any negative ranking issues because of it. If the description is valid for the page it's on, why worry about changing it?
The problem with meta descriptions in general is that they often don't show when the search phrase isn't included in the meta description. So, assuming you have a large volume of long-tail traffic (as most domains do), your compelling meta is not so compelling for those searches, which begs the question: "what is showing?"
I deleted meta descriptions in 2004 when longtail was coined.
I'll be honest, I haven't even bothered writing meta descriptions for a couple years. Google seems to show their own description in the search results anyway, so I kind of stopped caring. Haven't seen a lick of difference in search referrals.
Having duplicate meta description tags isn't necessarily hurting a site--but it's not helping.
I'm a big proponent of having a very "clean" site, and fixing everything that can be fixed:
- links to pages that redirect to other pages
- links to pages that have 404 errors
- unique title tag on every page of the site
- unique image alt tag attributes on every image
- unique meta description tag on every page of the site
I consider having a unique meta description tag on every page of the site just as important as having a unique title tag and properly adding an image alt attribute to every image.
I deal with a lot of pagination challenges on article sites. I either omit the meta description for internal pages (as others in this discussion have suggested) or hand-tailor a unique description for page two onward by cherry-picking a sweet phrase or sentence from the page and using that. In those cases, I try to visualize the final SERP and make sure the description is intriguing and promising.
I can't say that either way has given better results - they have both worked well for me.
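The cherry-picking step can be roughed out in code as a starting point. A Python sketch of that idea -- the function name and the length bounds are my own assumptions, and the final human pass over the wording is still what the post above recommends:

```python
import re

def cherry_pick_description(page_text: str, min_len: int = 50,
                            max_len: int = 155) -> str:
    """Return the first sentence on the page that fits a typical
    snippet length, as a starting point for a unique description."""
    text = re.sub(r"\s+", " ", page_text).strip()
    # Naive sentence split on terminal punctuation
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if min_len <= len(sentence) <= max_len:
            return sentence
    # Fallback: truncate the whole text at a word boundary
    cut = text.rfind(" ", 0, max_len)
    return text[:cut if cut > 0 else max_len].strip()
```

A candidate picked this way still needs the "visualize the final SERP" check by hand; the code only narrows the choices.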
FYI, I know this is a G SEO forum, but I've also been doing SEO for B/Y a lot lately (since G seems to hate us so much while the others love us), and B Webmaster Tools specifically flags "missing description" and "description too long or too short" as "HIGH severity" SEO blunders in its SEO report. Since a lot of other factors appear similar to G's and G doesn't specify one way or the other, beware.
I've got a similar problem where I have lots and lots of pages that have almost identical meta descriptions and have been thinking about testing something new.
Any suggestions for testing new meta descriptions? A/B testing? Change a few and track those more closely?
Is it even worth testing a slightly different meta with more call-to-action in it? Or is the improvement in CTR with a slightly better meta description too low to bother?
|Any suggestions for testing new meta descriptions? |
|Change a few and track those more closely? |
|Is it even worth testing a slightly different meta with more call-to-action in it? |
I'd test different lengths too ;)
|B Webmaster tools specifically flag a "missing description" and "description too long or too short" as "HIGH severity" SEO blunders under their SEO report. Since a lot of other factors appear similar to G's and G doesn't specify one way or the other, beware. |
Similar but not identical, darn it. Their cutoffs for "too short" are at least 16 characters apart. Yes, I counted. Google's appears to be a clean 50, so that puts Bing at 30 or less, assuming multiples of 5.
Thank You! Now I won't have to :) lol
A good meta description generates a greater CTR, so we should avoid duplicate meta descriptions. Put a unique meta description with keywords on every page.
It rarely helps rankings in the SERPs, but it does help generate traffic to your site.
I have a website that has an international presence, with different country-specific versions like abc.com/us and abc.com/uk. Their content is exactly the same but caters to different regions.
Will my rankings be affected if I have duplicate metas and titles for these pages?
Looking forward to your replies.
It's been a long time since I paid attention to meta descriptions... in other words, I don't bother these days and have seen no fallout in that regard. I think the search engines have deprecated the meta description because of abuse and are looking to the actual page content to present in search. Makes sense...
I do not agree with you completely. Yes, you could say that meta descriptions are less important than they used to be, but we have not reached the point where they can be ignored completely, like meta keywords.
welcome to WebmasterWorld, kshitija!
are you geotargeting by subdirectory in GWT?
Thank you phranque.
Yes, I am geotargeting these pages in Google Webmaster Tools, but they still show up in HTML Improvements with duplicate titles and metas.