
Forum Moderators: Robert Charlton & goodroi


Are duplicate meta titles and descriptions a critical issue?

     
2:17 am on Jul 25, 2015 (gmt 0)

New User

joined:July 25, 2015
posts:4
votes: 0


Hi everyone, I don't know whether there are already similar threads on this forum.
I have a website with 17,445 duplicate descriptions and 16,180 duplicate titles, as reported by Ahrefs. The reason for the duplicates is that the editors are often too lazy to fill in the title and description, so the program auto-fills a default title and description.
I just want to know: will this affect ranking? Should I make some improvements?
Thank you.
7:24 am on July 26, 2015 (gmt 0)

Junior Member

joined:Apr 22, 2015
posts:55
votes: 34


Welcome to the forums hongyanhe.

In a perfect world each page would have a unique title and an accurate description. However, some software for user-generated content tends to create duplicates like that. If the overall percentage of duplicate titles and descriptions is not too great relative to the total number of pages, I'd probably leave it alone. If the percentage is high, then I'd look at changing how the pages are put together.

You might be able to "force" users to generate their own titles rather than inheriting a default one. I'd look for programmatic solutions where possible rather than manual editing.
8:09 am on July 26, 2015 (gmt 0)

Senior Member from US 

lucy24

joined:Apr 9, 2011
posts:15869
votes: 869


With 17,445 duplicate descriptions you are probably better off setting the code to make no description at all. (Interestingly, GWT/Search-thingy will yap about long/short descriptions, and possibly about duplicates, but they don't seem to care if a description is missing entirely.)

Can I assume you don't have 16,180 pages with the identical <h1> text? Maybe you could use that for the title instead. There's got to be something unique, or the pages wouldn't exist as separate URLs in the first place.
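That "use the &lt;h1&gt; as the title" idea can be sketched with nothing but Python's standard library. This is only an illustration; the function names and the sample HTML are mine, not anything from the OP's actual system, and a real CMS would hook this into its template layer instead:

```python
from html.parser import HTMLParser

class H1Extractor(HTMLParser):
    """Collect the text of the first <h1> on the page."""
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.h1_text = ""

    def handle_starttag(self, tag, attrs):
        # Only start capturing if we haven't already captured an <h1>.
        if tag == "h1" and not self.h1_text:
            self.in_h1 = True

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.h1_text += data

def title_from_h1(html: str, default: str = "Untitled") -> str:
    """Derive a page <title> from its first <h1>, falling back to a
    deliberately generic default when no heading text is found."""
    parser = H1Extractor()
    parser.feed(html)
    return parser.h1_text.strip() or default

print(title_from_h1("<h1>Purple Widget model X370</h1><p>Specs...</p>"))
# Purple Widget model X370
```

If even the &lt;h1&gt; is duplicated site-wide, the fallback kicks in and you're no worse off than before.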
12:59 pm on July 26, 2015 (gmt 0)

Senior Member from US 

netmeg

joined:Mar 30, 2005
posts:13012
votes: 222


I agree about just leaving meta description blank. I would still probably spend some time on the page titles though. Maybe not all of them, but at least the most important pages.

Why would a search engine want to index and rank so many pages with the same title and meta description? Being too lazy to write page titles is probably not a signal I'd want to give out.
2:19 pm on July 26, 2015 (gmt 0)

Preferred Member


joined:June 19, 2005
posts: 369
votes: 18


I run a forum that generates a massive number of duplicate questions. Last year I ended up removing the meta description, and I have seen no negative effects; Google seems to generate decent descriptions just fine. I am also manually fixing the 13k duplicate titles. I do believe that is important, because you'll likely only rank for one of the duplicated titles, so you are losing out on opportunities.
3:38 pm on July 26, 2015 (gmt 0)

Senior Member from US 

themadscientist

joined:Apr 14, 2008
posts:2910
votes: 62


I'd definitely drop the descriptions, as others have said, and I'd try to do something "unique" with the titles too, as they've suggested. But if I couldn't for some reason (e.g., there was no unique heading on the page either, and the more complicated work-arounds I can think of just weren't the answer), then, with all the title rewriting going on these days anyway, I'd probably set the default title to nothing more than the word Page, leave 'em duplicated, ignore Ahrefs, and let Google decide what to show in the results.
5:26 pm on July 26, 2015 (gmt 0)

Senior Member

jimbeetle

joined:Oct 26, 2002
posts:3295
votes: 9


I'd probably set the default title to nothing more than the word Page, leave 'em duplicated, ignore Ahrefs and let Google decide what to show in the results

What? The page title, the most important element on the page, one of two chances we have to tell the search engines what the page is about, and you're going to let Google, as well as the other SEs, sort it out themselves?

And what do dupe title elements tell the SEs about the quality of a site?

"Leave 'em duplicated" tells the SEs not to waste their resources on 'em. Remember Google's supplemental index, the one it no longer talks about? Pages with dupe titles and descriptions went straight to purgatory until Google somehow got around to sorting them out. Pages are no longer marked as supplemental in the SERPs, but has the underlying treatment changed? How do other SEs handle dupe titles and descriptions? Shouldn't we be talking best practices instead of work-arounds?
7:08 pm on July 26, 2015 (gmt 0)

Senior Member from US 

lucy24

joined:Apr 9, 2011
posts:15869
votes: 869


I'd probably set the default title to nothing more than the word Page

Is the idea here to use something that sounds auto-generated? (New Document, Untitled, blank ... my test site has a bunch of 'em, intentionally, all different.) So the search engine thinks there's a minor glitch in your software and simply ignores the whole thing, while if you had 10,000+ pages all called "Purple Widget model X370" it's flagrant duplication?

The <title> is the only thing the W3C validator absolutely insists upon within <head>. So there's that. Unlike a meta description, it would not be a good idea to leave it out.

:: wandering off to see what the various browsers display if they receive a page with no <title> element ::
10:59 pm on July 26, 2015 (gmt 0)

Senior Member from US 

themadscientist

joined:Apr 14, 2008
posts:2910
votes: 62


What?

Just read the whole post. It makes more sense that way, along the same lines as "I wouldn't use meta refreshes in place of redirects, but if I couldn't use a redirect for some reason, I would use a meta refresh." -- Not something I would normally do, but if the options were what I said or nothing, I'd go with what I said, even if the why behind my reasoning might really confuse some people.
I'd definitely drop the descriptions, as others have said, and I'd try to do something "unique" with the titles too, as they've suggested. But if I couldn't for some reason (e.g., there was no unique heading on the page either, and the more complicated work-arounds I can think of just weren't the answer)...



Is the idea here to use something that sounds auto-generated? (New Document, Untitled, blank ... my test site has a bunch of 'em, intentionally, all different.) So the search engine thinks there's a minor glitch in your software and simply ignores the whole thing, while if you had 10,000+ pages all called "Purple Widget model X370" it's flagrant duplication?

Bingo!
1:21 am on July 27, 2015 (gmt 0)

New User

joined:July 25, 2015
posts:4
votes: 0


Thanks, all.
All of my important pages have duplicated titles; I think this is a serious problem.
One problem that has come up: the main page doesn't rank in Yahoo, but another page (which is not important, and has the same title, description, and keywords as the main page) does rank, and that ranking isn't steady; it can disappear at any time.
I don't know whether this is caused by the duplication.
3:14 am on July 27, 2015 (gmt 0)

Junior Member

joined:Apr 22, 2015
posts:55
votes: 34


On a re-read of your original post:

The reason for the duplicates is that the editors are often too lazy to fill in the title and description, so the program auto-fills a default title and description.
I just want to know: will this affect ranking?


It now seems to me that you are using something to generate pages, rather than users generating content. There are some programs that can produce decent-quality spun content, but it sounds like you don't have one of those.

I don't think massive page building without substance behind those pages is a viable concept in today's environment unless the site can support those pages. Best of luck, OP.
3:45 am on July 27, 2015 (gmt 0)

New User

joined:July 25, 2015
posts:4
votes: 0


It now seems to me that you are using something to generate pages, not users generating content.

Yes, we only generate the page title, description, and keywords; the content is unique, written by our editors.
4:22 am on July 27, 2015 (gmt 0)

Junior Member

joined:Apr 22, 2015
posts:55
votes: 34


Many years ago, the guy (BT) who started WebmasterWorld made a thread about how to make a successful website in Google in one year's time. I'd provide you a link, but I lost track of that thread quite a while ago.

One of the points was to produce one page of content per day. Since it would take about 44 years to produce more than 16,000 pages of content if you were doing one quality page per day, there's obviously an issue.

I don't think the existing site can be fixed, given the damage that has been done. Starting over would be a better option.
4:30 am on July 27, 2015 (gmt 0)

New User

joined:July 25, 2015
posts:4
votes: 0


Starting over isn't feasible; it would take too much time.
I don't think the existing site can be fixed with the damage that has been done.

Is that really true? If so, why do so many sites go through revisions?
5:00 am on July 27, 2015 (gmt 0)

Junior Member

joined:Apr 22, 2015
posts:55
votes: 34


Part of dealing with realities on the web is that sometimes projects fail.
People try to salvage those projects, but it makes more sense to do it right the second time.

I have many failed projects, but I learned from every one. And that is how you build successful projects.
5:06 am on July 27, 2015 (gmt 0)

Senior Member from US 

themadscientist

joined:Apr 14, 2008
posts:2910
votes: 62


If the content is unique:

Delete the description > Make it description="" if you can't delete it completely.
(It doesn't "count" any more anyway.)

Delete the keywords > Make it keywords="" if you can't delete it completely.
(They don't "count" any more anyway.)

Make the titles unique and related to the page content if at all possible. If you can't make the titles unique and related to the text of the page, use something that's *not* likely to be read as "title spamming to rank", so it will simply be ignored if it's there: something like Page, New Document, Information, Document, or Article [Date of Post], where [Date of Post] is the actual date of the post. Possibly the best "mostly duplicated, not topically oriented, but still not trying to spam titles for ranking purposes" option would be Article [ID] - Posted [Date of Post] - Site Name, where [ID] is the ID of the article and [Date of Post] is the actual date of the post.

I think that's how I would approach the situation if I had unique content at least.



If the content is not unique, then I'd probably go with the "start over" suggestion Rob_Banks made.
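The fallback title format described above can be sketched as a tiny template helper. This is just an illustration of the scheme; the function name and fields are hypothetical, and in practice this would live in whatever template layer generates the pages:

```python
from datetime import date

def fallback_title(article_id: int, posted: date, site_name: str) -> str:
    """Build a deliberately generic, non-spammy default <title> of the
    form "Article [ID] - Posted [Date] - Site Name", for pages that
    have no unique heading or topic to build a real title from."""
    return f"Article {article_id} - Posted {posted.isoformat()} - {site_name}"

print(fallback_title(12345, date(2015, 7, 26), "Example Widgets"))
# Article 12345 - Posted 2015-07-26 - Example Widgets
```

The point of the ID and date is that every page still ends up with a *distinct* title string, just not one that looks like keyword stuffing.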
4:33 pm on July 27, 2015 (gmt 0)

Senior Member from US 

netmeg

joined:Mar 30, 2005
posts:13012
votes: 222


Yep. As it stands, this site is likely never going to rank for anything useful. The business model won't work in 2015.
6:15 pm on July 27, 2015 (gmt 0)

Senior Member

jimbeetle

joined:Oct 26, 2002
posts:3295
votes: 9


Just read the whole post.

I did. And I see no good reason for a site not having unique page titles.
11:10 pm on July 27, 2015 (gmt 0)

Senior Member from US 

themadscientist

joined:Apr 14, 2008
posts:2910
votes: 62


Well, in a "perfect world" there isn't one. Unfortunately, the world I (and, it seems, many webmasters) live in isn't perfect, so sometimes we have to do the best we can with what we have to work with.

One "along the same lines" example I can think of is someone I'm working with has a site with pages that take 4 to 5 seconds to load... They got "taken" by a programmer who sold them on a system that would work and does when it's running, but...

Unfortunately, it puts their server under such a high load that it slows the site down, crashes their databases, is completely bloated, and reinvents the wheel constantly throughout the code. In a perfect world they wouldn't have to deal with the issues the system creates, but they do, and if they couldn't afford me (or someone like me) to recode the whole thing from the ground up, there would be nothing they could do about it except hope that doubling the server resources to 4 processors and 20 GB of RAM helped, even though a 4+ second page load time is completely unacceptable in 2015 and there's "no good reason" for it to take that long.



Maybe the OP in this thread is in a similar situation: they "got sold" on something they thought was okay or would work, but can't afford to have someone come in and recode it to programmatically create unique titles from garbled (or even not-garbled) HTML. That is entirely possible, because programmatically "stripping a page down to content only", then "finding the topic", then "automatically generating unique titles based on the topic of the page content" with anything "meaningful", rather than "keyword 1 keyword 2 keyword 3 keyword 4", is very challenging and requires a great deal of skill. If you don't believe me, try doing it sometime. It's *very* difficult to do with a script, even if it's only to be used on one site where you have some "set boundaries" to look within to find the info.


Bottom line is, S*** Happens sometimes, and when it does, we just have to figure it out as best we can from "where we are now". Having someone tell us there's no good reason for it happening in the first place really isn't very helpful or useful, IMO, because that's not a solution; it's just a condescending, holier-than-thou slap in the face.
4:37 am on July 28, 2015 (gmt 0)

Senior Member from US 


joined:Sept 21, 2002
posts: 772
votes: 14


Many years ago, the guy (BT) who started WebmasterWorld made a thread about how to make a successful website in Google in one year's time. I'd provide you a link, but I lost track of that thread quite a while ago.

Successful Site in 12 Months with Google Alone [webmasterworld.com...]

I keep a copy of it printed out at my desk. It gets used for each new site launch <G>