Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Need clarification on Title/keyword dupe issue

On same page or in the site?

         

cmendla

11:45 pm on Oct 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member




I'm not sure what is meant by the duplicate issues with titles and keywords. Is it:

1. When you have the exact same text in the title and keyword tags on the same page

or

2. When you have the same titles and keywords on two different pages

In my case, when I was entering a lot of meta tags, I was duplicating the title into the keywords, figuring I'd go back later (not duping from page to page, but within the page, as in #1). I'm wondering if I shot myself in the foot.

thanks

cg

tedster

12:12 am on Oct 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We were talking about case #2 -- different URLs, same title and meta description.

However, some people have reported they suspect trouble with case #1. I haven't seen an example close up, so I can't say for sure. I certainly have seen counter-examples where title tag = meta description and the URL still ranks.

However, it's a pretty weak implementation. Now that meta descriptions are in heavy play, they are worth some serious attention -- even to the point of modifying your CMS, if necessary, to allow good metas for any important URL.
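For anyone retrofitting a CMS template, here is a minimal sketch of a head section where the title and description reinforce each other without being copies of one another (the page name and wording are made up):

```html
<head>
  <!-- Title and description cover the same topic, but neither is a copy of the other -->
  <title>Blue Widgets - Prices and Sizes | Example Widget Co</title>
  <meta name="description"
        content="Compare prices and sizes across our blue widget range, with local and worldwide shipping options.">
</head>
```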

g1smd

1:05 am on Oct 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A comment by Matt Cutts [threadwatch.org] may help you here. He explains how they hide away multiple pages with the same titles or with the same meta description data.

However, when talking about using the same words for both the title and the meta description on the same page, I would have to say that I would personally expect the description to be about 3 to 4 times longer than the title, and I might expect search engine algorithms to think that way too... so go back and improve them all when you have time. Even if you just do half a dozen a day.
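Before rewriting half a dozen a day, it helps to know which pages actually share a tag. A rough sketch of a duplicate finder, assuming you have the raw HTML of your pages on hand (the regexes are deliberately naive and only standard-library Python is used):

```python
# Rough sketch: flag pages that share a title or meta description.
# The page URLs and HTML passed in are whatever you have locally.
import re
from collections import defaultdict

def extract(html):
    """Pull the <title> and meta description out of raw HTML."""
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.I | re.S)
    return (title.group(1).strip() if title else "",
            desc.group(1).strip() if desc else "")

def find_duplicates(pages):
    """pages: dict of url -> html. Returns tags shared by 2+ URLs."""
    seen = defaultdict(list)
    for url, html in pages.items():
        title, desc = extract(html)
        seen[("title", title)].append(url)
        seen[("description", desc)].append(url)
    return {k: v for k, v in seen.items() if len(v) > 1}
```

Anything this reports is a candidate for a rewrite; a real site would want a proper HTML parser, but the idea is the same.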

Monkscuba

2:21 am on Oct 11, 2006 (gmt 0)

10+ Year Member



Very interesting. At my previous job, I always made the descriptions long. I've been at this new job about a year, and have now finally taken responsibility for web updates from the boss (who is also a bit of a web nerd like me and didn't really want to give it up!). I'll talk with him today, because he has so many pages -- most of them, I'd say -- with descriptions the same as the title and no extra text. Also product subpages which all have the same titles and descriptions as each other. So Google is attaching more importance to meta descriptions now? I always thought it was important, which is why I paid attention to them at the old job. Looks like we need to do some work on our sites now.

Thanks to WW for the heads up.

Pirates

2:28 am on Oct 11, 2006 (gmt 0)



It's crap, isn't it? These days Google's rush for content has turned it into a meta search engine. I personally couldn't care less how many pages they crawl; I'm just interested in the accuracy of the search results.

triumph

3:16 am on Oct 11, 2006 (gmt 0)

10+ Year Member



>> personally couldn't care less how many pages they crawl just interested in accuracy of the search results <<

I couldn't agree more. If you have multiple pages on one site with the same content and keywords, why can't Google send the dupes to supplemental and rank the original page? Why is it all or nothing?

Let's stop the insanity. Google's convoluted indexing is putting a huge burden on webmasters to conform to their flawed algorithm, and diminishing the quality of sites. Moreover, they are making it more difficult for the average webmaster to get a decent ranking.

Sorry to rant, but this duplicate content issue really irks me.

g1smd

9:28 am on Oct 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> If you have multiple pages on one site with the same content and keywords, why can't Google send the dupes to supplemental and rank the original page? <<

They do. That's what the dozen other threads about dupe content are all talking about.

fraudcop

10:08 am on Oct 11, 2006 (gmt 0)

10+ Year Member



Having to modify too many descriptions, to save time and make them all different, I used the same keyword three times inside each description:

<< sell red-blu widgets, buy red-blu widgets, trade red-blu widgets locally or worldwide >>

<< sell white-black widgets, buy white-black widgets, trade white-black widgets locally or worldwide >>

I don't know yet whether they will be considered different this way.

g1smd

10:12 am on Oct 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



They are different, sure, but now they might be rated as spammy....

What do you think your visitors' reaction to that wording will be when they see it in the snippet?

Will they click?

fraudcop

11:33 am on Oct 11, 2006 (gmt 0)

10+ Year Member



g1smd

The original description, the same for all the pages, made all the pages become supplemental:

<< buy, sell, trade widgets locally and worldwide... >>

Knowing that only a one-keyword difference was not enough, and not being able to make 500 different descriptions, I had no choice.

I hope a spammy-looking description won't give me another penalty.

photopassjapan

12:11 pm on Oct 11, 2006 (gmt 0)

10+ Year Member



Um...
We're talking about meta descriptions right?
NOT keywords.
This is important.

Meta descriptions being the same (even with the titles and content being unique) make pages fall out for keywords they're not unique to (i.e. anything that can be found on other pages as well).

But this does not make pages supplemental.
This isn't some rule, this is just what we experienced.

If this is the only problem, updating the descriptions solves it.
During the crawls to follow, G picks them up and lists unique pages.
That's all.

Perhaps using a comma after every other word made G think that, oh... someone is trying to use the description to list keywords? >=)
Not sure if there's such a filter, but it would make sense, actually.

g1smd

12:19 pm on Oct 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would hope and expect that the Algo had a section like this:

"Is meta description written in flowing language (one point)?"
"Is meta description a list of keywords (minus one point)?"
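Nobody outside Google knows whether any such check exists; purely as an illustration of the kind of heuristic described above, a toy scorer might look like this (the thresholds are invented for the example):

```python
# Toy illustration only -- a guess at the kind of heuristic described
# above, not anything Google has confirmed. Flowing prose earns a point;
# a comma-separated keyword list loses one.
def score_description(desc):
    words = desc.split()
    if not words:
        return -1  # empty description: worthless
    commas = desc.count(",")
    # A "keyword list" has a comma after nearly every word or phrase.
    if commas >= len(words) / 2:
        return -1
    # "Flowing language": a reasonable word count, few commas.
    if len(words) >= 8 and commas < len(words) / 4:
        return 1
    return 0
```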

AndyA

12:34 pm on Oct 11, 2006 (gmt 0)

10+ Year Member



My site had unique page titles for each page, although some of them did not vary by much, but they did describe what the page was about. My meta tag descriptions were somewhat generic, not exactly the same on all pages, but very similar on most and exact duplicates on some pages.

This was enough to throw my site into supplemental hell. Of course, I added a forum about the same time and it was issuing session IDs left and right, along with a double slash "//" in some URLs, but it would work without the double slash as well. Can you say duplicate content?

The thing that gripes me is Google never used to pay any attention at all, or very little, to meta tag descriptions. Now they do. The hypocrisy of something going from almost no value to so important it can knock you out of the index is ridiculous. Did Google suddenly wake up and go, "Oops, we forgot to consider the meta descriptions in the algo. Better go back and flip that switch on..."

At any rate, I'm still waiting to emerge from supplemental hell. I've deleted most of the duplicate descriptions, put a big dent in writing new unique ones, and have got my server to not serve pages unless the URLs are correct.

I disallowed questionable parts of my site that I intend to rework in the coming months, so if Google will just pay attention now hopefully things will get better.

I am seeing some pages drop the supplemental results notation, but my site is still buried in the SERPs.

photopassjapan

1:08 pm on Oct 11, 2006 (gmt 0)

10+ Year Member



All i can say is... again...
A descriptive but identical meta description won't bury your pages as supplemental.
In case that was your sole problem.

Same pages being accessible from many URLs may have been the primary reason. I mean we're talking about two different things here.

Same meta tag is not exactly duplicate content.

It's... "very similar" content, as the link for the "omitted results" reads at the bottom of the last SERPs page. Those omitted may or may not be supplemental; that's another issue.

Now the same page appearing on more than one URL is duplicate content.

Fair and square, the algo has yet to work out that these URLs aren't plagiarism, or someone lazy enough to put up two sites with the same content.
When in fact, most of the time, they were just different routes to the same file on a server, or the same content being pulled from the same database. They just can't tell, because there are too many possibilities, so they label each and every page whose content they seem to recall from somewhere else (a different URL) as the duplicate of another page. Content stolen or reproduced. G always assumes the worst ;)

Meaning the original should stay, the rest should go.
But since they don't... can't spider the net in real time, not even when they want to compare such URLs... more often than not they can't even tell which one is the original. The result, thus, is that either the original is a goner, or all of these URLs are made supplemental. Why not. :P

Many people on WW (have) t(h)anked in G because of this high level of caution the last few months... right?

For instance, if a site has a mirror site, which wasn't all that rare until a few years ago, the main site would be untouched, while the mirror site is supplemental. Which would make sense.
Only that...

It doesn't work. >=)

So don't leave any pages accessible on more than one URL.
For that's dupe content.
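One common way to enforce that on an Apache server is to 301-redirect every alternate URL onto a single canonical one. A sketch only, assuming mod_rewrite is enabled and using www.example.com as a placeholder host:

```apache
# Sketch: collapse alternate URLs onto one canonical form (.htaccess).
# Assumes Apache with mod_rewrite; www.example.com is a placeholder.
RewriteEngine On

# Send the bare domain to the www host (or the other way round -- pick one).
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# If the request line contained doubled slashes, redirect to the
# per-directory path, which Apache has already collapsed to single slashes.
RewriteCond %{THE_REQUEST} \s[^?]*//
RewriteRule ^(.*)$ /$1 [R=301,L]
```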

Meta descriptions do not count as dupe content; they won't make your pages supplemental, only "omitted" when you or others have pages with the same relevancy but a unique description.

And on the issue of meta tags...

When i was making a hobby site with friends, i was "optimizing" the pages. Only that i called it "design" back then :P

Even though Google was already saying meta tags were obsolete and we shouldn't use them... we did, just in case. Because they're there for a reason, because while included in the code they will make *some* difference, and because at that time AltaVista, Yahoo, MSN and who knows how many other SEs had yet to break the habit of spidering them.

And the site never suffered a single blow from Google, because it was optimized not only for the moment but with searches in general in mind.

That's all i have to say :)
Although i'm editing it for the 2nd... no 3rd time.
Typos everywhere.

Broadway

2:28 pm on Oct 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If everything old is new again, possibly I should review and polish my Keywords meta tags while reviewing and revising my Description meta tags, as a way of being positioned for future algo changes.

photopassjapan

2:51 pm on Oct 11, 2006 (gmt 0)

10+ Year Member



You know... the exact same thing crossed my mind while typing the previous post -.-

AndyA

3:33 pm on Oct 11, 2006 (gmt 0)

10+ Year Member



I am doing my meta keywords at the same time I'm updating the description tags.

I know my site has been hit with a duplicate content penalty, and I think my timing with Google was extremely bad. My site tanked in November 2004, right in the middle of the month. Traffic plummeted within a 48-hour period, and has never really come back, although some of my pages have been #1 in the months since. Overall, though, my site is difficult to find in Google.

I think the forum hurt me with its session IDs and double slashes, and a few months prior to that, I added a section to my site which uses javascript to display eBay auctions. Google can't see the links to the auctions, because they are generated when the browser loads the page, so I think those pages look very similar.

At any rate, that section has been disallowed via robots.txt now, many of the linking problems have been fixed with mod_rewrite (thanks to jpMorgan!), and I've also disallowed a lot of the forum directories to reduce duplication. So, we'll see.
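For reference, the robots.txt side of that kind of cleanup might look something like this (the directory names here are hypothetical; Disallow matches by path prefix):

```
# Hypothetical robots.txt: keep crawlers out of sections that spawn
# near-duplicate URLs (session IDs, script-generated auction pages).
User-agent: *
Disallow: /forum/profile/
Disallow: /forum/search/
Disallow: /auctions/
```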

I am seeing some pages drop the supplemental results tag, so hopefully it's the beginning of a trend.

Oh, and regarding my meta description tags, they essentially said "Visit www.example.com for all the latest Widget information. We are Widget Central!"

So, not a great tag, but that was the consensus at the time. Live and learn.

triumph

4:00 pm on Oct 11, 2006 (gmt 0)

10+ Year Member



>> If you have multiple pages on one site with the same content and keywords, why can't Google send the dupes to supplemental and rank the original page? <<

>> They do. That's what the dozen other threads about dupe content are all talking about. <<

I guess this has already been said: "Then if you have legitimate duplicate content/supplementals, don't worry about it."