Forum Moderators: Robert Charlton & goodroi
1. When you have the exact same text in the title and keyword tags on the same page,
or
2. When you have the same titles and keywords on two different pages.
In my case, when I was entering a lot of meta tags, I was duping the title into the keywords, figuring I'd go back later (not duping from page to page, but within the page, as in #1). I'm wondering if I shot myself in the foot.
thanks
cg
However, some people have reported that they suspect trouble with case #1. I haven't seen an example up close, so I can't say for sure. I certainly have seen counterexamples where title tag = meta description and the URL still ranks.
However, it's a pretty weak implementation. Now that meta descriptions are in heavy play, they are worth some serious attention -- even to the point of modifying your CMS, if necessary, to allow good metas for any important URL.
However, when talking about using the same words for both the title and the meta description on the same page, I would personally expect the description to be about 3 to 4 times longer than the title, and I might expect search engine algorithms to think that way too... so go back and improve them all when you have time, even if you just do half a dozen a day.
Thanks to WW for the heads up.
Personally, I couldn't care less how many pages they crawl -- I'm just interested in the accuracy of the search results.
I couldn't agree more. If you have multiple pages on one site with the same content and keywords, why can't Google send the dupes to supplemental and rank the original page? Why is it all or nothing?
Let's stop the insanity. Google's convoluted indexing is putting a huge burden on webmasters to conform to its flawed algorithm, and diminishing the quality of sites. Moreover, they are making it more difficult for the average webmaster to get a decent ranking.
Sorry to rant, but this duplicate content issue really irks me.
<< sell red-blu widgets, buy red-blu widgets, trade red-blu widgets locally or worldwide >>
<< sell white.black widgets, buy white.black widgets, trade white.black widgets locally or worldwide >>
I don't know yet whether they will be considered different this way.
The original description, which was the same for all the pages,
made every page go supplemental:
<< buy, sell, trade widgets locally and worldwide ... >>
Knowing that a one-keyword difference was not enough, and not being able to write 500 different descriptions, I had no choice.
I hope a spammy-looking description will not give me another penalty.
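For what it's worth, writing 500 unique descriptions by hand isn't the only option -- a small script can template them from each page's own attributes, rotating the phrasing so pages don't end up one keyword apart. A minimal sketch (the product data, template wording, and function names here are all made up for illustration, not anyone's actual catalog):

```python
# Sketch: build a unique meta description per widget page from its own
# attributes, rotating templates so adjacent pages get different phrasing.
# All product data below is invented for the example.

TEMPLATES = [
    "Buy, sell, and trade {color} widgets {scope}. {extra}",
    "Looking for {color} widgets? Trade them {scope}. {extra}",
    "{color} widgets for sale: buy or trade {scope}. {extra}",
]

def make_description(index, color, scope, extra):
    """Pick a template by page index so neighbors differ in structure."""
    template = TEMPLATES[index % len(TEMPLATES)]
    return template.format(color=color.capitalize(), scope=scope, extra=extra)

pages = [
    ("red-blue", "locally or worldwide", "New listings daily."),
    ("white-black", "locally or worldwide", "Free seller accounts."),
    ("green", "in your area", "Collector editions welcome."),
]

descriptions = [make_description(i, *page) for i, page in enumerate(pages)]
for d in descriptions:
    print(d)

# Every description should be unique, not just one keyword apart.
assert len(set(descriptions)) == len(descriptions)
```

With more templates and more per-page fields (price range, region, category), the variation compounds quickly across a few hundred pages.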
Meta descriptions being the same (even with the titles and content being unique) make pages fall out for keywords they're not unique to (i.e. anything that can be found on other pages as well).
But this does not make pages supplemental.
This isn't some rule, this is just what we experienced.
If this is the only problem, updating the descriptions solves it.
During the crawls to follow, G picks them up and lists unique pages.
That's all.
Perhaps using a comma after every other word made G think that oh... someone is trying to use the description to list keywords? >=)
Not sure if there's such a filter, but it would actually make sense.
This was enough to throw my site into supplemental hell. Of course, I added a forum about the same time, and it was issuing session IDs left and right, along with a double slash "//" in some URLs, though pages would work without the double slash as well. Can you say duplicate content?
The thing that gripes me is Google never used to pay any attention at all, or very little, to meta tag descriptions. Now they do. The hypocrisy of something going from almost no value to so important it can knock you out of the index is ridiculous. Did Google suddenly wake up and go, "Oops, we forgot to consider the meta descriptions in the algo. Better go back and flip that switch on..."
At any rate, I'm still waiting to emerge from supplemental hell. I've deleted most of the duplicate descriptions, put a big dent in writing new unique ones, and have got my server to not serve pages unless the URLs are correct.
I disallowed questionable parts of my site that I intend to rework in the coming months, so if Google will just pay attention now hopefully things will get better.
I am seeing some pages drop the supplemental results notation, but my site is still buried in the SERPs.
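The double-slash duplicates can also be closed off at the server so engines only ever see one URL per page. A sketch for Apache's mod_rewrite in .htaccess -- this is one common recipe, not necessarily what the poster's mod_rewrite fix looked like:

```apache
# If the raw request line contains a double slash anywhere in the path,
# 301-redirect to the slash-collapsed path ($0 is the per-directory
# path, which Apache has already merged), so crawlers keep one URL.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s[^?]*//
RewriteRule ^.*$ /$0 [R=301,NE,L]
```

The 301 matters: a plain rewrite would still serve the same content on two URLs, while the permanent redirect tells the engine which one is canonical.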
Same pages being accessible from many URLs may have been the primary reason. I mean we're talking about two different things here.
Same meta tag is not exactly duplicate content.
It's... "very similar" content... as the link for the "omitted results" reads, at the bottom of the last SERPs. Those omitted may or may not be supplemental, that's another issue.
Now the same page appearing on more than one URL is duplicate content.
Fair and square, the algo has yet to learn to identify these URLs as not being plagiarism, or not someone lazy enough to put up two sites with the same content.
When in fact, most of the time, they are just different routes to the same file on a server, or the same content being pulled from the same database. They just can't tell -- there are too many possibilities -- hence they label each and every page whose content they seem to recall from somewhere else (a different URL) as the duplicate of another page. Content stolen or reproduced. G always assumes the worst ;)
Meaning the original should stay, the rest should go.
But since they don't... can't spider the net in realtime, not even when they want to compare such URLs... more often than not they can't even tell which one is the original. The result thus is either the original is a goner, or all of these URLs are made supplemental. Why not. :P
Many people on WW have tanked in G because of this high level of caution over the last few months... right?
For instance, if a site has a mirror site -- which wasn't all that rare until a few years ago -- the main site would be untouched, while the mirror site goes supplemental. Which would make sense.
Only that...
It doesn't work. >=)
So don't leave any pages accessible on more than one URL.
For that's dupe content.
Meta descriptions do not count as dupe content; they won't make your pages supplemental, only "omitted" when you or others have pages with the same relevancy but a unique description.
And on the issue of meta tags...
When I was making a hobby site with friends, I was "optimizing" the pages. Only I called it "design" back then :P
Even though Google was already saying meta tags were obsolete and we should not use them... we did, just in case. Because they're there for a reason, because while included in the code they will make *some* difference, and because at that time, AltaVista, Yahoo, MSN and who knows how many other SEs had yet to break the habit of spidering them.
And the site never suffered a single blow from Google, because it was optimized not only for the moment, but with searches in general in mind.
That's all i have to say :)
Although I'm editing it for the 2nd... no, 3rd time.
Typos everywhere.
I know my site has been hit with a duplicate content penalty, and I think my timing with Google was extremely bad. My site tanked in November 2004, right in the middle of the month. Traffic plummeted within a 48-hour period and has never really come back, although some of my pages have been #1 in the months since. Overall, though, my site is difficult to find in Google.
I think the forum hurt me with its session IDs and double slashes, and a few months prior to that, I added a section to my site which uses javascript to display eBay auctions. Google can't see the links to the auctions, because they are generated when the browser loads the page, so I think those pages look very similar.
At any rate, that section has been disallowed via robots.txt now, many of the linking problems have been fixed with mod_rewrite (thanks to jpMorgan!), and I've also disallowed a lot of the forum directories to reduce duplication. So, we'll see.
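A robots.txt along these lines would cover both the JavaScript auction section and the session-ID-happy forum directories -- the paths here are placeholders, not the poster's actual site layout:

```
# Sketch of a robots.txt; directory names are assumptions for illustration.
User-agent: *
# The eBay-auction section whose links are built client-side anyway:
Disallow: /auctions/
# Forum areas that issue session IDs and spawn duplicate URLs:
Disallow: /forum/
```

Keep in mind robots.txt only stops crawling, not indexing of URLs already known from links, so it works best alongside the redirect and session-ID fixes rather than instead of them.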
I am seeing some pages drop the supplemental results tag, so hopefully it's the beginning of a trend.
Oh, and regarding my meta description tags, they essentially said "Visit www.example.com for all the latest Widget information. We are Widget Central!"
So, not a great tag, but that was the consensus at the time. Live and learn.
>> If you have multiple pages on one site with the same content and keywords, why can't Google send the dupes to supplemental and rank the original page? <<
They do. That's what the dozen other threads about dupe content are all talking about.
I guess this has already been said: "Then if you have legitimate duplicate content/supplementals, don't worry about it."