Forum Moderators: Robert Charlton & goodroi
Meta tags and more - from <head> to </head> [webmasterworld.com]
For Google, although not required, the meta description can be key in helping your pages avoid being seen as near duplicates and ending up in the supplemental index. This area is a very "hot topic" right now. Here are two reference threads that should bring you up to speed.
Duplicate Content [webmasterworld.com]
Supplemental Results [webmasterworld.com]
I would say that it is better to make sure you have everything else lined up before worrying about adding meta descriptions to pages that don't already have them.
Of course it is. If the description really describes what is on the page, then it's only normal that it matches some of the text.
Google thinks logically. Think logically as well and you'll get good results.
I think Google also uses meta descriptions as one part of a dup content detection code.
I find it best to avoid using the same, or extremely similar meta descriptions.
Example:
widgets/blue is a subdirectory of all your blue widgets (you have a different directory for the other colors)
If you put:
"Our site specializes in blue widgets, here you can buy one"
into EVERY meta description on every page in this directory, you may find these pages turning supplemental.
Even a slight change is something to make me worry, like:
"Our site specializes in blue widgets, here you can buy a special blue widget", where the word "special" would be different on every page. This is still too close for my comfort.
A better one would be:
"special blue widget (something unique: size, dimensions or something)"
Keep all meta descriptions different but as descriptive as possible. Don't write it for the search engines, write it for the viewer. Read the description yourself and honestly ask: "If I were looking for blue widgets, would this description prompt me to enter the site?"
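If your pages are template-generated, you can enforce the "unique and descriptive" rule in the generator itself. Here's a minimal sketch of the idea, assuming a made-up catalog structure; the field names (name, size, material, price) and URLs are purely illustrative, so substitute whatever your site actually stores:

```python
# Hypothetical sketch: build an item-specific meta description for each
# product page from its own attributes, then fail loudly if any two
# pages would end up with the same one.

def build_description(item: dict) -> str:
    """Compose a human-readable description from this item's attributes."""
    return (f"{item['name']} in {item['size']}, made of {item['material']}. "
            f"Buy online for ${item['price']:.2f}.")

def unique_descriptions(items: list) -> dict:
    """Map each page URL to its description, raising on duplicates."""
    seen = {}
    for item in items:
        desc = build_description(item)
        if desc in seen.values():
            raise ValueError(f"Duplicate description for {item['url']}")
        seen[item['url']] = desc
    return seen

catalog = [
    {"url": "/widgets/blue/small", "name": "Small blue widget",
     "size": "2 inch", "material": "steel", "price": 4.99},
    {"url": "/widgets/blue/large", "name": "Large blue widget",
     "size": "6 inch", "material": "steel", "price": 9.99},
]
descriptions = unique_descriptions(catalog)
```

The point is simply that uniqueness falls out for free when each description is built from attributes that actually differ between items, rather than from a shared boilerplate sentence.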
I think google also uses meta descriptions as one part of a dup content detection code.
The same hundreds of pages are 95% exactly the same text as each other, with only slight variations in the url and some links at the bottom of the page. What this particular site establishes is that much of what some of the so-called experts here have been saying in other threads about what they think constitutes duplicate content is misleading and unfounded.
Or, what it establishes is that G obviously and blatantly lets some high PR sites get away with things it would not allow in a lower PR site.
There is obviously some kind of percentage threshold that, if you cross it, will get you the dup content penalty. But if this particular site is any indication, that threshold is set pretty high, and it does not include duplicate title and/or description metatags.
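Nobody outside Google knows what that threshold actually is or how the comparison is done, but you can get a rough feel for how "near duplicate" two texts look with a plain similarity ratio. This sketch uses Python's standard-library difflib purely as a stand-in; the 0.95 threshold below is hypothetical, echoing the "95% exactly the same" figure mentioned above:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 ratio of how much two texts overlap."""
    return SequenceMatcher(None, a, b).ratio()

page_a = "Our site specializes in blue widgets, here you can buy one."
page_b = ("Our site specializes in blue widgets, "
          "here you can buy a special blue widget.")

# Hypothetical comfort level, not Google's actual number.
THRESHOLD = 0.95
ratio = similarity(page_a, page_b)
too_close = ratio > THRESHOLD
```

Run against your own pages, something like this at least tells you which descriptions are mostly shared boilerplate, whatever Google's real detection looks like.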
as one part of a dup content detection code
I said it was likely "one part" of the code, NOT that every time you dup metas you get hit with a penalty...
Your competitor with identical meta tags might have other factors. For instance, the pages might be updated more often, or other things.
I personally think your competitor would be better off if he could figure out how to make all those meta descriptions different.
Think of it as one part of fine tuning a page.
I think google also uses meta descriptions as one part of a dup content detection code.
Nope.
But you've already rephrased while I was typing. This way I think I can agree ;)
If there are no unique meta descriptions, G omits the URLs after the first two, treating the rest as "similar", but it doesn't see them as "the same".
The two are very different.
Omitted results rank all the same.
Pages becoming supplemental because of duplicate content don't rank at all. :P
Otherwise everything you say is quite what I'd do.
Descriptive and unique. Mm.
If for nothing else, then for the snippets (or sniplets, or whatever they're called): the little text that can scare anyone away if it's broken English pretending to be a sentence. Unless it's something like an inventory ID... if the words don't add up to something even remotely like a description, it will make everyone click the next one on the list.
[edited by: tedster at 2:25 am (utc) on Nov. 27, 2006]
And if you don't use a meta description and Google picks up the global navigation or breadcrumb navigation as the first text, you will have to click through to see more results marked as very "similar" in a site: search.
It does make a difference. Use a unique meta description for each page wherever at all possible.
However, my statement is still provably accurate for the urls examined. Because of the vertical and the way their site is generated, they will always have many hundreds or even thousands of urls in supplementals at any one time, mostly due to 404s on pages that are auto-generated nightly and contain transient content (specific items for sale as of that day) that may be good for only a day or two. When that item is no longer available, the page goes away, but G may index this site daily and quickly for just that reason.
The current algo is not just a checklist or a set of push buttons. It's a lot more like measuring the interaction for a set of profiles -- a relevance profile, a trust profile, and a quality profile, plus a historical profile for each of those areas.
auto-generated nightly and contain transient
That's what I thought. Since dup metas are only "part" of the dup content equation, he gets by because of frequent updates. Possibly other factors as well.
I have a PR7 site, with lots of PR6 and PR5 pages in subdirectories, and "some" of the metas are similar. I did have it where the metas were identical in some subdirectories, and I did notice a rise in the serps when I altered them, so there is something to the dup meta description theory. But like I said twice before, it is only just one part of the equation, and it may be a small part. But hey, every little bit helps right?
Therefore, my little PR3 160 page site, with a few pages that change frequently, but lots more with stable and unique content, and its searchable 50,000 item database of the same inventory offered by my competitors, could be penalized, or at least have some pages dropped totally from the index or sent to supplemental, for even a minor infraction, like a few near duplicate description metatags. While a huge and high PR auto-generated site, whose thousands of pages change frequently because of inventory turnover and/or just a fresh "date last generated" on each of its pages, can get away with the same infraction for hundreds of its pages.
I think I see where my next programming efforts will go, since the inventory of the site in question is also my inventory. I'll just have to spam the SEs, within the established guidelines, with thousands of the same type of auto-generated content pages, but with unique meta descriptions and titles. I don't like it, but if that's the only way to get somewhere on page 1 these days, and the top 10-15 sites in the SERPs for my vertical all do it already and get away with it, what's a webmaster and small business person to do?
webmaster and small business person to do?
Work your fanny off! I feel your pain, relatively speaking I might be doing "well" but in the grand scheme of things I am a little fish too.
Look on the bright side, you are working for yourself, adding information or products the whole entire world can see, and nobody can fire you...
Gotta love being a webmaster.