| 7:58 pm on Mar 11, 2007 (gmt 0)|
Are you saying Google is ranking your site high on the basis of the description tag?
Or that the description of your website shown in the results has been taken from there?
| 2:58 am on Mar 12, 2007 (gmt 0)|
Well, it's a new site... but it's on page one already for a good phrase, and the description shown comes from the meta description tag.
So I'm saying the meta tags are important as one of the factors.
| 6:34 pm on Mar 12, 2007 (gmt 0)|
how new is the site?
<.02>I have seen this on a new site before, but I believe that fades after time. I doubt it'll still be like that after 3 months. </.02>
| 10:17 pm on Mar 12, 2007 (gmt 0)|
...my questions would be:
How competitive is that query?
Do you have any inbound links to that page with the given phrase, or with words that are part of it?
Is the phrase, or part of it, included in the title or body?
Is it in the URL?
I know META is important, but I've yet to see a page rank well for a competitive phrase that appears in the description tag but nowhere else.
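Those questions are easy to check for yourself. Here's a minimal sketch in Python of a crude substring audit for one page (the URL and phrase in the usage note are made-up examples; this is just a text check, not anything Google actually does):

```python
from html.parser import HTMLParser

class PhraseAudit(HTMLParser):
    """Collect the <title>, meta description, and body text of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.body_text = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.body_text.append(data)

def audit(html, url, phrase):
    """Report where `phrase` appears: title, meta description, body, URL."""
    parser = PhraseAudit()
    parser.feed(html)
    p = phrase.lower()
    return {
        "in_title": p in parser.title.lower(),
        "in_description": p in parser.description.lower(),
        "in_body": p in " ".join(parser.body_text).lower(),
        # crude assumption: phrases appear hyphenated in URLs
        "in_url": p.replace(" ", "-") in url.lower(),
    }
```

For example, `audit(page_html, "http://example.com/blue-widgets.html", "blue widgets")` would tell you at a glance which of the four places carry the phrase.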
| 10:44 pm on Mar 12, 2007 (gmt 0)|
I suspect it's a combination of meta tags and on-page factors.
As I mentioned in another thread about the latest Google update, I've lost my #1 spots for a few pages, but very important ones.
When I do a Google search for my keyphrases, my meta description comes up. If I check the meta description tags for the sites that are now doing better than my site, I find that their descriptions emphasize the keywords and phrases more than mine do.
Once this latest Google dance is over, I'm going to re-write the meta descriptions for a lot of pages on my site.
| 10:50 pm on Mar 12, 2007 (gmt 0)|
66sore, I think you might be begging the question a bit in your reasoning. What you're basically saying is...
"My page ranks well for a search term. Google uses my meta description as the snippet. Therefore, the meta description is important to Google in ranking the SERPs."
Just because the first two sentences are true, it doesn't necessarily follow that the third is. Though a solid meta description is necessary (mostly to disambiguate possible duplicate pages), a very general consensus is that Google gives little, if any, weight to the meta description in its ranking algorithm.
But, it can use it in the snippet. As far as I know Google uses three types of snippets in different situations: it might use one generated from the text on the page; it might use the DMOZ description (if applicable); or if the search phrase matches a string phrase in the description, it can use the meta description.
So, it just looks like in this instance G decided to use the meta description.
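The decision described above could be sketched roughly like this. To be clear, this is only a guess at the order of preference, not Google's actual logic, and the fallback "generated" snippet is a naive stand-in:

```python
def choose_snippet(query, meta_description, dmoz_description, page_text):
    """Hedged sketch of the three snippet sources described above:
    use the meta description when it contains the query phrase,
    else fall back to the DMOZ description (if any),
    else generate a snippet from the page text."""
    q = query.lower()
    if meta_description and q in meta_description.lower():
        return meta_description
    if dmoz_description:
        return dmoz_description
    # naive "generated" snippet: first sentence containing the query
    for sentence in page_text.split("."):
        if q in sentence.lower():
            return sentence.strip() + "."
    return page_text[:160]
```

So under this sketch, a page whose meta description contains the exact search phrase gets that description as its snippet, which matches what 66sore is seeing.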
| 7:20 pm on Mar 13, 2007 (gmt 0)|
One site of mine has 500+ pages (per Google). By Feb. 1, the site was shown at Google as being one page with 499+ supplementals. I had the meta description the same on every page. I changed it to make unique descriptions for every page. Over the past month and a half, the pages are slowly returning to normal. At this point about 100 are non-supplemental. (How long does this process take, does anyone know?)
Some of the supplementals show the new, non-repetitive meta descriptions. Some show the old, repetitive ones. The supplementals that show a new, non-repetitive meta description have a cache of the page from December.
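A quick way to catch that kind of site-wide duplication before it bites is to scan your files for repeated description tags. A rough sketch, assuming static .html files on disk (the directory layout is hypothetical, and the regex only handles simple, well-formed tags):

```python
import re
from collections import defaultdict
from pathlib import Path

DESC_RE = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

def find_duplicate_descriptions(root):
    """Group HTML files under `root` by meta description text and
    return only the descriptions shared by more than one page."""
    seen = defaultdict(list)
    for path in Path(root).rglob("*.html"):
        m = DESC_RE.search(path.read_text(errors="ignore"))
        if m:
            seen[m.group(1).strip()].append(str(path))
    return {desc: files for desc, files in seen.items() if len(files) > 1}
```

Running `find_duplicate_descriptions("/path/to/site")` would list every description that appears on two or more pages, i.e., the ones worth rewriting first.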
Does this mean that the top portion of pages (where meta tags are located) are cached separately and more often than the rest of the page? Or does it just mean that the cache Google displays to the world is older than the cache Google actually uses?
Another thing I noticed: if I mess up a site and it goes off-line completely, Google seems to ignore that for a time ("time" meaning at least several weeks) and lists the site the way it was listed in the past rather than showing a 'file not found' status. However, if I have a "parse error", the pages are delisted within days. Is the "parse error" viewed as duplicate content, while the 'file not found' is treated more sympathetically?
| 7:45 pm on Mar 13, 2007 (gmt 0)|
|How long does this process take, does anyone know? |
Some pages will work themselves out of supplemental status faster than others. Much is affected by the PR of each page, as that is a (the prime?) factor used by Google in determining how often to spider a page.
|Does this mean that the top portion of pages (where meta tags are located) are cached separately and more often than the rest of the page? Or does it just mean that the cache Google displays to the world is older than the cache Google actually uses? |
Well, for the first part, since Google uses the title element and meta description to filter possible dupe pages on the fly, it would certainly make sense for that data to be kept in a different bucket for speed purposes if nothing else.
For the second part, well, there are always a lot of anomalies in the Google cache. I tend to chalk them up to different data sets being pushed at different times. It sounds simplistic, but sometimes simple is best (especially for an already confused mind like mine).
| 8:32 pm on Mar 13, 2007 (gmt 0)|
Thanks for the feedback!
"Some pages will work themselves out of supplemental status faster than others. Much is affected by the PR of each page as that is a (the prime?) factor used by Google in determining how often to spider a page"
One site I had taken off-line inadvertently (I moved to a new web host and forgot to update the DNS; talk about simple minds...). During that time, pages were reappearing in the index, coming out of supplemental status. It seems like their coming back into the Google results has less to do with the PR of the individual pages and more to do with, maybe, some sort of dial that is set for the PR of the site, i.e., "for a PR4 site, no more than x pages will be added per day". However, 15 were added in Feb and so far about 85 in March (for the original site I was talking about, not the one taken offline accidentally). I'm hoping the more pages in the main index, the faster the rest begin coming back in.
For the one that I took offline accidentally, only 7 pages have been made "unsupplemental". Could the anomaly you're referring to be that it is going to come in and out of having most pages deemed supplemental? (...That the 7 pages just reflect different data centers rather than my efforts to reduce supplementals?)
| 1:53 am on Mar 15, 2007 (gmt 0)|
To whoever says the meta description is important... I guess you've never noticed Wikipedia at the top of the SERPs?
| 5:15 am on Mar 15, 2007 (gmt 0)|
The way I see it, meta description can be very helpful, exactly as jimbeetle described so well above. For new domains or low PR pages, meta description is a factor that I would definitely pay attention to. But of course "helpful" is different from "essential" or from "always important."
I've said this before, I know, but we're moving out of a simple black and white world with Google. The fuzziness of the situation makes overly facile analysis a dangerous practice.
| 7:49 am on Mar 15, 2007 (gmt 0)|
My site is not new (it's been online since the '90s) but does have low PR. (That's "low" to the rest of you. I'm proud of my PR4!) PR has remained the same over the years, yet this issue, of the smallest things affecting whether or not two pages are viewed as duplicates of each other, is new.
Opting in to Google's "enhanced image search" seems to have helped, for anyone with an image site who is having these problems.