
Google SEO News and Discussion Forum

    
Experiment Results: Do Meta Keywords Matter?
grant

Msg#: 3138562 posted 3:21 am on Oct 29, 2006 (gmt 0)

In mid-June, I began an experiment to test whether Meta keywords have any impact. I went into the experiment believing that Meta KW do have some impact on relevance, but I felt compelled to try this test anyway.

I am posting this in Google Search News because I was primarily interested in the test results on Google.

Additionally, I want to explain the current results of the test and get feedback from folks as to how I can tweak the experiment given the Google results explained below.

First, let me explain what I did:

1. I made up a word for which no search results were found (on Google, Yahoo, MSN, and Ask)
2. I created a new web page on one of my sites. The word I made up was placed only in the Meta KW
3. I linked to the new page from my site map with "test" in the link text
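For reference, the setup in steps 1-3 amounts to something like this (the word and filename below are placeholders I've made up for illustration; grant's actual test word and URL aren't given in the thread):

```html
<!-- test-page.html: the invented word appears ONLY in the meta keywords tag,
     never in the title, body copy, or anchor text -->
<html>
<head>
<title>Test page</title>
<meta name="keywords" content="zxvqmblort">
</head>
<body>
<p>Ordinary page copy that never mentions the invented word.</p>
</body>
</html>
```

And the site-map link would be just `<a href="/test-page.html">test</a>`, so the only place an engine can see the invented word is the meta tag.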

Using Google Webmaster Tools, I was able to follow when the page was indexed (late June). A few weeks ago, the PR update gave the page PR2 (fwiw).

On Wednesday, I searched for my test KW. Google showed no results; Yahoo DID find my page; MSN and Ask had no results.

Searching for the test URL on G resulted in "(test) URL did not match any documents." However, site:URL showed the test page in supplemental results.

I watched Matt Cutts' video blog on supplemental. In it, he explains the "beaten path". I'm not going to paraphrase what he says (if you want to view it, just search Google video for 'Matt Cutts supplemental').

GoogleGuy has also written:
"the supplemental results are a new experimental feature to augment the results for obscure queries. This is a new technology that can return more results for queries that for example have a small number of results. So it might not affect the results for a popular search, but for a researcher doing a more specific query, it can improve the recall of the results. The supplemental collection of pages has been collected from the web just like the 3.3 billion pages in Google's main index."

At this point, I feel like Google sort of "failed" the test. The problem with this test can be likened to trying to prove the existence of God. You can only prove existence, not nonexistence.

So there are explanations, including:
1. Meta KW don't matter on Google (God doesn't exist)
2. Meta KW do have an impact on relevance, but in this example, not enough to return a positive result
3. The supplemental result status is inhibiting the test

I refined my experiment so that a page in the main index has my test word in the Meta KW (in case the supplemental status is indeed the culprit).

If, as Google claims, supplemental results are for "obscure" queries, then I'd like to refine my test to explore WHY a supplemental result with an obscure Meta KW returns no results.

If my page in the main index returns a positive result, then I'll know Google does score Meta KW. If this indeed happens, I will, as mentioned, focus the experiment on questions relating to supplemental.

What I find confusing is that if Google has supplemental results for "obscure" queries, what does it take to create a positive return in supplemental?

I'd like to evolve the experiment so as to compare the test page in the main index against the supplemental one. What I anticipate might happen is that the test KW will need more presence on the supplemental page than it would on a main-index page before it returns a result.

Suggestions for next steps in this test are welcome.

 

jtbell

Msg#: 3138562 posted 5:15 am on Oct 29, 2006 (gmt 0)

I refined my experiment so that a page in the main index has my test word in the Meta KW

You just reminded me that I started a similar experiment over a month ago, and never got around to checking the result.

My home page has PR5 and gets crawled at least a couple of times per week. Until this past September 11, it had no meta keywords tag at all. On that date I added a meta keywords tag containing three gibberish "keywords" of my own invention, each of which returned only a few hits in Google searches.

Just now, a month and a half later, I Googled those keywords again. My page doesn't show up in any of the search results.

jomaxx

Msg#: 3138562 posted 5:40 am on Oct 29, 2006 (gmt 0)

I think it's been well established for years that Google does pay some attention to the meta description tag, but none to the meta keywords tag. In fact, I did a similar experiment to yours some time ago and got the same result.

The reason is that Google doesn't want you telling it what keywords to rank the page for; it wants to index pages based on what the user sees.

g1smd

Msg#: 3138562 posted 6:28 pm on Oct 29, 2006 (gmt 0)

>> "the supplemental results are a new experimental feature to augment the results for obscure queries. This is a new technology that can return more results for queries that for example have a small number of results. <<

Those extra results are of several main types:
- Many are simply for any URLs that are duplicate content of the stuff already listed as normal results.
- They are also for URLs that have been redirecting, or are 404, or have domains that have expired sometime in the last year or so, and which have old content that matched your search term.
- They are also for URLs where Google has stored the current page content as a normal result and the previous content of the page as a Supplemental result.
- The last type is pages that have been deemed "unimportant": they have low PR, few inbound links, and live somewhere on the periphery of the web.

The first three types of Supplemental Results allow you to see old content, content that no longer exists live on the web, via the Google cache.

petehall

Msg#: 3138562 posted 7:05 pm on Oct 29, 2006 (gmt 0)

I have to ask why you didn't just add this keyword to your home page?

tedster

Msg#: 3138562 posted 7:25 pm on Oct 29, 2006 (gmt 0)

I think I can field that question -- the purpose was to test the meta keywords element as a ranking algorithm factor. If you also added the keyword to the home page, then you could not be sure that the meta keywords occurrence was the deciding factor when the word did show up in a SERP.

HarryM

Msg#: 3138562 posted 7:32 pm on Oct 29, 2006 (gmt 0)

If a unique meta tag keyword does not show up in SERPs, this does not necessarily mean that Google is ignoring the meta tag. Google may be discounting keywords that are in the meta tag but not on the page.
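One way to probe that distinction would be a pair of pages (placeholder word and titles are mine, not from the thread): one carrying the invented word only in the visible body, and one carrying it in both the body and the meta tag.

```html
<!-- Page A: the invented word appears only in the visible body text -->
<head><title>Page A</title></head>
<body><p>Some copy that mentions zxvqmblort once.</p></body>

<!-- Page B: the same body copy, plus the word in the meta keywords tag.
     If B outranks A for the word, the tag carries some weight when it
     matches on-page text; if they tie, it adds nothing either way. -->
<head>
<title>Page B</title>
<meta name="keywords" content="zxvqmblort">
</head>
<body><p>Some copy that mentions zxvqmblort once.</p></body>
```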

texasville

Msg#: 3138562 posted 7:42 pm on Oct 29, 2006 (gmt 0)

tedster is right on.

Also:
>>>>>If, as Google claims, supplemental results are for "obscure" queries, then I'd like to refine my test to explore WHY a supplemental result with an obscure Meta KW returns no results.<<<<<<

This might be true for pages that now 404, or that are in the supplemental index for duplicate content, titles, or meta tags, but there is also a new classification for supplemental: the URL with no, or very insignificant, IBLs.
These URLs will no longer be returned in any search, even obscure ones. The images on those pages will also be eliminated from the Google index.
These URLs (pages) are in effect banned from Google.

photopassjapan

Msg#: 3138562 posted 11:35 pm on Oct 29, 2006 (gmt 0)

On the issue of meta keywords: they don't count.
Right now... they don't ;)

Does it hurt to add them?
Did it hurt to add description tags for every page since 2003?
It didn't hurt US... just add them, and forget about them.

<part of message moved to new thread [webmasterworld.com]>

[edited by: tedster at 12:08 am (utc) on Oct. 30, 2006]

tedster

Msg#: 3138562 posted 12:12 am on Oct 30, 2006 (gmt 0)

In addition to the fact that many webmasters were spamming the meta keywords element in years gone by, it has also been abused by creating just one tag and using it on every URL in the domain.

If you want to use it -- and I agree with photopassjapan that it's a good idea -- then create a unique and relevant keywords meta tag for each URL, not a universal and sitewide mynah bird. It's a great place to leave yourself a quick note about what keywords you hope the URL will rank for. You just might be all prepared for a future sea change.
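In other words (illustrative URLs and keywords, not taken from any real site):

```html
<!-- the abuse pattern: one sitewide tag pasted into every page -->
<meta name="keywords" content="widgets, cheap widgets, best widgets, widget store">

<!-- the sane pattern: each URL gets its own short, page-specific note -->
<!-- on /blue-widgets.html -->
<meta name="keywords" content="blue widgets, blue widget sizes">
<!-- on /widget-repair.html -->
<meta name="keywords" content="widget repair, fixing broken widgets">
```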

Jambo_ME

Msg#: 3138562 posted 4:31 am on Nov 1, 2006 (gmt 0)

It can hurt, in that it's a line of code that contributes nothing to your page and decreases the ratio of readable text to other markup.

It's like asking if JavaScript will hurt your rankings.

I would like to see the day when that tag becomes relevant again! Perhaps in closed systems, institutional documents, and such.

tedster

Msg#: 3138562 posted 4:58 am on Nov 1, 2006 (gmt 0)

I would like to see the day when that tag becomes relevant again!

From various conversations I've had with search engineers, almost anything in an html document may become part of the ranking algorithm if that element begins to show good signals of relevance across a large number of domains and pages. Given the way the meta keywords tag is currently neglected and misused, that day is probably a long way off.

Its like asking if JavaScript will hurt your rankings.

I have a very interesting case in front of me -- a major commercial domain that launched a redeveloped site four weeks back or so. Despite our best efforts, the file sizes are DOUBLE what they were (now over 100kb of html), and there is a lot of javascript in the page (20-25 kb). Not exactly in the "sweet spot"!

Traditional SEO has long said that large file size and inline script is bad news. Well, maybe -- it certainly used to look like that. I will definitely continue to work with the management to bring those high file size numbers down, for many reasons. However, to my amazement, these urls have moved UP the SERP since launch of the new code. This is in spite of the fact that the new file sizes are at least twice that of their competitors in the top ten.

This is a very startling result to me, but I think it shows that Google is getting better at ignoring sections of code that are not truly relevant, and ranking a URL based more on content that the user sees. Certainly file size, on its own, would not be a relevance factor, right?

I will be following this case quite closely, I can promise. Maybe it's time to take another look at accepted SEO wisdom.

g1smd

Msg#: 3138562 posted 12:39 pm on Nov 1, 2006 (gmt 0)

I always try to get Javascript off into external files, not just for getting the HTML file size down, but for caching the javascript file and loading it once per visitor instead of once per page view.
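For anyone following along, the difference is just this (path and filename are illustrative):

```html
<!-- inline: the same script bytes are downloaded with every single page view -->
<script type="text/javascript">
function toggleMenu(id) {
    /* ...hundreds of lines repeated in every page's HTML... */
}
</script>

<!-- external: the browser fetches /js/site.js once, then serves it from
     its cache on every subsequent page view -->
<script type="text/javascript" src="/js/site.js"></script>
```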

photopassjapan

Msg#: 3138562 posted 12:47 pm on Nov 1, 2006 (gmt 0)

It can hurt, in that it's a line of code that contributes nothing to your page and decreases the ratio of readable text to other markup.
It's like asking if JavaScript will hurt your rankings.

JavaScript hurts rankings indirectly, through hiding/cloaking information and links, making sites unusable with older browsers, and so on (accessibility). While I agree that having too much code in front of the content will bore the hell out of the G bot, I'd say meta keywords are not the typical turn-off for it. Ever since G became confident with its bandwidth, a page with more than 2k of code before its content won't necessarily raise a red flag.

Besides...
If you want useless code that definitely has nothing to do with your rankings right now, and moreover perhaps never DID have anything to do with them...
Here are some examples of what you'd see as extra in the header of the top 10 sites in a search we'd like to be top for :)

(These lines appear on and off on every page, not just in the top ten... that's why I know they're useless.)

<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<meta name="copyright" content="Long long long copyright notice">
<meta name="revisit-after" content="2 days">
<meta name="robots" content="index,follow">
<meta name="reply-to" content="email of the website creator! wow">
<meta name="distribution" content="global">
<meta name="language" content="af">
<meta name="classification" content="widgets">
<meta name="Content-Type" content="text/html; charset=windows-1252">
<meta name="author" content="The author of the website">
<meta name="pd" content="Thursday, 18-Oct-2006 10:54:33 GMT">

And before you point out that the codepage is needed, I'd like to note that on one particular site, that line is a duplicate, and with DIFFERENT information than the previous line. It's no. 1 though, so who cares, right?

From various conversations I've had with search engineers, almost anything in an html document may become part of the ranking algorithm if that element begins to show good signals of relevance across a large number of domains and pages. From the way the meta keyword tag is neglected and misused currently, that is probably a long way off in the future.

While I see the point, there's a very recent and prominent example that suggests otherwise.

The Meta Description tag.

Google insisted for years that it's useless, irrelevant, too easy to manipulate, obsolete, can be used but won't count anyway, and so on. (We used them nonetheless.)

How many sites used it at the moment they launched the regrouping program in their index? How many used it unique to each page?

And this year it becomes a major factor?

I'd say it became a factor because Google NEVER paid attention to it, and thus it wasn't used by the majority of sites that were planned with ranking in G in mind. Get it? It was used because G needs as many factors as it can accumulate so that the crossfire of its filters may allow it to function half a year more, every time. For within that time, those who want to will find a way to spam the hell out of the new parameter as well.

Meta description is a good example: people rarely used it, for they wanted their pages lighter, and G always insisted that it doesn't count. I particularly remember the rhetoric of calling it obsolete. Now it's a NEW factor...

One which may not elevate you to top 3 positions, but if it's not used, or used incorrectly, it can send your lower-PageRank pages off the map, or group all the results your site is relevant for (please note I'm saying SITE) under the highest-level URL (directory- and PR-wise) IF the meta description is the same.
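So in practice, the failure mode described above looks something like this (site name and pages are illustrative):

```html
<!-- every page on the site carrying the SAME description invites Google
     to fold the site's results together under the highest-level URL -->
<meta name="description" content="Acme Widgets - the best widgets online">

<!-- a unique, page-specific description keeps each URL's result distinct -->
<!-- on /blue-widgets.html -->
<meta name="description" content="Sizes, prices, and photos of Acme's blue widget range.">
```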

I'm not sure about others but...

Once they changed their minds, we sure didn't get an email from anyone at Google notifying us that "we've updated our algorithms, please use meta descriptions to fine-tune your on-page relevancy".

Perhaps everyone else got this mail.
We're always left out of everything :-{

;)
