Forum Moderators: open
Kackle - can you explain the "dictionary" for me? And how I might benefit from it - I'm reading your posts closely but don't see where you're coming from.
Sure. But you have to act quickly. Google will fix this one just like they fixed the hyphen.
1. Google is demoting pages/sites that are over-optimized for certain keywords or keyword combinations. It does this by looking up search terms in a dictionary of target keywords or keyword pairs that it has compiled. This dictionary is Top Secret, because if you knew what was in the dictionary, you could avoid these words in your optimization efforts.
2. If the search term or terms hit on a dictionary entry, the search results for that user's search are flagged. This means that before the results are delivered, the order of the links, or even the inclusion of links, is adjusted so as to penalize pages that have over-optimized for those terms. Most likely the title, headlines, links and anchor text are examined. It's possible that external anchor text pointing to that page has also been pre-collected and is available for scanning, but this is much less likely. (Besides, external links are not something within your immediate control, so don't worry about it right now.)
3. You want to find out which of the keywords relevant to your site are in Google's dictionary. Compile as many relevant keywords as you can think of that searchers might use to find your site. Now take these words singly and in pairs, according to how users might search. Run two searches for each combination and compare the results.
4. If the results are strikingly different for the pre-filter and the post-filter search on a particular term or combination of terms, it means that some variation of those terms has been flagged because something was found in Google's dictionary.
5. Do lots of searches and you can come up with a list of "sensitive" words that you'll want to avoid when you re-optimize your pages.
It's a nice weekend project.
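The comparison in steps 3-5 is easy to script once you've collected the top results from your two searches by hand. Here's a minimal sketch in Python; the function name, the 0.5 threshold, and the example domains are all hypothetical illustrations, not anything Google publishes:

```python
def serp_overlap(prefilter, postfilter, top_n=20):
    """Fraction of the pre-filter top-N results that survive into the
    post-filter top-N. A low score suggests the term hit the dictionary."""
    before = set(prefilter[:top_n])
    after = set(postfilter[:top_n])
    if not before:
        return 1.0
    return len(before & after) / len(before)

# Hypothetical top results for one keyword pair, collected by hand
# from the two searches described in step 3.
pre = ["site-a.example", "site-b.example", "site-c.example", "site-d.example"]
post = ["site-x.example", "site-b.example", "site-y.example", "site-z.example"]

score = serp_overlap(pre, post)
print(round(score, 2))  # only one of four results survived -> 0.25
if score < 0.5:         # threshold is a guess; tune it to your niche
    print("strikingly different - this term may be in the dictionary")
```

Run it over your whole keyword list, and the terms with the lowest scores are your candidates for the "sensitive" list in step 5.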
Is it possible that Google wants to change what is required to rank well on their site to nearly the polar opposite of every other engine, so that if you are optimized for Google you are not optimized for any other SE? Since Google gets most of the traffic at this point in time, most people would optimize for Google. Therefore, in a few months all other SEs will return results pretty much like what Google has now, making Google's ground as #1 more solid.
If removing keyword density from titles, H tags, etc. is required to rank well in Google, that seems to be what will happen. I am ranked on the first page for my targeted terms on all other SEs, and I was on Google until a week ago.
As of right now the other guys might have better results, but they just aren't able to keep them as up to date. Google has pages I added last week, the others might have pages from last month.
I think Kackle's point more precisely is non-commercial search phrases as opposed to non-commercial sites. So the question is - are your index pages missing for search phrases that have commercial connotations?
I wouldn't call those keywords "commercial"--they're the names of specific places with no accompanying words. Since we aren't supposed to use real-life examples here, let's just say they're "Cityname," "Countryname1," and "Countryname2."
[edited by: europeforvisitors at 3:38 am (utc) on Nov. 24, 2003]
No idea what rfgdxm1 is looking at, but I think most are in agreement that Google is screwed, including me.
That is most likely because only those that are unhappy with the changes are posting. The silent, perhaps majority, are too busy adding more content and out there getting links.
It's kind of like what politicians do, preach to the converted :o)
Dave
I see very few top ranked commercial sites using either an H1 or H2 tag. And when they do, they don't contain any of the search terms.
GuinessGuy,
You are correct about that: I do see that many sites which do not use H1 tags are ranking highly after the Florida massacre, and many sites which do use H1 tags have been nuked during the Florida update.
But I wonder if this following theory might be correct...
I have confirmed that many sites (which I am monitoring right now) are ranking very high using the "money phrase" inside their H1 tags exactly, even after the Florida update.
Yet, as you mentioned many sites using H1 tags were nuked by the Florida update...
Therefore, I would draw the conclusion that it's not the presence of H1 tags which forced the other sites to get nuked (and it wasn't the absence of H1 tags which caused the "non H1 tag pages" to rise up and occupy the top ten spots after the Florida update was over)...
I would draw the conclusion that the H1 pages which dropped (nuked pages which used H1 tags) had other SEO traits which got them flagged as spam and penalized.
I would draw the conclusion that if a page uses its money word inside the H1 tags, then that page is more likely than not to also use keyword stuffing (or at the very least a slight over-density of keywords), leaving those pages penalized by Google for the other SEO traits (but not penalized because of the H1 tags specifically).
If H1 tags were being penalized directly then there would be no way for these other pages to rank #1 (for popular "money phrases") while using that same money phrase inside their H1 tags.
Google is nuking pages with "SEO penalties" and sending them to the dark side of the huge black hole which sits in the center of the Milky Way galaxy. If the H1 tag were part of the penalty, then I would think that all pages with that tag would be nuked and obliterated. I just don't think the H1 tag is a direct target, but I do agree that many pages using the H1 tag were penalized (I just think they were penalized for other SEO factors).
I am not saying my theory is right, maybe it's not, it's just a theory.
[edited by: Brenda_J at 3:55 am (utc) on Nov. 24, 2003]
The silent, perhaps majority, are too busy adding more content and out there getting links.
I should be more silent, but it's hard not to get drawn into these threads.... :-)
I got 8 new pages online in the last few days, field notes, muchos content, and 5 were picked up by googlebot yesterday.
Sorry that some of the commercial people are having problems, but Google still works for information purposes the same as it did two weeks ago... not bad.
I'll quote myself from about 1000 posts earlier in these update threads, "If you're selling the same stuff as a million other people, you might have problems". If the competition is that fierce, you're selling the wrong stuff.
Yep. One thing to be considered is the sample of people who post at WebmasterWorld tends to be *very* different than the average punter using Google.
One thing to be considered is the sample of people who post at WebmasterWorld tends to be *very* different than the average punter using Google.
So very true, which is why the SERPs are most likely still fine (or better) for 95%+ of the world.
Dave
Now say what you want, but this is not an improvement in results over the previous ones and I will bet that if it is happening in my niche it is also happening in many others.
Google needs to address some sort of issue here, because there clearly is a problem or we wouldn't be approaching part 5 of this wonderful thread.
BTW, the toolbar search tool, Froogle and all the other meaningless toys are really cute, but Google needs to realize that they don't mean squat when you provide results like Google has this past week!
Non-commercial info webmasters just don't spam using cloaking much. And most non-commercial SERPs aren't so competitive that anyone would do such a thing.
My impression is that this is a bug, not a filter, and that google will eventually fix the problem.
One thing to be considered is the sample of people who post at WebmasterWorld tends to be *very* different than the average punter using Google.
So very true, which is why the SERPs are most likely still fine (or better) for 95%+ of the world.
...the "message" implied in those statements above is that webmasters (here at WebmasterWorld) only consider Google relevant if our own sites are ranked highly, and that we are going out of our way to search for a few obscure phrases which show poor SERPs as a way to whine about our nuked rankings. Well, check this out...
Relevancy is in the eye of the beholder, but there are certain universal truths which are obvious...
Stability = Good
...and one universal truth is that Google is now using a Broad Based search algo, and it disfavors an exact match. In fact, it penalizes exact matches and buries them on purpose.
This causes results to be "great" (relevant) for some industries and horribly irrelevant for other industries, and the results are UNSTABLE and unpredictable, and the results are totally DEPENDENT ON THE INDUSTRY you are searching for. That's the key point you are missing.
The fact that Google's algo is producing good results for some industries and horrible results for others is bad because google needs to find an algo which is more stable and not dependent on the industry you are searching for.
Stable results which are not dependent on the particular industry you are searching in are a good thing (nobody can argue with that).
Unstable results which are dependent on the industry are a bad thing overall for Google, for everybody.
If you pick one industry the results may be great under this new algo, but the results for many other industries are horrible and illogical. This cannot be argued: the new algo produces both good and bad results, and it distributes them about equally. Just because you are happy with your own industry's results doesn't mean the whole web is happy with Google's new algo and only a few disgruntled webmasters are unhappy.
You have to look at things with a more open mind.
[edited by: Brenda_J at 4:31 am (utc) on Nov. 24, 2003]
<edit>Sorry, didn't know that word was unuseable</edit>
[edited by: Stefan at 4:26 am (utc) on Nov. 24, 2003]
You are using an *extremely small* % of all searches to make a judgement. How can one voice (or even hundreds) be used as a measurement of good SERPs?
The only SEO I have ever done and probably ever will, is add linked content pages every day and set up my site to try and guide humans (not robots) to the areas where they can spend.
Forget SEO (it will bite you on the bum and get you in trouble); focus on optimizing for humans and Google will send you LOTS of traffic. Just remember that Google's ultimate aim is to return perfect results for humans, not robots.
Dave
And this means what to someone searching for the population of Russia? Most people on the Internet don't own commercial websites and aren't SEOs.
I've brought this up before, and was told "it was always that way", but I'll bring it up again with some extra facts.
Pre-Florida, when searching my particular lost phrase, I was able to scroll through several hundred results (out of 780,000) before hitting the 'repeat the search with the omitted results included' message. I don't remember the exact amount, but it was at least 5 pages' worth (I have my Google setting set to show 100 results per page).
During this update, I get that message after only approximately 131 results. (I say approximately because it varies slightly from DC to DC - always less than 140.)
However, on google.de (where pre-Florida results still exist for my search term; hence I am not lost there), I can view nearly 600 results before hitting the message.
I keep thinking that this must mean something - I just don't know what. Any thoughts?
You have to look at things with a more open mind.
LOL! Now that really is the pot calling the kettle black.
It is the very fact that I AM viewing results with an open mind that allows me to say the results are no better or worse, and Google is still the best search engine for searchers.
Dave
And this means what to someone searching for the population of Russia? Most people on the Internet don't own commercial websites and aren't SEOs.
You are creating "false arguments" and then trying to prove those arguments wrong.
Where in my post did I say that commercial web sites are "relevant" and non commercial web sites are "non relevant"? Please show me that quote;)
When I said "industry" I could have also said "topic area", I assumed people would use common sense but obviously I must be more careful in my wording from now on.
If I search for "the population of russia" then I want to get a site discussing the "population of russia", I don't want to get a site selling russian widgets.
However, any logical person can agree that if I search for "blue metal widgets" and receive spam results redirecting to a site selling "orange martian gophers", then that is not relevant, regardless of whether you are from Russia or Mars ;)
I am talking about common sense and relevant results versus irrelevant results, I am not talking about commercial web sites versus non commercial web sites. See the difference?
[edited by: Brenda_J at 4:41 am (utc) on Nov. 24, 2003]
And I have seen *very* few examples of such. In fact, not a single case where I was the one who chose the search term. Only when people privately contacted me with such an example. What you describe above is a rare exception to the rule.
So very true which is why the SERPs are most likely still fine (or better) for 95%+ of the world.
The problem is that Google is approaching this spam problem in a very ham-handed manner. I'm happy with my Google traffic, and I'd someday like to see noncommercial results only for the organic SERPs, and everyone else forced to buy Adwords.
Nevertheless I can sympathize with those who have taken a hit. I suspect that for every mega-spammer who has been inconvenienced, there are four or five mom-and-pop enterprises that are in danger of going out of business. You may like the results today, but you could be targeted tomorrow.
If Google wanted to fight spammy e-commerce sites in a socially responsible manner, here's what they could have done:
1) Make a public announcement that they planned to do this.
2) Set up search tabs on their main search page. One for "commercial" and one for "informational."
3) Invite public comment and sponsor programming competitions on how to separate one from the other.
4) Phase in the separation between informational and commercial.
5) Work with the Public Interest Registry (which administers the dot-org registry), librarians, and other nonprofit groups, to define guidelines for information sites. Even if ICANN is too lazy to touch this issue, there are people who feel that .gov, .edu, .org, and the similar country TLDs deserve some search space on the web that is free from commercialism. You might even be able to get grants from foundations or U.N.-connected agencies such as UNESCO to help out on this. The Internet was started by the public sector, and there's no reason we all have to turn into nerdy-Silicon-Valley, dot-com-boom, Segway-scooting libertarian liberals just because Google thinks they know best.
6) Clarify guidelines for commercial sites.
7) Establish an appeal procedure for penalized sites. Google can afford to hire a few ombudsmen. These canned autobot email replies don't cut it.
Instead, what do we see? Google is going after the wrong sites for the wrong reasons, and leaving even worse spam behind. They're doing it secretly and suddenly, leaving us to sink or swim.
There are many better ways to handle the situation.
Google let a certain type of spammer rule the SERPs for several months. I think JohnLennon wrote "All you need is anchor text..." or something like that.
A day late (or several months actually...) Google has swooped down and actually applied an algorithm to their results, and they are better for it. Unfortunately people here who haven't had their niches awash with duplicate content spam really have no idea of the extent of the problem that the anchor text algo had built, and was building much worse.
In my niche, enough spam to feed Ethiopia for a century has been dropkicked off the face of the SERPs. Quality content sites have benefited. My sites moved little, but the ones around me now are of much higher quality than previously.
The problem was cancerous, even if many folks were unaware of it. Florida isn't a cure, but on the whole it is a positive step. No doubt some innocents have been hurt, and plenty of trash remains, but Google has turned away from the dark side (anchor text has close to zero quality-content valuation) and that's a great thing.
Personally I'm excited about the idea of localrank being applied to this skeleton. A better appreciation of authority within a niche is hopefully the next step.