Thank you,
Ryan Allis
On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems to be more like a drunken Mexican salsa than its usual conservative fox-trot.
Most likely, you will already know if your web site has been affected: you may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. One could understand dropping down a few positions, but since November 15 the sites that previously held these rankings are nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding themselves needing to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.
What the Early Research is Showing
From what early research shows, it seems that Google has put into place what has been quickly termed in the industry as an 'Over Optimization Penalty' (OOP) that takes into account the incoming link text and the on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense Google is penalizing sites for being optimized for the search engines--without any forewarning of a change in policy.
Here is what else we know:
- The OOP is keyword specific, not site specific. Google has selected only certain keywords to apply the OOP to.
- Certain highly competitive keywords have lost many of the listings.
How to Know if Your Site Has Been Penalized
There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you have likely been hit. Here are ways to be sure:
1. Go to google.com. Type in any search term you recall being well-ranked for (check your site logs to see which terms sent you search engine traffic). If your site is nowhere to be found, it has likely been penalized.
2. Type in the search term you suspect being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This appears to bypass the OOP, so you can see roughly what your results would be without it.
3. Or, simply go to www.**** to have this automated for you. Just type in the search term and see quickly what the search engine results would be if the OOP was not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5000 most visited web sites on the Internet in a matter of days.
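The gibberish-exclusion check in step 2 is easy to script. A minimal sketch, assuming only that appending a nonsense excluded term bypasses the filter as described above (the `-dkjsahfdsaf` token is arbitrary, and the URL parameters shown are Google's standard `q=` query string; this just builds the two URLs so you can compare the results by hand):

```python
from urllib.parse import urlencode

def serp_urls(term, gibberish="dkjsahfdsaf"):
    """Build two Google search URLs: the normal query, and the same
    query with a meaningless excluded term appended, which (per the
    observations above) appears to bypass the new filter."""
    base = "http://www.google.com/search?"
    normal = base + urlencode({"q": term})
    unfiltered = base + urlencode({"q": f"{term} -{gibberish}"})
    return normal, unfiltered

normal, unfiltered = serp_urls("florida web designer")
print(normal)
print(unfiltered)
```

Open both URLs in a browser and compare where your site ranks in each; a large gap between the two result sets is the symptom being described.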
The Basics of SEO Redefined. Should One De-Optimize?
Search engine optimization consultants such as myself have known for years that the basics of SEO are:
- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links
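The on-page items in the checklist above can be roughly audited with a short script. A sketch only: the regexes are a crude stand-in for a real HTML parser, and the function name and sample page are mine, not anything from the article:

```python
import re

def audit_page(html, keyword):
    """Rough on-page checks for one keyword: is it in the <title>,
    is it in an <h1>, and how many times does it appear in the text?"""
    kw = keyword.lower()
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
    text = re.sub(r"<[^>]+>", " ", html).lower()  # crude tag stripping
    return {
        "in_title": bool(title and kw in title.group(1).lower()),
        "in_h1": bool(h1 and kw in h1.group(1).lower()),
        "occurrences": text.count(kw),
    }

page = ("<html><head><title>Widget Shop</title></head>"
        "<body><h1>Widgets</h1><p>widget widget widget</p></body></html>")
print(audit_page(page, "widget"))  # → {'in_title': True, 'in_h1': True, 'occurrences': 5}
```

The "repeat 5-10 times" guideline above is the author's rule of thumb, not a published Google number, so treat the occurrence count as a rough signal rather than a target.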
Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of informational, well-written content that have taken the time to properly build links.
So if you have been affected, what can you do? Should you de-optimize your site, or wait it out? Should you create one site for Google and one for the 'normal engines?' Is this a case of a filter turned on too tight that Google will fix in a matter of days, or something much more?
These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:
1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for no longer appears in it, or so that the keyphrase appears in a different word order than the one you are penalized for.
2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep the count under 5 occurrences for every 100 words on the page.
3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."
It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
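The density and exact-order checks in steps 2 and 3 can be approximated like this. A sketch under the article's own assumptions: the "under 5 per 100 words" threshold is the author's guideline, not a published Google limit, and the sample text is invented:

```python
def keyword_density(text, phrase):
    """Count exact (same word order) occurrences of a phrase, and
    express them per 100 words of text."""
    words = text.lower().split()
    hits = text.lower().count(phrase.lower())  # non-overlapping, exact order
    per_100 = round(100.0 * hits / len(words), 1) if words else 0.0
    return hits, per_100

body = ("florida web designer " * 6) + ("unrelated filler words " * 20)
hits, per_100 = keyword_density(body, "florida web designer")
print(hits, per_100)  # → 6 7.7
if per_100 >= 5:
    print("over the guideline: consider varying the word order, "
          "e.g. 'web site designer in Florida'")
```

Because the count only matches the exact word order, rewriting some occurrences as "web site designer in Florida" (step 3) lowers the number this function reports, which is precisely the point of mixing up the order.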
Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.
A second theory, which has gained credence in recent days within the industry, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.
Perhaps both of these reasons came into play. Perhaps Google execs thought they could:
1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues, and
4) because of better results and more revenue, have a better chance at a successful IPO.
Sadly for Google, this plan had a fatal flaw.
What Google Should Do
While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:
1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;
2. Reduce the weight of OOP;
3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and
4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.
When this recent update broke on November 15, webmasters flocked in the thousands to industry forums such as webmasterworld.com. The mis-update was quickly titled the "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, so everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.
If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, Alltheweb/FAST, and Altavista, it most likely will soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.
That would be an interesting experiment...add links to your competitors. In the real world that would normally be a bad idea, but if Google thinks it makes you an authority, give it a shot.
For instance, if you are in the real estate field and your keyword is "real estate", that could mean literally anything, even if it is "real estate + city". It could mean realtors, properties, MLS systems, apartments, mobile homes, commercial, residential, strip malls, etc...
So when a searcher uses a broad term like real estate, instead of looking for exact matches, it sends you to the "authority sites" to help you narrow down your search.
So, to become an authority, you would have to know who all the players are, what resources are available and present them to the user.
Thoughts?
And wasn't it just last year that Arthur Andersen & Co. emphatically denied any allegations that its consulting business unit had any influence on its auditing business unit?
Please, Mr. Rosing...you can fool some of the people some of the time...
My guess is that if you have not been penalized by the Florida update, then your web pages may appear on more SERPs for stemmed versions of your important keywords. In turn, that is likely to lead to slightly more traffic.
Has anyone else noticed this phenomenon?
Good old Beeb - information for the masses, and you can't buy shares in them
...heck, I've just searched for the BBC and got a lingerie store in Bolivia ;)
In message 87 (last night) I made the case against stemming being involved, because my site has been dropped for my main keyword, which happens to be a four-letter acronym. Let's call it PQRS. My problem was: how do you stem an acronym?
I have now noticed that the site now in the number one position for the same search also uses an acronym, but it is just PQR. On the successful page there is a reference to PQRs, i.e. the plural of the acronym PQR. Can some of you experts out there help me decide if this is significant?
(Incidentally, the new results are largely irrelevant to PQRS, providing more evidence that Google has botched it bigtime.)
And since when is "shelving" not relevant to libraries? If anything, the current SERP seems too heavily weighted toward e-commerce sites. Ask a vague question...
Shelving
General Information . . .
N/A
Procurement Procedure . . .
To find the best way to procure the goods or services in this category, please answer each of the following questions by clicking [yes] or [no].
Coming Soon
I really wish every article you read in the mainstream press would stop talking about "monthly updates" and the "Googledance"... there are no monthly updates anymore, not since June, and no more dance either. The update rolls constantly. Throughout Florida, the updating of my site continued apace, with pages being found and added frequently.
And is it just me, or does every article on the recent change seem to have gotten its info from a brief reading of posts at WW?
since when is "shelving" not relevant to libraries
Hee Hee! Libraries certainly contain shelving, but in the same way, all buses contain seating. Would you expect a search for seating to show up sites about buses?
(In fact the Google results for 'seating' look rather good)
And if you wanted to purchase shelving, why not enter "buy shelving"? You are thinking along the lines of people who tend to buy things off the Internet. Many users don't. And if they do, the current SERP has *plenty* of sellers. It isn't as if all the merchants were filtered.
Now link that to an observation I made in a previous post: for every affected keyword phrase I have tried, the SERPs from a normal Google search are the same SERPs I get when I then do that search in the Google Directory. I can predict which keywords are affected this way, because a "category 1 > category 2 > ..." trail shows up at the top of the SERPs only for the keywords I find to be affected.
This bug (or feature, however you look at it) has been there for like a couple of months. Do a site search, in case you're interested in those threads.
Quite possibly. I'd certainly never purchase shelving over the Net.
>rfgdxm1, Let 'em complain all they want. But they gotta understand that it's time to deal with it rather than venting empty threats/accusations and conspiracies at G.
True. What surprises me is that the "shelving" SERP has e-commerce sites 8 of the top 10. Somebody looking to buy has lots of choices. Now, if 0 of the 10 were e-commerce sites, that would seem seriously skewed.
:)
Hilltop:
- I remember this paper being thrown into the discussion about the localrank patent. It's a similar idea, but not quite the same - the author, Krishna Bharat, was also on the localrank patent, btw.
>> first compute a list of the most relevant experts on the query topic
a.k.a. the Google directory? The acid test for such algos would be how much of the complicated work could be done in advance (regular batch jobs) in order to avoid sacrificing efficiency. On this issue, the line of thought is similar to the localrank algo, i.e. some inbound links are valued higher than others. I believe this particular issue has been in effect for some time, e.g. with dmoz links.
markis00: Why are some keywords penalized, while others are not?
- I think the key lies in a combination of the (slightly messed up) directory and the fact that "stemming rules" are not present for every single word in the universe - not even every single English word.
Hissingsid: avoid links to web spammers or "bad neighborhoods"
Sometimes this is true, other times it isn't - perhaps it depends on if Google sees the same neighborhoods as bad or not. OTOH, links change constantly, and if you have any amount of outbound links you should check them often to see if they're still pointing to where you think they are.
to get more sensible results, the user should enter "shelving -waffle"
One more jumped on the bandwagon. Shelving is just a word like any other word - one that can be used in a lot of combinations with a lot of intentions - if you want an exact search, then do an exact search. Please let's not go there again - it's dead, gone, buried.
/claus
If I want rock solid information on shelving, the best source of information will be shelving vendors
To you, your customers, and your industry, yes. BUT not necessarily to Joe Public and the Internet at large. Bob Vila or the national association of libraries might be a better source of shelving information than a vendor.
Believe me when I say I understand what you're saying, but somehow, someway, what rfgdxm1 is saying suddenly makes more sense. The results are not perfect yet, but when a searcher types in "shelving", Google is responding with "what kind of shelving?" and giving you "authority" sites that might help you narrow it down a bit.
We've got so used to the 'searchword -waffle' effect that we're in danger of forgetting how weird it is. If you can provide a watertight explanation for it, then I agree that it can finally be put to bed. But you can't just dismiss it as 'old news'.
I find myself doing Google's job here, but anyway, here it is:
You can increase the accuracy of your searches by adding operators that fine-tune your keywords.
(...)
Sometimes what you're searching for has more than one meaning; "bass" can refer to fishing or music. You can exclude a word from your search by putting a minus sign ("-") immediately in front of the term you want to avoid. (Be sure to include a space before the minus sign.)
LINK: [google.com...]
/claus
What surprises me is that someone thinks 2 out of 10 not being e-commerce sites is a problem.
And when you add in the 10 AdWords ads relating to several different major types of shelving, it really makes you wonder what the complaint is.
I would guess the customer (searcher) is being served in this case - especially for such a general keyword.
seasalt
Thank you for explaining the basics of Google and the - operator, most enlightening ;)
Now would you care to explain the strange effects of using a meaningless subtractive term on the SERPs? (this was my original question ;)