Thank you,
Ryan Allis
On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every two months or so, this 'Dance' seems more like a drunken salsa than its usual conservative fox-trot.
Most likely, you already know if your web site has been affected: you may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While one could understand dropping down a few positions, since November 15 the sites that previously held these rankings are nowhere to be found in the top 10,000 results. Such radical repositioning has left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding themselves forced to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.
What the Early Research is Showing
From what early research shows, Google seems to have put into place what the industry has quickly termed an 'Over Optimization Penalty' (OOP), which takes into account incoming link text and on-site keyword frequency. If too many sites linking to your site use link text containing a word that is also repeated more than a certain number of times on your home page, that page is assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense, Google is penalizing sites for being optimized for the search engines, without any forewarning of a change in policy.
Here is what else we know:
- The OOP is keyword specific, not site specific. Google appears to have selected only certain keywords to which the OOP is applied.
- Certain highly competitive keywords have lost many of the listings.
How to Know if Your Site Has Been Penalized
There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around November 15, you have likely been hit. Here are ways to be sure:
1. Go to google.com and type in any search term you recall being well ranked for. Check your site logs to see which terms were sending you search engine traffic. If your site is nowhere to be found, it has likely been penalized.
2. Type in the search term you suspect being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). Appending a nonsense exclusion term appears to bypass the OOP, so the results show roughly where you would rank without the penalty.
3. Or, simply go to www.**** to have this automated for you. Just type in the search term and quickly see what the search engine results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5,000 most visited web sites on the Internet in a matter of days.
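The gibberish check in step 2 is easy to repeat by hand if you build the two search URLs yourself. A minimal sketch in Python; the `q` parameter and the `-dkjsahfdsaf` exclusion term come from the steps above, while the helper name and the idea of comparing the two pages side by side are mine:

```python
from urllib.parse import urlencode

def oop_check_urls(phrase, gibberish="dkjsahfdsaf"):
    """Build two Google search URLs for the penalty check described above:
    the normal query, and the same query with a nonsense exclusion term
    appended, which (per the theory) bypasses the OOP filter."""
    base = "https://www.google.com/search?"
    normal = base + urlencode({"q": phrase})
    unfiltered = base + urlencode({"q": f"{phrase} -{gibberish}"})
    return normal, unfiltered

normal, unfiltered = oop_check_urls("florida web designer")
print(normal)
print(unfiltered)
```

Open both URLs in a browser: if your site appears in the second result set but not the first, that is consistent with the OOP being applied to that term.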
The Basics of SEO Redefined. Should One De-Optimize?
Search engine optimization consultants such as myself have known for years that the basics of SEO are:
- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links
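The on-page items in the list above can be spot-checked mechanically. A rough sketch, assuming regex-based scraping of the title, H1, and body (a simplification; real pages deserve a proper HTML parser, and the function name is my own):

```python
import re

def seo_basics_report(html, keyword):
    """Rough check of a page against the on-page basics listed above:
    keyword in the title, keyword in an H1, and keyword count in the text."""
    kw = re.escape(keyword.lower())
    doc = html.lower()
    title = re.search(r"<title>(.*?)</title>", doc, re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", doc, re.S)
    # Strip all tags to approximate the visible text of the page.
    body_text = re.sub(r"<[^>]+>", " ", doc)
    return {
        "keyword_in_title": bool(title and re.search(kw, title.group(1))),
        "keyword_in_h1": bool(h1 and re.search(kw, h1.group(1))),
        "keyword_count": len(re.findall(kw, body_text)),
    }

page = ("<html><head><title>Florida Web Designer</title></head>"
        "<body><h1>Florida Web Designer</h1>"
        "<p>A designer in Florida.</p></body></html>")
print(seo_basics_report(page, "designer"))
```

This only covers the on-page items; link popularity and incoming link text cannot be checked from the page itself.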
Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content whose owners have taken the time to build links properly.
So if you have been affected, what can you do? Should one de-optimize their site, or wait it out? Should one create one site for Google and one for the 'normal engines?' Is this a case of a filter being turned on too tight, one that Google will fix in a matter of days, or something much more?
These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:
1. Contact a few of your link partners via email. Ask them to change their link text so that the keyword you have been penalized for is no longer in it, or so that the keyphrase appears in a different word order than the one you are penalized for.
2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep it under 5 occurrences for every 100 words on the page.
3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."
It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
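The "under 5 per 100 words" rule of thumb in step 2 can be checked with a short script. A sketch; the threshold is the article's, while the word-splitting heuristic and function name are my own assumptions:

```python
import re

def keyword_density(text, keyword):
    """Count keyword occurrences per 100 words of page copy."""
    words = re.findall(r"[A-Za-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower()
    hits = sum(1 for w in words if w == kw)
    return 100.0 * hits / len(words)

copy = " ".join(["designer services in florida"] * 25)  # 100 words of copy
print(keyword_density(copy, "designer"))  # 25 per 100 words, far over the cap
```

A result above 5.0 exceeds the cap suggested in step 2; note this counts exact single-word matches only, so keyphrases would need each word checked separately.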
Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.
A second theory, which has gained credence within the industry in recent days, is that in preparation for its Initial Public Offering (possibly this spring), Google has developed a way to increase its revenue. How? By removing many of the sites optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.
Perhaps both of these reasons came into play. Perhaps Google execs thought they could:
1) improve the quality of their rankings;
2) remove many of the 'spammy' low-quality sites;
3) because of #2, increase AdWords revenues; and
4) because of better results and more revenue, have a better chance at a successful IPO.
Sadly for Google, this plan had a fatal flaw.
What Google Should Do
While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:
1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;
2. Reduce the weight of OOP;
3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and
4. Establish an appeal process through which site owners who feel they have followed all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.
When this recent update broke on November 15, webmasters flocked by the thousands to industry forums such as webmasterworld.com. The mis-update was quickly titled "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be corrected within 3-4 days, so everyone should stay put and wait for Google to 'fix itself.' While the rankings are still dancing, that fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.
If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, AlltheWeb/FAST, and AltaVista, it will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.
And the filter/feature is being handled by a 900lb gorilla, when it should be handled by a surgeon.
>>doing Google's job
I'm sorry about that... i didn't intend to sound arrogant. It was quite literal, meaning that Google should really be expected to explain such things for us webmasters so that we didn't have to go through all that effort to find out for ourselves.
I don't think Google is very good at communicating and explaining what they're doing - perhaps it's for fear of being exploited, and that's understandable, but with changes as large as these last ones they should really be posting press releases and/or rewriting their help section. Google should plainly inform me so that i don't have to spend my precious time trying to understand their business instead of my own.
For the SEO part of my business (which is only a part, believe it or not), i don't mind their secrecy. I don't want them to disclose information about what makes one site rank higher than another (sidenote: that would be the end of SEO) but i do want full information about how their engine works for me as a searcher, so that i don't think i'm doing one thing, when in fact i'm doing another. Also, as a webmaster, i appreciate their "do and don't" guidelines, but i'd like to see them enforced as well as published.
So, to sum it up - i'm really sorry if i sounded arrogant, that was not my intention. I only intended to state that i was not very satisfied with the way Google had done their own job in explaining things.
/claus
[edited by: superscript at 3:35 pm (utc) on Dec. 5, 2003]
As far as the algo goes, G's algo is one of the most exposed ones, with numerous papers and patents etc. And, simply, how they tinker with their algo is no one's business but theirs.
Making me think.
I'm in the UK too. I am now getting so many hits on sub-pages and with longer search strings that my sales look much the same. They may even get better than before because my competitor's index pages, like mine, are off the radar on a simple search - and my site has more content.
It still riles me though, that a simple, generic KW1 KW2 throws up so much junk, when what was listed before was really pretty good.
I've spotted a new type of spam though - which may have helped some sites through Florida. I think I can see what is behind the technique, so I'm testing it out in a legitimate and open way.
--------
But if I were penalized wouldn't my homepage be taken out too? What do you think I did wrong on my inner pages? I am squeaky clean, no hidden text, nothing shady, just parsed the site. Could it be because of no 301 redirect?
This is what Peter Norvig, Google's Director of Quality, has said. Try:
[webmasterworld.com...] (msg #:164 by James_Dale) has a link to it.
While I certainly intend to add more content, the fact of the matter is that the primary thing my visitors want to see is housing inventory. That's where 98% of the leads are generated. Adding pages and pages of content on all aspects of housing is doable if that is what it takes to satisfy Google, but what happened to "build it for your user"?
[i.e. in the last 5 minutes ;) ]
on inbound link "penalties"?
('penalties' in quotes because I don't believe they exist -just relative downgrading of importance)
The reason I ask is that many large sites are auto-generated to some extent - might a highly targeted inbound link coming from all 700 pages of a *single* site trigger a "penalty"?
Doubtful. I know of several sites (not mine) that link from every one of their pages to each others home page. Very targeted anchor text as well. They have all survived being displaced by the authority/hubs while their pre-florida top 10 competitors are buried.
Hmmm, very good observation. I target a 3-keyword phrase, with the first 2 words forming the most competitive phrase, and I have noticed a ranking boost for the last 2 keywords as separate terms. In other words, both of those terms are competitive terms on their own, but I am targeting the first 2 as a phrase. All three terms are in my anchor text in what used to be the correct sequence for ranking. I have been dropped like everyone else on the 3-word phrase and on the first 2-word phrase. An experiment in separating all of the terms into separate anchor text may work. I have also noticed a mixture of these first 2 keywords in the current serps (sites seem to contain info on one or the other, but rarely both together), as if Google is ranking both, which would support the broad matching. I just do not know for sure right now. I am going to be patient and wait until after the next deep crawl, and continue with observations before I make any decisions.
Interesting reply....
Do you suppose that the stories now hitting the press have simply materialized from thin air? They must simply be coming from disgruntled SEO professionals only. I mean THAT would be a BIG story... right?
Do you suppose that all of the small businesses that are affected have not voiced their opinion? Is it possible that the full effects of this have not been realized by joe user yet? Any possibility that joe wouldn't immediately turn to the mainstream press when they see poor serps? I think it's possible that many "joe users" are seeing poor serps and will respond by searching elsewhere... just my opinion of course.... I am allowed that... right?
Oh, I want to add that it seems that you are severely outnumbered in your thinking that Google has no problems....
But, of course, it's MY reality that is skewed.... right?
Back on topic...
I'm seeing a continuing slide in the serps for large cities. In checking yesterday there were several large cities with what I have come to expect as "relevant" serps. Today the serps for those cities have changed. Apparently the cities are being hit by size, perhaps, and not by alphabet as was suggested earlier. It seems that the filter is rolling across the largest cities first. This seems to be the case for only certain KW's though. The sites that were there yesterday have been replaced by directories, educational pages that mention the topic, sites that are mostly repeated KW's, and a few big players...
Very odd.... I wonder how far this will go? Anyone else seeing this?
Forget well-written, clear, and concise content. What works now for these searches is a free-for-all cornucopia of info and outbound links, all with no particular focus.
I'm seeing a continuing slide in the serps for large cities.
Very odd.... I wonder how far this will go? Anyone else seeing this?
____
I can confirm that. I watch the travel related serps about 12 hrs a day. The big cities were hit first, and now the smaller ones are being taken out one by one.
The only pattern I can see is that city size seems to matter.