On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems to be more like a drunken Mexican salsa than its usual conservative fox-trot.
Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While a drop of a few positions would be understandable, since November 15 the sites that previously held these rankings are nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding a need to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.
What the Early Research is Showing
From what early research shows, it seems that Google has put into place what has been quickly termed in the industry as an 'Over Optimization Penalty' (OOP) that takes into account the incoming link text and the on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense Google is penalizing sites for being optimized for the search engines--without any forewarning of a change in policy.
Here is what else we know:
- The OOP is keyword-specific, not site-specific. Google has selected only certain keywords to apply the OOP for.
- Certain highly competitive keywords have lost many of the listings.
How to Know if Your Site Has Been Penalized
There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around November 15, you have likely been hit. Here are ways to be sure:
1. Go to google.com and type in any search term you recall being well ranked for (check your site logs to see which terms sent you search engine traffic). If your site is nowhere to be found, it has likely been penalized.
2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This removes the OOP, and you will see what your results would be without it.
3. Or, simply go to www.**** to have this automated for you. Just type in the search term and see quickly what the search engine results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5000 most visited web sites on the Internet in a matter of days.
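The manual test in step 2 can be sketched in code. The sketch below only builds the two query URLs and compares two result lists; the URL format and helper names are illustrative assumptions, and actually fetching Google results programmatically is left out (it was against Google's terms of service even then).

```python
from urllib.parse import quote_plus

def serp_urls(query: str, gibberish: str = "dkjsahfdsaf") -> tuple:
    """Build the two query URLs for the manual filter test: the normal
    search, and the same search with a nonsense term excluded, which
    (per the post) bypasses the suspected OOP filter."""
    base = "http://www.google.com/search?q="
    return (base + quote_plus(query),
            base + quote_plus(query + " -" + gibberish))

def likely_filtered(normal_results, unfiltered_results, site):
    """If the site shows up only when the nonsense exclusion is added,
    the filter is probably in effect for that query."""
    return site in unfiltered_results and site not in normal_results
```

Running both queries by hand and diffing the result lists is the same comparison the automated site mentioned in step 3 performs.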
The Basics of SEO Redefined. Should One De-Optimize?
Search engine optimization consultants such as myself have known for years that the basics of SEO are:
- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links
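As a rough illustration of the on-page items in this list, here is a small Python sketch that checks a page against a few of them. The regex-based parsing and the 5-10 repetition window come straight from the list above; a real audit would use a proper HTML parser rather than regular expressions.

```python
import re

def seo_checklist(html: str, keyword: str) -> dict:
    """Rough regex-based check of some of the on-page basics listed
    above. Illustrative only -- not a full audit."""
    kw = re.escape(keyword.lower())
    page = html.lower()

    def tag_contains(pattern: str) -> bool:
        # Does the keyword appear inside the first match of this tag?
        m = re.search(pattern, page, re.DOTALL)
        return bool(m and re.search(kw, m.group(1)))

    mentions = len(re.findall(kw, page))
    return {
        "in_title": tag_contains(r"<title>(.*?)</title>"),
        "in_h1": tag_contains(r"<h1[^>]*>(.*?)</h1>"),
        "in_alt": bool(re.search(r'alt="[^"]*' + kw, page)),
        "repeated_5_to_10_times": 5 <= mentions <= 10,
    }
```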
Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is surely no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a differentiation must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to properly build links.
So if you have been affected, what can you do? Should one de-optimize one's site, or wait it out? Should one create one site for Google and one for the 'normal engines'? Is this a case of a filter being turned on too tight that Google will fix in a matter of days, or something much more?
These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:
1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in it, or so that the words of the keyphrase appear in a different order than the one you are penalized for.
2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep it under five occurrences for every 100 words on the page.
3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."
It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
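For step 2, the five-per-100-words guideline is easy to check mechanically. The sketch below is illustrative only; the tokenizer and the exact threshold are assumptions drawn from the guideline above, not anything Google has published.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Mentions of the keyword per 100 words of page text."""
    words = re.findall(r"[A-Za-z0-9']+", text)
    if not words:
        return 0.0
    mentions = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return 100.0 * mentions / len(words)

def needs_de_optimization(text: str, keyword: str, limit: float = 5.0) -> bool:
    """True if the page exceeds the suggested ceiling of roughly
    five mentions per 100 words (step 2 above)."""
    return keyword_density(text, keyword) > limit
```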
Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.
A second theory, which has gained credence in the past days within the industry, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.
Perhaps both of these reasons came into play. Perhaps Google execs thought they could
1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues, and
4) because of better results and more revenue, have a better chance at a successful IPO.
Sadly for Google, this plan had a fatal flaw.
What Google Should Do
While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:
1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;
2. Reduce the weight of the OOP;
3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and
4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.
When this recent update broke on November 15, webmasters flocked by the thousands to industry forums such as webmasterworld.com. The update was quickly titled "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, so everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.
If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With its recent acquisition of Inktomi, Alltheweb/FAST, and Altavista, Yahoo most likely will soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.
If you read what the OP stated: "Google has selected only certain keywords to apply the OOP for." Is it possible these shopping malls and directories aren't using penalized keywords? Also, this list of keywords would have to be hand-created; otherwise there would be too many cases of brand.com getting penalized for "brand". These sites typically have a huge amount of inbound anchor text containing "brand". Thus, when creating this keyword list, Google may have been looking for cases where they wouldn't hurt well-known, clean sites, like shopping malls and directories. They may have even whitelisted the ones they knew about so as not to cause collateral damage.
Every page I have has our company name, address, phone number, etc. on it so that people know who and where we are. This also apparently seems to be one of their KW filters. Having your name and address on at least one page was a recommendation by Yahoo years ago, and it makes good sense even if G doesn't think so. So are they such a monopoly now that they can dictate what my business principles should be and how I run my company? Isn't that what M$ got into so much hot water for? And if they did anything to the results to increase ad revenues, they may have big problems with the FTC just for openers.
These are some queries that are penalized, according to the website that most of the conspiracy advocates seem to use. Below are real queries from November 30:
It is quite easy to find other similarly nonsensical "banned" phrases. Now, do any of you seriously believe that any of these keyword phrases return "too optimised" sites? Do they seem commercial?
What you think you are seeing is in fact something else. The terms you think are "banned" seem, instead, to be terms that (return a range of pages that do not) qualify for a "broad match". The exact nature of this broad match is what you should be reflecting on instead of seeking conspiracies.
Added: sorry if I'm sounding rude; it annoys me to see stuff like this repeated when it is so easy to see it's not true.
[edited by: claus at 5:48 pm (utc) on Dec. 3, 2003]
Actually I am torn between the commercial kw thingy, and the notion that they are looking for very accurate matches with pages
There's something in this - I'm sunk for most 2 KW phrases, but pop up fine right at the top of the SERPs for more specific 3 KW phrases. It has certainly hurt my revenue, but customers are still managing to find my products. This is why it's so hard to figure out whether I have been 'penalised' as such.
Say a company has a main domain A and a secondary domain B. B has many links to A, all featuring the same phrase. "Blue Widgets in X" where X may be the name of a state etc etc.
If it is inward link related then it could be an issue. What's to stop me now setting up multiple links to my competitor featuring the phrase? That's what makes me hesitate to say that it is this alone....
I know that the top site in the category I'm looking at has a two-keyword density in excess of 38%, so I don't believe it is KW density alone.
I'm suspicious of this link problem thingy - if the text used in the links is a culprit, how come so many shopping malls and directories are listing for top KWs?

Just a complete guess out of nowhere, but what if it had something to do with excessive link text as well? Since most links are fairly short, this would indicate "keyword stuffing" on link exchanges if you see the same long link text showing up time and again. For example, I've been requesting that people who link to our site use link text like "KeyPhrase 1 from OurCompany - Since 1956" (depending on the page they're linking to); however, I've seen people request that links to them use link text like "TheirCompany - key phrase 1, key phrase 2, and key phrase 3" or something to that effect.
Of course, if it is a link text issue, or even an issue of how people link to you, fixing it is going to be no easy task if you've got a whole lot of links. Then again, it's always been said you can't be penalized for people linking to you, simply because this would allow people to sabotage you.
[edited by: DylanW at 5:57 pm (utc) on Dec. 3, 2003]
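The anchor-text repetition idea discussed above amounts to simple bookkeeping over a list of inbound link texts. The sketch below is purely an illustration of the speculation in this thread, not a description of anything Google is known to do; the threshold is an arbitrary example.

```python
from collections import Counter

def repeated_anchor_texts(anchors, min_repeats=3):
    """Count identical inbound anchor texts after normalizing case and
    whitespace; heavy repetition of one exact phrase is what posters
    here speculate might look like 'keyword stuffing' on link exchanges."""
    counts = Counter(a.strip().lower() for a in anchors)
    return {text: n for text, n in counts.items() if n >= min_repeats}
```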
I don’t think it is an issue of that as much as G has made a decision on what is a broad match, and sometimes their theory isn’t correct. Sometimes a set of KWs MUST be used together for the context to make sense to a human reader. I am also convinced of this because not only have my hits gone down, but people are now looking for us by my name or our company name in G, and the number of hits from other SEs is starting to go up, as is traffic (though very, very slowly). Based on my two weeks or so of data it does seem to be a trend, but it is too soon to call. So G may NOT be supplying end users with what they are looking for now, at least based on one set of two-word phrases. Personally, I feel I almost have enough data and statistics to turn it over to the FTC for an investigation.
claus makes a convincing argument. 70 out of the top 100 sites for "foo bar" have disappeared in this last update? That makes no sense at all on the theory there is an over-optimization penalty, or a "commercial keyword list" Google is going after. The evidence claus presents points more in the direction of some major shift in the algo.
(Edit: and potentially mess up the listings of competing search engines if webmasters react to it by reducing keyword density.)
[edited by: superscript at 6:12 pm (utc) on Dec. 3, 2003]
Here are my observations, based on some of my key phrases, using the method outlined in the original post.
1) On some I am doing better because some spammy sites have been removed. Hoorah for me but not very interesting.
2) On one major key phrase I am doing the same as before, but it is different sites that are above and below me. What this shows is that it is not simply a case of certain sites being "penalised" or not. If my site were "penalised" it wouldn't be #6 out of about 6 million, as it was before. However, some other sites have been promoted ahead of it, to replace the ones that have been "penalised". If there is an OOP then it is on some kind of sliding scale rather than a binary on-off thing. Which would make some kind of sense given Google's general approach to these things. Incidentally, in this case at least some of the sites "removed"/"penalised" weren't obvious stuffers.
[edited by: HenryUK at 6:15 pm (utc) on Dec. 3, 2003]
Point taken, but the SERPs are a bit of a mess, there's a lot of crawling to do. It's possible some sites have slipped through the net. Certainly low keyword density looks like a major factor to me. The only competitor of mine who hasn't budged an inch in the SERPs and is still at number one, has only a single mention of the main keyword in the body text of his index page.
I think this is a sound explanation.
Both the 'over-SEO' and 'commercial keyword list' arguments have too many proven exceptions to be valid.
Combine an algo shift and a shake-up in the Google directory, and does that result in Florida?
That should settle the question of whether it is anchor text related.
800,000+ links to them with identical kw1 kw2 anchor text ought to have some impact if that is a criterion.
It looks like he's got a very large number of low ranking, but genuine and organic inbound links.
[edited by: superscript at 6:24 pm (utc) on Dec. 3, 2003]