by Ryan Allis
On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every two months or so, this 'Dance' seems to be more like a drunken Mexican salsa than its usual conservative fox-trot.
Most likely, you will already know if your web site has been affected; you may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. One could understand dropping down a few positions, but since November 15 the sites that previously held these rankings are nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are being forced to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.
What the Early Research is Showing
Early research suggests that Google has put into place what has quickly been termed in the industry an 'Over Optimization Penalty' (OOP), which takes into account incoming link text and on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense, Google is penalizing sites for being optimized for the search engines, without any forewarning of a change in policy.
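To make the reported mechanism concrete, here is a purely illustrative Python sketch of the behavior described above. Nobody outside Google knows the real thresholds or scoring; the numbers and the logic here are invented for illustration only.

import re
from collections import Counter

ON_PAGE_THRESHOLD = 10     # invented: max repetitions of a word on the page
ANCHOR_SHARE_THRESHOLD = 0.5   # invented: max share of inbound links using the word

def triggers_oop(page_text, anchor_texts):
    # Count each word on the page.
    counts = Counter(re.findall(r"[a-z0-9']+", page_text.lower()))
    for word, count in counts.items():
        if count <= ON_PAGE_THRESHOLD:
            continue
        # Share of inbound links whose anchor text contains the word.
        share = sum(word in a.lower().split() for a in anchor_texts) / max(len(anchor_texts), 1)
        if share > ANCHOR_SHARE_THRESHOLD:
            return True    # heavy on-page repetition plus dominant anchor text
    return False

# Example: 12 on-page uses of "widgets" plus mostly-matching anchors.
page = " ".join(["widgets"] * 12 + ["we sell quality products"])
anchors = ["cheap widgets", "buy widgets", "widgets online", "home"]
print(triggers_oop(page, anchors))   # True under these invented thresholds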
Here is what else we know:
- The OOP is keyword specific, not site specific. Google appears to have selected only certain keywords to which the OOP applies.
- Certain highly competitive keywords have lost many of their former listings.
How to Know if Your Site Has Been Penalized
There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you have likely been hit. Here are ways to be sure:
1. Go to google.com. Type in any search term you recall being well-ranked for. Check your site logs to see which terms you received search engine traffic from. If your site is nowhere to be found, it has likely been penalized.
2. Type in the search term you suspect being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). Excluding a nonsense term appears to disable the OOP, so these results approximate what your rankings would be without the penalty. (A short sketch for building these queries by hand follows this list.)
3. Or, simply go to www.**** to have this automated for you. Just type in the search term and see quickly what the search engine results would be if the OOP was not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5000 most visited web sites on the Internet in a matter of days.
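If you would rather not retype the gibberish each time, here is a minimal Python sketch that builds both query URLs for manual comparison. The keyword and the gibberish token are placeholders, and it deliberately stops short of fetching results, since Google discourages automated queries.

from urllib.parse import quote_plus

keyword = "florida web designer"   # replace with the term you suspect is penalized
gibberish = "dkjsahfdsaf"          # any nonsense word works for the exclusion

base = "http://www.google.com/search?q="
print("With the filter:   ", base + quote_plus(keyword))
print("Without the filter:", base + quote_plus(keyword + " -" + gibberish))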
The Basics of SEO Redefined. Should One De-Optimize?
Search engine optimization consultants such as myself have known for years that the basics of SEO are:
- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links
Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to build links properly.
So if you have been affected, what can you do? Should one de-optimize their site, or wait it out? Should one create one site for Google and one for the 'normal engines'? Is this a case of a filter being turned on too tight that Google will fix in a matter of days, or something much more?
These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:
1. Contact a few of your link partners via email. Ask them to change the link text so that it no longer contains the keyword you have been penalized for, or so that the words of the keyphrase appear in a different order than the one you are penalized for.
2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep the number under 5 occurrences for every 100 words on your page. (A rough density check is sketched below.)
3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."
It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
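As a rough aid for step 2, here is a minimal Python sketch that counts how often a phrase appears per 100 words of page copy. The file name and phrase are placeholders, and the tokenizing is deliberately naive (no HTML stripping), so treat the number as a ballpark only.

import re

def density_per_100(text, phrase):
    # Whole-word occurrences of the phrase per 100 words of text.
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = re.findall(r"[a-z0-9']+", phrase.lower())
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits / max(len(words), 1)

page_text = open("homepage.txt").read()   # your page copy, HTML already stripped
d = density_per_100(page_text, "florida web designer")
print("%.1f occurrences per 100 words" % d)
if d >= 5:
    print("Above the suggested ceiling; consider trimming.")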
Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing from its results many of the low-quality web sites that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results, however, point to another potential answer.
A second theory, which has gained credence within the industry in the past days, is that in preparation for its Initial Public Offering (possibly this spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.
Perhaps both of these reasons came into play. Perhaps Google execs thought they could:
1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) as a result of #2, increase AdWords revenues, and
4) with better results and more revenue, have a better chance at a successful IPO.
Sadly for Google, this plan had a fatal flaw.
What Google Should Do
While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:
1. Post a communiqué on its web site explaining in as much detail as it is able what it has done and what it is doing to fix it;
2. Reduce the weight of the OOP;
3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and
4. Establish an appeal process through which site owners who feel they are following all rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.
When this recent update broke on November 15, webmasters flocked by the thousands to industry forums such as webmasterworld.com. The update was quickly dubbed the "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, and that everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.
If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With its recent acquisitions of Inktomi, AlltheWeb/FAST, and AltaVista, Yahoo will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.
I think the word "filter" is crudely used here, considering you are dealing with a database of billions of documents and that Google returns search results in a fraction of a second. I am sure they wouldn't use anything that isn't automated, and word filtering (as in AdWords broad match, etc.) wouldn't stand up to the variety of search queries handled. I am more inclined towards sophisticated text analysis, considering Applied Semantics is under Google's roof now. Try this: http://azeem.azhar.co.uk/archives/000571.php (no, it's not my link and I have nothing to do with it)
I am more inclined towards sophisticated text analysis, considering Applied Semantics is under Google's roof now
I wish this were the case, because then the focus would be on relevant, written-for-the-user content, but I don't think so at the moment. It doesn't explain the weather sites, link directories, the serps for shelving mentioned earlier, or the page about house rabbits that shows up for 'city homes'.
But perhaps they're using unsophisticated text analysis.
After all these posts, nobody complaining about the quality of the serps (rather than a lost site) has had the courage to make the case that the old pre-florida anchor-text-only method of ranking was a good one. No one makes that case because the idea is plainly absurd. Just because you make 1,000 more anchor-text links to one of your pages does not mean that the page has magically become higher quality. allinanchor: or -ysystsr might coincidentally show a good site not ranked otherwise, but the idea that search results should be ranked by volume of anchor text is plainly indefensible.
Anyone who advocates using -usrtsrse to get the spammy results might as well just type in allinanchor:keyword as their modifier instead, and then try to justify the claim that "volume of anchor text is the best algorithm."
We have a modest variety of anchor text. Using the -sdfds -sfsdf method shows us at #4, and using allinanchor shows us at about 70. It's not the same thing.
to get the spammy results
pretty biased opinion here.
Sorry to hear that. Open your mind a little more.
The -systsrs qualifier is just similar to allinanchor, not exactly the same. Some sites with a great deal of other positive factors managed to rank well pre-florida without much anchor text, but there is no denying that for pre-florida competitive searches the results closely paralleled allinanchor.
For a 3-keyword search I watch, none of the top 30 sites used any obviously deceptive techniques, no affiliate sites, just clean sites that no longer exist in the top 1000. They are right back in their former positions using -ewr -ewwef
edit: just read this
The -systsrs qualifier is just similar to allinanchor, not exactly the same. Some sites with a great deal of other positive factors managed to rank well pre-florida without much anchor text, but there is no denying that for pre-florida competitive searches the results closely paralleled allinanchor
I agree
True, but I am seeing dramatically different allinanchor results now that I doubt are accurate. The same directory-type pages that now show up high are also doing the same for allinanchor.
I have one site that had its backlinks updated, and the number doubled. Most had the same anchor text. Yet this site has now dropped from #2 for allinanchor to #39? I seriously doubt the current accuracy of allinanchor.
"I just think your statement was a blanket statement that is not 100% true and is biased because you have unfortunately been subjected to spammy sites"
Sure, my main hyper-competitive niche has been awash in spam, and the improvement post-florida is almost inspirational (although still very far from perfect). There is also no doubt that many good sites have been affected by florida. Two sites belonging to friends of mine are classic examples of how (totally benign) duplicate content can sink a site post-florida, temporarily at least.
However, I can't see how anybody can make the case that the post-florida serps are not less spammy. The least relevant results for any term are now much more likely to be off-topicish (that Iowa shelving page) or directory-ish (link pages). These aren't spammy. These are just not great results. No-content spam still exists, but in volume it is less. If someone wants to make the case that "spam" has sometimes been replaced by "lightweight off-topic," fine. But I for one consider that a drastic improvement. Give me that Iowa shelving construction page any day over a doorway to a doorway of a doorway with 10,000 anchor-text links pointing at it from domains owned by the same entity.
Florida, or Galen if you prefer, just needs to get a bit more niche-relevant, solve the duplicate-content problems, and maybe some other ones. The basic results, though, are less spammy, and the root of the algorithm is a dramatic improvement.
Allintitle, allintext and allinanchor results are drastically different than pre-florida. Why would that be?
When I ran this test on my "disaster" search term, the top ten were pretty much the pre-Florida ten in each case. There is a bit of shuffling, but the pre-Florida top ten appear in the top ten whichever filter I used.
Take a look at what the filters do to the search here
[google.com...]
and in this discussion [webmasterworld.com...]
Basically, each of these filters limits the search to one factor on the page; for example, allintitle: looks for the search term in the title. In my market, I guess the top ten are all pretty well optimised for each of the main ranking factors for that particular search phrase. In your market it looks like things are different; possibly sites are not so "well" optimised.
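For anyone who wants to repeat the test, here is a small Python sketch that builds the three single-factor queries; "blue widgets" is a placeholder term.

from urllib.parse import quote_plus

term = "blue widgets"   # placeholder search term
for op in ("allintitle", "allintext", "allinanchor"):
    # Print one ready-to-paste Google query URL per operator.
    print(op.ljust(12), "http://www.google.com/search?q=" + quote_plus(op + ":" + term))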
Best wishes
Sid
PS I wonder how many of the folks who contributed to this thread are busy eating their words now [webmasterworld.com...]
I honestly think we should be thinking along different lines.
pre-florida, there was the perception that outbound links should be used sparingly because they would "leak" pagerank.
post-florida, if the directory / authority argument carries weight (which I believe it does - how else can you explain sites placing well in the SERPS solely on the basis of their outgoing anchor text?), it would appear that outbound links to quality, related sites can help your position in the SERPS.
if this holds up, this is a major shift - and one for the better IMHO. it had always seemed odd, and somewhat mean-spirited of google to discourage sites from linking to each other by brandishing the stick of leaking pagerank.
I have 4 well ranked sites. Only one of my sites was affected by this latest update. All four sites are very similar in layout and design (we sell our products to 4 different markets).
The site that fell off the face of the earth has a PR 6. It has 176 quality backlinks. It is in a very under-optimized field with very little real competition.
I tried the -dsghhh trick and found that only one of the two most-used keyword phrases appears to be penalized, although my site doesn't show up under either. It was #3 for both last month.
I had the "penalized" keyword phrase repeated a lot in one section of my site. My feeling is that the penalization of this particular site was because of that repeated phrase. None of my other sites have that component.
I have removed the repeated keywords. We'll see what happens next month!
This is the only idea that has made sense to me in terms of what we know about Google: Google is full of genius and strives for the best possible results.
Google's apparent contentment with the current serps only makes sense if this is just step 1: finding the 'authority sites' (and tweaking the algorithm to get rid of sites that just look like authority sites).
This is the only way I can see for Google to eliminate the affiliate spammers while keeping the good sites. As Make Me Top says:
If this algo is being used, then it is its own built-in 'spam' filter, because nobody who would qualify as an expert would link to a site that does not fit their content. Therefore there is no penalty, just a new algo that washes you out of the mix if you don't fit in. And what you are seeing is the middle of the changeover.
I also think Google's silence and the timing of this 'update' are no accident: I'm sure they were well aware that a lot of sites would do damage control during this buying season and turn to AdWords (again, they aren't stupid!).
If all this is true, then Google is brilliant. And mean!
Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
It seems as if Google is contradicting itself with this update. The biggest problem right now is that the words users would type to find our pages are exactly the words that Google is punishing us for having on our site. It is the words that users would NOT normally type to find our pages that Google is not filtering / punishing / ignoring / whatever phrase you want to use here.
My website is named "The Widget A and Widget B Society"
My 'Widget A' term has 3.7 million results in Google
My 'Widget B' term has 1.3 million results in Google
Previously I was ranked 78th for the 'Widget A' term and 31st for the 'Widget B' term, which is reasonable.
After Florida I am ranked 34th for the 'Widget A' term and 80th for the 'Widget B' term. I don't know why this has happened but am happy that I have moved higher in the rankings for a competitive term.
One possible explanation is that Google is using the ordering of the link text, meaning that the first term in the link text is more important than the second term, and the second term is more important than the third term, and so on.
Also, I have completely deleted all copy from the front page of my site, including the introduction message and any text that might give Google the impression of 'keyword stuffing'. My front page is now simply a logo and a collection of links.
Has anyone else had similar experiences?
We do nothing shady. I have keywords in the title, the URLs include the keyword, and I have keywords in the header and then in the body. Here is how my links look:
[widgets.com...]
Can anyone offer guidance as to why only my site's homepage is now in the index?
How can I tell if other pages are indexed besides searching for certain phrases within them?
Thanks so much!
Replace 'keywords' with the terms you expect your 'penalised' pages to show up for, and replace example.com with your domain. If you find them, it's the OOP penalty; if they're not found at all, you're in trouble. ;-(
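The check being described is presumably Google's site: restriction. Here is a small Python sketch along the same lines; the domain and terms are placeholders.

from urllib.parse import quote_plus

domain = "example.com"     # your domain
terms = "blue widgets"     # terms a penalized page should rank for

base = "http://www.google.com/search?q="
print("Everything indexed:", base + quote_plus("site:" + domain))
print("Pages with terms:  ", base + quote_plus("site:" + domain + " " + terms))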
"New sites, changes to existing sites, and dead links will all be noted in the course of the next crawl, which will be completed before the holiday season ends."
The detailed version can be found here:
groups.google.com/groups?dq=&hl=en&lr=&ie=UTF-8&group=google.public.support.general&selm=35849812.0312040716.360dd410%40posting.google.com
changes to existing sites, and dead links will all be noted in the course of the next crawl
We will see what happens to the ones de-optimizing. Place your bets.
Back to the buckshot - it still seems to me that there is more than one component at work here, although I personally have been preoccupied with getting an understanding of the "broad match," as that is the new thing, IMHO (still, for lack of better words). As these threads discuss all kinds of things, I have compiled a list.
These are my preliminary observations; they might not all be true, and I might have forgotten something as well. Also, some of these bullet points may be closely related (or even be the same thing). Order does not reflect importance:
That was about everything I could think of at the moment. I hope you'll find it useful, although I wouldn't recommend charging $95 for it. Also, I hope this might inspire some thoughts so that we can get the rest pinned down - any thoughts?
/claus