by Ryan Allis
On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems to be more like a drunken salsa than its usual conservative fox-trot.
Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. One could understand dropping a few positions, but since November 15 the sites that previously held top rankings are nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding themselves forced to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.
What the Early Research is Showing
From what early research shows, Google seems to have put into place what has quickly been termed in the industry an 'Over Optimization Penalty' (OOP) that takes into account the incoming link text and the on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense, Google is penalizing sites for being optimized for the search engines, without any forewarning of a change in policy.
Here is what else we know:
- The OOP is keyword specific, not site specific. Google has selected only certain keywords to which the OOP applies.
- Certain highly competitive keywords have lost many of their previous listings.
How to Know if Your Site Has Been Penalized
There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you have likely been hit. Here are ways to be sure:
1. Go to google.com. Type in any search term you recall being well-ranked for (check your site logs to see which terms sent you search engine traffic). If your site is nowhere to be found, it has likely been penalized.
2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This will remove the OOP and you should see what your results would otherwise be (a rough query-comparison sketch follows this list).
3. Or, simply go to www.**** to have this automated for you. Just type in the search term and see quickly what the search engine results would be if the OOP was not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5000 most visited web sites on the Internet in a matter of days.
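Purely as an illustration of step 2, here is a rough Python sketch of the comparison. The keyword, domain, and file handling are placeholders, Google may block or throttle automated requests, and the "is my domain in the HTML" test is deliberately crude, so treat this as a manual-check aid rather than a tool:

import urllib.parse
import urllib.request

KEYWORD = "florida web designer"   # term you suspect is being filtered (placeholder)
MY_DOMAIN = "www.example.com"      # your site (placeholder)
GIBBERISH = "-dkjsahfdsaf"         # nonsense exclusion said to bypass the filter

def serp_html(query):
    """Fetch the first Google result page for a query (may be refused for bots)."""
    url = "http://www.google.com/search?q=" + urllib.parse.quote_plus(query)
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    return urllib.request.urlopen(req).read().decode("latin-1", "replace")

normal = serp_html(KEYWORD)
filter_off = serp_html(KEYWORD + " " + GIBBERISH)

print("listed in the normal SERP:   ", MY_DOMAIN in normal)
print("listed with gibberish added: ", MY_DOMAIN in filter_off)
# If only the second line is True, the page is probably being suppressed by the
# filter for that term rather than simply unranked.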
The Basics of SEO Redefined. Should One De-Optimize?
Search engine optimization consultants such as myself have known for years that the basics of SEO are:
- put your target keyword or keyphrase in your title, meta-tags, and alt-tags (a rough check of these on-page basics is sketched after this list)
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links
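Purely as an illustration of the on-page items above, here is a naive Python sketch that checks a single page for the keyphrase in the title and H1 and for images missing alt text. The filename and keyphrase are placeholders, and the parsing is deliberately simple:

from html.parser import HTMLParser

class OnPageCheck(HTMLParser):
    """Very naive on-page check for one target keyphrase."""
    def __init__(self, keyphrase):
        super().__init__()
        self.keyphrase = keyphrase.lower()
        self.in_title = self.in_h1 = False
        self.title = ""
        self.h1 = ""
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "h1":
            self.in_h1 = True
        elif tag == "img" and not dict(attrs).get("alt"):
            self.missing_alt += 1      # alt attribute absent or empty

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
        elif tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data
        if self.in_h1:
            self.h1 += data

    def report(self):
        print("keyphrase in <title>:", self.keyphrase in self.title.lower())
        print("keyphrase in <h1>:   ", self.keyphrase in self.h1.lower())
        print("images missing alt:  ", self.missing_alt)

page = open("index.html", encoding="utf-8").read()   # placeholder filename
checker = OnPageCheck("widget insurance")             # placeholder keyphrase
checker.feed(page)
checker.report()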
Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a differentiation must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to properly build links.
So if you have been affected, what can you do? Should one de-optimize one's site, or wait it out? Should one create one site for Google and one for the 'normal engines'? Is this a case of a filter being turned on too tight that Google will fix in a matter of days, or something much more?
These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:
1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for no longer appears in it, or so that the keyphrase appears in a different word order than the one you are penalized for.
2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep the number under 5 times for every 100 words on the page (a small counting sketch follows this list).
3. If you are targeting a keyphrase (a multiple-word keyword), reduce the number of times your page uses the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer", change this text on your site to "web site designer in florida" and "florida-based web site design services."
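As a rough aid for steps 2 and 3, here is a small Python sketch that counts how often the exact keyphrase appears per 100 words of page copy. It assumes you have saved the visible text of the page to a file named page.txt (an assumption for the example), and the 5-per-100 figure is simply the threshold suggested above:

import re

KEYPHRASE = "florida web designer"   # exact phrase being checked (placeholder)
text = open("page.txt", encoding="utf-8").read().lower()   # plain-text page copy

words = re.findall(r"[a-z0-9']+", text)
exact_hits = len(re.findall(re.escape(KEYPHRASE), text))
density = 100.0 * exact_hits / max(len(words), 1)

print("words on page:            ", len(words))
print("exact-order phrase count: ", exact_hits)
print("occurrences per 100 words:", round(density, 2), "(suggestion above: under 5)")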
It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.
A second theory, which has gained credence in the past days within the industry, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.
Perhaps both of these reasons came into play. Perhaps Google execs thought they could
1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites
3) because of #2, increase AdWords revenues and
4) because of better results and more revenue have a better chance at a successful IPO.
Sadly for Google, this plan had a fatal flaw.
What Google Should Do
While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:
1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;
2. Reduce the weight of OOP;
3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and
4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.
When this recent update broke on November 15, webmasters flocked in the thousands to industry forums such as webmasterworld.com. The botched update was quickly dubbed the "Florida Update 2003", and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, and that everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High quality sites with lots of good content that have done everything right are being severely penalized.
If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, Alltheweb/FAST, and Altavista, it most likely will soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.
A couple of months ago I received a phone call from a company selling a service that was described to me as follows: For a fee a company can register a common name keyword. They claim to have 30,000,000 enabled browsers. When a user types the common name that the company has paid to register into the address bar of their browser, the company's website will immediately be displayed. They claim it also works this way on the major search engines including Google.
I would have posted the URL for this company but since this is my first post I am not sure if that is allowed.
It makes logical sense to me that this Florida update could be Google trying to combat this common name keyword company.
When a user types the common name that the company has paid to register into the address bar of their browser
That is where it all falls over! Users type keywords into search engines (mainly Google), not into their browser's address bar.
I have noticed the exact same thing. The sites that are showing up are loaded with links. Some of these sites have very little content.
Is it legal to link to my other web pages that have different products but are still within the same general field of business? Or is that crosslinking and thus illegal?
Not only do I try to sell products on my site but I also try to inform my customers. Because I am not a brick-and-mortar establishment, it is up to my customer to know exactly what they are purchasing. I certainly don't want any returns. I just don't see how having 30-40 links on my pages helps my customers find the products they are looking for.
But if links are what Google wants in order for me to show up, all they have to do is tell me. Every single website will become one giant link farm with a bit of info.
Well, after a couple of weeks away from WW it seems I've got something to be thankful for as the world's gone crazy with the Florida thing. A few months ago I was disabled from AdSense after the dreaded fraudulent clicks message, but this update has been good to me as thankfully I somehow got top 5 positions for my non-commercial site in good 5,000,000-result keywords. I've been working on SEO since January so hooray for my site, and rotten luck for a lot of others it seems, hope you guys can bounce back quickly.
I followed the gist of Brett's step-by-step guide starting December 20th 2002, and simply systematically got links and wrote content. I'm happy to say I've gone from 10-3000 users a day this year - Wahay! Cheers Brett and WW! Next year will be solid, solid free content, and now the links look after themselves more which is great - no more emails as they come to me which is wonderful. I'll let you guys know end of 2004 how year 2 of the magic content+links goes, and let's see if I can get 8000+ users a day in 2004 - here's hoping. Just my two cents worth, and one happy chappie in what must be an awful update for many.
Merry Christmas ;)
Jeremy
I too have employed the back-to-basics method: add text content pages daily, use meta tags correctly, and link all pages. This has sheltered us from the Florida update considerably. In fact, my main pages have now risen to #1 for my preferred search terms. Our traffic is now at 100,000 unique visits per month and climbing :o)
I hate to beat a dead horse, but I really believe there has been a site-relative variable added to the algo in addition to other changes. You can call it SiteRank or Broomhilda for all I care, but there has been a fundamental change. This would account for the increase of inner, irrelevant, and low-PR pages of directories/authorities/hubs dominating many industry SERPs. In addition, it would account for the prominence of .gov and .edu sites. After having PageRank bartered, bought, sold and generally bastardized, Google learned its lesson. I think this is their way of refining a variable that helped get them to where they are now.
Hhhmm... I've had a theory for some weeks now, and I feel it is very complementary to yours. I don't disagree with your theory about "SiteRank", but I want to add something: have you used the Toolbar v2? Have you seen the vote buttons? If not, you can enable them from Options / More.
For the ones who have not seen these buttons, I'll explain:
The latest versions of the Google Toolbar include two new buttons. One is a yellow :) smiley and the other is a blue :(
If you put the cursor over them, you can see the tooltip text "Vote for this page" or "Vote against this page". After clicking either of them, the browser's status bar will show a text message saying "Your vote on this page has been sent to Google".
What will Google do with this information? There are many possibilities:
Google could use this data to modify the PR of each page, making it depend not only on the links but also on surfers' opinions (a toy blending sketch follows this list).
The votes could be saved in a database in order to determine something like "SiteRank"
The votes could be used to create another value, with the intention of replacing PageRank.
Most probably, the votes are being saved in a database while Google decides what use to make of this info...
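Purely as speculation, to make the first possibility concrete: a toy Python sketch in which each page's link-based score is blended with its smiley/frowny vote ratio. Every number, the weight, and the blending formula are invented for illustration; nothing here is known to be what Google actually does with the votes.

# Hypothetical blend of a link-based score with toolbar votes (made-up data).
pages = {
    # url: (link_score, votes_for, votes_against)
    "example.com/a": (6.2, 130, 10),
    "example.com/b": (4.1, 40, 200),
    "example.com/c": (5.0, 5, 3),
}

VOTE_WEIGHT = 0.3   # how much the vote signal counts; pure guesswork

def blended(link_score, up, down):
    # Laplace-style smoothing so a handful of votes cannot dominate the score.
    vote_ratio = (up + 1.0) / (up + down + 2.0)
    return (1 - VOTE_WEIGHT) * link_score + VOTE_WEIGHT * 10.0 * vote_ratio

for url, (score, up, down) in sorted(
        pages.items(), key=lambda kv: -blended(*kv[1])):
    print(f"{url:16s} blended score = {blended(score, up, down):.2f}")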
I do not know what they are for, but the buttons are there... I suggest you all be prepared for anything. Maybe Florida has only been the beginning of something bigger...
There is no calm before the storm... only tempest
Greetings,
Herenvardö
You can call it SiteRank or Broomhilda for all I care, but there has been a fundamental change. This would account for the increase of inner, irrelevant, and low-PR pages of directories/authorities/hubs dominating many industry SERPs.
Question: What algorithm do we know about which:
1. Restricts the found set to a maximum of 1000
2. Throws out related pages from within the same IP range and/or above a similarity threshold, keeping only the best two in terms of rank within the initial set, before analysing the found set further; amongst other things, this amounts to a site factor.
3. Has three human-predetermined factors which can be turned up or down, by a process of trial and error, to vary the sensitivity of the algorithm, and which are based on assumptions about the motivations behind the interconnectivity of web pages that just don't work in competitive commercial markets.
Answer: OrfdoUdqn Where O=L
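For those who don't enjoy puzzles, here is a toy Python sketch of the kind of algorithm I am hinting at: take an initial result set, count the "local votes" each result gets from the other results (ignoring neighbours, here crudely defined as pages on the same host), and mix that into the original order. All the data, the same-host neighbourhood test, and the weight are my own guesses; the real tuning values are precisely the human-predetermined factors in (3).

# Toy reranking of an initial result set by local inter-connectivity.
# initial_results: ordered (url, host) pairs; links: (from_url, to_url) pairs
# drawn from those same result pages. Everything below is made up.

initial_results = [
    ("a.example.com/widget", "a.example.com"),
    ("b.example.org/ins",    "b.example.org"),
    ("c.example.net/",       "c.example.net"),
    ("a.example.com/quote",  "a.example.com"),
]
links = {
    ("b.example.org/ins",   "a.example.com/widget"),
    ("c.example.net/",      "a.example.com/widget"),
    ("a.example.com/quote", "a.example.com/widget"),  # same host: should not count
}

hosts = dict(initial_results)   # url -> host

def local_score(url, host):
    """Votes from other results, ignoring pages outside the set or on the same host."""
    score = 0
    for src, dst in links:
        if dst != url:
            continue
        src_host = hosts.get(src)
        if src_host is None or src_host == host:
            continue
        score += 1
    return score

LOCAL_WEIGHT = 2.0   # invented tuning knob ("turned up or down by trial and error")

reranked = sorted(
    initial_results,
    key=lambda item: initial_results.index(item) - LOCAL_WEIGHT * local_score(*item))

for pos, (url, host) in enumerate(reranked, 1):
    print(pos, url, "local votes:", local_score(url, host))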
Sid
Now, your (3)... that's something. I believe the two words I've typed second most in these threads (no. 1 being "florida update") are "outbound links".
I know it's not natural for commercial sites to link to the competition, although a few bold ones assured of their own quality do it, but still, there are so many other relevant pages to link to out there.
Consider it a benefit to those improving the web, and a slap on the wrist to those hoarding and selling PR if you like. I believe the latter is just about the only business that Google does not like (in fact, even the most dodgy of blackhat SEOs should be considered a minor problem in comparison).
/claus
In your review of the new patent back in July I seem to remember that one of the last things you said was that linking to your competition might become a fashionable thing (or something like that).
I've decided to assume that there is a large element of what is described in that patent being used post-Florida. I don't think it is the whole story, but I do think it is a direction worth exploring. So I'm busily working on linking with sites that I'm competing with in the SERPs for a particular keyword pair but not competing with in the real world.
Previously I would exchange links with good quality sites (measured subjectively and, to a lesser extent, objectively) that had links to the sites that I really do compete with. I also research my own and have developed a couple of sponsored sites on peripheral topics. Most of these have the term widgets in common with my main site. My main site sells widget insurance, so now I'm looking for ways to get links from good quality sites that are in the SERPs for widget insurance.
There are a couple of things that have started to deflect me a bit today and those are these puzzles.
1. I had backlinks from 4 sites, all on the same server. These are still there if I search for -fyfyfyf widget insurance, but all of them are completely gone from the 797 pages listed in the full SERPs.
2. If I go to the very last page of the result set I see the usual message:
"In order to show you the most relevant results, we have omitted some entries very similar to the 797 already displayed. If you like, you can repeat the search with the omitted results included."
Now when I get it to repeat the search with the omitted results included, I see exactly 1000, and on examining the SERPs the extra results are just multiple pages from sites that were listed the first time round. It is as though Google is finding 1000 pages and then restricting the number of pages returned from a site to two. It seems to be clustering the results, possibly by site (a toy grouping sketch follows).
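To make that concrete, here is a toy Python sketch of that kind of host clustering. The URLs and the limit of two results per host are just my guesses from what I am seeing in the SERPs, not anything confirmed by Google:

from collections import defaultdict
from urllib.parse import urlsplit

def crowd(results, per_host=2):
    """Keep at most per_host results from any one host, preserving order."""
    seen = defaultdict(int)
    kept, omitted = [], []
    for url in results:
        host = urlsplit(url).netloc
        if seen[host] < per_host:
            kept.append(url)
            seen[host] += 1
        else:
            omitted.append(url)   # shown only with "omitted results included"
    return kept, omitted

full_set = [                       # made-up ordered results
    "http://dir.example.com/widgets/uk",
    "http://dir.example.com/widgets/insurance",
    "http://dir.example.com/widgets/quotes",      # third page from the same host
    "http://shop.example.net/widget-insurance",
]
kept, omitted = crowd(full_set)
print(len(kept), "shown,", len(omitted), "omitted as 'very similar'")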
The new algo certainly seems to be doing some calculations on an initial found set of 1000 but I think that this initial set is not the same as the pre Florida results. Therefore there must be some other change in the algorithm which is used in selecting the 1000 to work on.
Looking at the top 2 results, the pages are both directories with a relatively high PageRank for our market. It is difficult to get a toolbar PR of 6 for a UK-only small niche product, but they have achieved it. Analysing a random sample of these pages' backlinks, they all seem to be from very similar related directories. It looks to me like they have grown their own PageRank. They use the term widget insurance once in their <title>, once in a navigation bar link, once in a link to an Espotting sign-up page, and the other 14 times that the term widget insurance is used are all in Espotting ads, the top one of which is mine. Of course the links on these ads go through a gateway and thus have no PR value.
So how do you compete with a page which Google suggests has only 15 links into it, all from within the same site, where the index page of that site has a PR of 7 from 4800 links, almost all if not all from within sites controlled by the same owner, and whose only content is Espotting ads?
Beats me, but if anyone has any answers that don't just go on about content and "write your pages for humans to view rather than for spiders to eat", I would very much like to hear them.
Best wishes
Sid
>> It is as though Google is finding 1000 pages and then restricting the number of pages returned
Perhaps I don't understand exactly what you mean, but... the 1,000 is not like it's carved in stone or anything; it might as well be 10 or 10,000. The thing with the max. 1,000 results in SERPs is the same as when I wrote those posts; that was the reason I chose that number (as far as I recall). I'm not sure Florida has changed anything there.
All you need is an "initial set" of results that match the query; this list of results will then be reordered. If this "initial set" is larger than 1,000 and the SERPs only display 1,000 results, then of course some pages will drop completely off the SERPs. Still, although some pages may have done that with Update Florida, this is not evidence that Google is using this particular method, as these pages might have dropped for other reasons. That was what I was thinking about when I wrote that it might add more confusion than clarity at this moment.
So, you cannot easily prove that this is it. Then, can you disprove it? Apparently you just did so by stating that some page had good rankings regardless of a large number of backlinks from other pages having the same owner.
Still, this is not evidence either. The patent aims at giving site owners minimal influence over the votes cast (inbound links are votes) for their own pages; it explicitly limits the number of votes from the same set of closely related pages, a.k.a. "the neighborhood". So, we're back to defining "the neighborhood": is it the same site, same domain, same IP, same owner per whois, another owner in the same firm/family, part of the same network of sites, or the set of reciprocal links? The patent mentions IP and domain as examples, but it is very close to suggesting that imagination is the limit. It tells us nothing about which of these possibilities Google is capable of identifying properly at this moment. Same site would be relatively easy, but the others are of course harder.
There are examples of networks of sites that do well after Florida, so that could disprove the patent, but it could also indicate that it's there, only all these networks are just not identified properly yet. So, speculation gives no clear answer at this point.
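To make the "capped votes per neighborhood" idea a bit more concrete, here is a toy Python sketch that buckets inbound links by a neighborhood key (here simply the /24 of the linking IP, which is only one of the possibilities listed above) and caps how many votes each bucket may contribute. The cap, the key, and the data are all assumptions for illustration, not anything known about the actual implementation:

from collections import defaultdict

MAX_VOTES_PER_NEIGHBORHOOD = 1   # invented cap

inbound_links = [                 # (linking page, linking IP) -- made-up data
    ("siteA.example/page1", "192.0.2.10"),
    ("siteA.example/page2", "192.0.2.11"),
    ("siteA.example/page3", "192.0.2.12"),
    ("siteB.example/",      "198.51.100.7"),
]

def neighborhood(ip):
    """Crude neighborhood key: the /24 network of the linking IP."""
    return ".".join(ip.split(".")[:3])

votes = defaultdict(int)
counted = 0
for page, ip in inbound_links:
    key = neighborhood(ip)
    if votes[key] < MAX_VOTES_PER_NEIGHBORHOOD:
        votes[key] += 1
        counted += 1

print(len(inbound_links), "raw links ->", counted, "counted votes")
# With this cap, the three siteA links in the same /24 collapse to a single vote.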
>> linking to your competition might become a fashionable thing (or something like that).
In fact it was the reverse, i.e. links from your direct competition in the SERPs would benefit you. For firms to link to competitors, either you have to be a real authority on the subject as well as a competitor, or you'll have to pay them. So much for fashion.
The Hilltop [cs.toronto.edu] algorithm(*), otoh, appears to work the opposite way from LocalRank. Outbound links to your competition seem to be what you want in that model. It seems to explain some things better, but I think this model is just as hard to prove or disprove using the current SERPs as the other one.
Hubs and authorities. Popularly speaking, LocalRank seems to be about authorities while Hilltop is about hubs. Presently we see both hubs and authorities getting preferential treatment in the SERPs, and it's not making things easier that sometimes some pages might be both. (AFAIK, FWIW, IMHO)
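For those who haven't met hubs and authorities before, this is essentially Kleinberg's HITS idea: a good hub links to good authorities, and a good authority is linked to from good hubs. A minimal Python sketch of the mutual-reinforcement iteration follows; the tiny link graph is made up, the normalization is simplified to a plain sum, and neither this code nor Hilltop itself is claimed to be what Google runs:

# Minimal hubs-and-authorities iteration over a toy link graph.
links = {                      # page -> pages it links to (made-up graph)
    "hubpage": ["auth1", "auth2", "auth3"],
    "minihub": ["auth1", "auth2"],
    "auth1":   [],
    "auth2":   ["auth1"],
    "auth3":   [],
}

pages = list(links)
hub = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(20):            # a handful of iterations is plenty for this toy graph
    # Authority score: sum of hub scores of the pages linking to you.
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    norm = sum(auth.values()) or 1.0
    auth = {p: v / norm for p, v in auth.items()}
    # Hub score: sum of authority scores of the pages you link to.
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    norm = sum(hub.values()) or 1.0
    hub = {p: v / norm for p, v in hub.items()}

for p in pages:
    print(f"{p:8s} hub={hub[p]:.2f} auth={auth[p]:.2f}")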
So, are they using this one? That one? Both? Neither? A third one? It all ends in speculation too easily at this moment. Also, they could just switch to another model; I do feel quite convinced that both models would give good results for the searcher, although the consequences for webmasters would possibly be very different.
/claus