Thank you,
Ryan Allis
On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems more like a drunken Mexican salsa than its usual conservative fox-trot.
Most likely, you will already know if your web site has been affected: you may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While one could understand dropping down a few positions, since November 15 the sites that previously held these rankings are nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding themselves forced to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.
What the Early Research is Showing
From what early research shows, it seems that Google has put into place what has been quickly termed in the industry as an 'Over Optimization Penalty' (OOP) that takes into account the incoming link text and the on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense Google is penalizing sites for being optimized for the search engines--without any forewarning of a change in policy.
Here is what else we know:
- The OOP is keyword specific, not site specific. Google has selected only certain keywords to which the OOP applies.
- Certain highly competitive keywords have lost many of the listings.
How to Know if Your Site Has Been Penalized
There are a few ways to tell whether your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you've likely been hit. Here are ways to be sure:
1. Go to google.com and type in any search term you recall being well-ranked for (check your site logs to see which terms brought you search engine traffic). If your site is nowhere to be found, it has likely been penalized.
2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). The nonsense exclusion term appears to bypass the OOP, so you should see what your results would be without the penalty.
3. Or, simply go to www.**** to have this automated for you. Just type in the search term and see quickly what the search engine results would be if the OOP was not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5000 most visited web sites on the Internet in a matter of days.
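If you capture the two result lists from steps 1 and 2 by hand, a small script can diff them for you. This is a hypothetical sketch (the site names are invented, and it assumes you have already copied the top results from each search into a list):

```python
def penalized_sites(normal_results, unfiltered_results, top_n=100):
    """Compare two ranked lists of sites for the same query: one from a
    normal search, and one with a nonsense exclusion term appended
    (which, per the theory above, bypasses the OOP). Returns the sites
    that rank in the unfiltered results but are missing from the
    normal ones -- the likely penalty victims."""
    normal = set(normal_results[:top_n])
    return [site for site in unfiltered_results[:top_n] if site not in normal]

# Hypothetical data: example-widgets.com tops the unfiltered results
# but has vanished from the normal SERP.
normal = ["bigstore.com", "somedirectory.com", "news-site.com"]
unfiltered = ["example-widgets.com", "bigstore.com", "somedirectory.com"]
print(penalized_sites(normal, unfiltered))  # ['example-widgets.com']
```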
The Basics of SEO Redefined. Should One De-Optimize?
Search engine optimization consultants such as myself have known for years that the basics of SEO are:
- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links
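Several of the on-page items above can be checked mechanically. Here is a minimal sketch, using only the Python standard library, that checks a page for a single-word keyword in the title, in an H1, and counts its overall frequency (the sample page and keyword are invented; a real audit would need far more careful parsing):

```python
import re
from html.parser import HTMLParser

class SEOBasicsChecker(HTMLParser):
    """Rough check of some of the on-page basics listed above: is the
    keyword in the <title>? In an <h1>? How often does it appear?"""

    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self._stack = []          # currently open tags, innermost last
        self.in_title = False
        self.in_h1 = False
        self.total_count = 0      # keyword occurrences anywhere on the page

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        words = re.findall(r"[a-z0-9]+", data.lower())
        hits = words.count(self.keyword)
        self.total_count += hits
        if hits:
            if "title" in self._stack:
                self.in_title = True
            if "h1" in self._stack:
                self.in_h1 = True

page = """<html><head><title>Acme Widgets</title></head>
<body><h1>Widgets for sale</h1><p>Quality widgets, cheap widgets.</p></body></html>"""
checker = SEOBasicsChecker("widgets")
checker.feed(page)
print(checker.in_title, checker.in_h1, checker.total_count)  # True True 4
```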
Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to build links properly.
So if you have been affected, what can you do? Should one de-optimize one's site, or wait it out? Should one create one site for Google and one for the 'normal engines'? Is this a case of a filter turned on too tight that Google will fix in a matter of days, or something much more?
These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:
1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text or the keyphrase is in a different order than the order you are penalized for.
2. Open up the page that has been penalized (usually your home page) and reduce the number of times that you have the keyword on your site. Keep the number under 5 times for every 100 words you have on your page.
3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."
It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
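The "under 5 times per 100 words" rule of thumb from step 2 is easy to measure. A minimal sketch (treating a keyphrase as an exact, in-order match, per step 3; the sample text is invented):

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` (a word or exact-order phrase) per 100
    words of `text` -- the rough measure the steps above suggest
    keeping under 5 per 100 words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    phrase = keyword.lower().split()
    n = len(phrase)
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits / total if total else 0.0

text = ("Florida web designer services. Our Florida web designer team "
        "builds sites. Hire a Florida web designer today.")
# 3 exact matches in 17 words -- well over the suggested threshold.
print(round(keyword_density(text, "Florida web designer"), 1))  # 17.6
```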
Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.
A second theory, which has gained credence within the industry in recent days, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.
Perhaps both of these reasons came into play. Perhaps Google execs thought they could
1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues, and
4) because of better results and more revenue, have a better chance at a successful IPO.
Sadly for Google, this plan had a fatal flaw.
What Google Should Do
While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:
1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;
2. Reduce the weight of OOP;
3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and
4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.
When this recent update broke on November 15, webmasters flocked in the thousands to industry forums such as webmasterworld.com. The mis-update was quickly titled "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, so everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.
If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With its recent acquisitions of Inktomi, Alltheweb/FAST, and Altavista, Yahoo most likely will soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.
The idea that Google is randomizing results is totally absurd!
If that were the case then each data centre would show a different set of results for any given query.
I don't agree with you. Perhaps G is using some pseudo-random key based on the date or something similar, so the results change from one day to another but not from one datacenter to another. And there are many other ways to pseudo-randomize.
I don't know if G is randomizing the SERPs, but I can assure that it is technically possible.
In some cases randomizing would be good for the web, but it is dangerous. If the SERPs become random, no more SEO will be needed, so nobody will improve and optimize their pages, or exchange links, etc. So if G abuses this technique (I'm not even sure they are using it at all), it will hurt the overall quality of the web.
Greetings,
Herenvardö
will bring you results from the UK, Australia and many non related results.
While I appreciate that it is technically possible to pseudo-randomize, the results aren't changing on a daily basis. They have been totally stable for days now. 2 datacentres do show a slight variance, but always the same variance.
Diversification yes - a need to fine tune the broad matching criteria - yes. But no randomization.
[edited by: merlin30 at 9:48 am (utc) on Nov. 28, 2003]
in their charming and random way
Apologies for self-quoting, but I was being ironic, and not suggesting the results are genuinely random - they just look it!
But...
I agree you couldn't randomise the data in the datacentres this rapidly - but surely you can randomise the effects of a filter quite easily - and there's now little dispute a new filter (or filters) is in place.
Certainly my position changes almost hourly on a UK search.
[edited by: superscript at 9:49 am (utc) on Nov. 28, 2003]
Dave
I know, Dave - the users have to learn to search properly, right? :)
Regarding regional SERPS, the Australian ones look stable to me but the smaller index exposes glaringly poor results, particularly for adult terms. Sticky me for examples if you like - they boggle the mind.
I certainly agree that no page has the right to be at #1, and as Google spiders the web faster and recalculates relationships faster, results will change accordingly - in fact, I argued that case in a previous thread. But that is quite separate from Google providing random results.
It matters not *how* it happens, only that it does happen.
Dave
Is anyone seeing examples of 'search in country only' searches giving unusual results post Florida?
More specifically - and this might clear it up - are there any forum members in Greece and Italy seeing more UK results? We hardly had a single sale to these countries before the Florida massacre, but now you would think we'd been running a TV campaign over there ;)
Where he disputes the commercial filter / SEO filter idea (which has never made much sense) and blames broad-matching instead. Well worth a read.
It's more about what other pages are saying, in general terms, about your page. PR may now be used in a more indirect way - it's not necessary for your site to have a high PR, but if a site with high PR links to and talks about your page in terms broadly relevant to the query, then you will have a chance of being on the radar. This would favour sites that have naturally garnered a rich set of back links indicating the many different qualities that your page has. SEO'd sites have managed to garner a set of backlinks that talk about their page in very narrow terms - a few keyphrases. That doesn't really tell Google much about your page in general. Of course, if your page doesn't really have very many useful qualities, it's not going to score well.
I do think that the algorithm will be fine tuned in the coming months, but I think the trend is clear.
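That back-link theory can be made concrete: profile how concentrated a page's incoming link text is, on the idea that a profile dominated by one exact phrase is the narrow, SEO'd pattern described above. A hypothetical sketch (all anchor texts are invented):

```python
from collections import Counter

def anchor_text_profile(backlink_anchors):
    """Summarize how varied a page's incoming link text is: how many
    distinct phrases appear, and what share of all links the single
    most common phrase accounts for."""
    counts = Counter(a.strip().lower() for a in backlink_anchors)
    total = sum(counts.values())
    top_phrase, top_count = counts.most_common(1)[0]
    return {
        "distinct_phrases": len(counts),
        "top_phrase": top_phrase,
        "top_share": round(top_count / total, 2),
    }

# Hypothetical SEO'd profile: one keyphrase dominates (40 of 42 links).
seo = ["florida web designer"] * 40 + ["click here"] * 2
print(anchor_text_profile(seo))
# {'distinct_phrases': 2, 'top_phrase': 'florida web designer', 'top_share': 0.95}

# Hypothetical natural profile: many different descriptions of the page.
natural = ["great widget resource", "acme's widget page", "widgets explained",
           "my favourite widget shop", "widget reviews", "acme widgets"]
print(anchor_text_profile(natural))
```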
Now all I've got to do is sneak a few references about farmyard animals onto my widget site to ensure a broad match! :)
Let's have a go:
'Here are our latest company premises, located deep in the countryside, surrounded by farms. Meat our new secretary, Helen Goat, and our new finance director, Alan Cow.'
'<a href="">Alan Cow's profile</a>' (note use of plural)
'Alan Cow joined widget and co from his previous job as a farmer with James Rooster (Pigs division) and co'
etc. etc.
Big smile!
Are the links pointing to your site talking about sheep?
That's an interesting one, but if I have control of the links, I could insert the word sheep I suppose.
'Also see our latest range of widgets by Sheep & Co'
Then,
'Dear Customer,
We are afraid we are currently out of stock of Sheep and co. products'
All hypothetical of course!
Or how about:
'This company gives 2% of its profits to the welfare of farmyard animals'
<edit>I'd just better point out that I am not this unethical, I'm merely pointing out that there are pitfalls to all algo changes!</edit>
BTW, I'm not being deliberately obtuse, just trying to investigate how the mechanics of broad matching and categorising concepts might pan out.
Conclusion 1: Google is losing some honest visitors.
Conclusion 2: Since people mostly find crap in the initial SERPs, they are showing interest in CPC ads, hoping to get at least related information, though 100% commercial.
Anybody else noticing this statistical change?
how important a sheep farming site would it be, compared to the other sheep farming sites!
Yes, it's all hypothetical - as you say, I might accidentally get myself classed as a site about sheep, and get millions of hits from people trying to buy mutton on my widget site!
What is amazing here in the UK, is that on my previous main keywords, my competitors and myself have all virtually disappeared. But none of us are appearing in Adwords either. In my case this is because I refuse to waste money on it - but where are all my previous high ranking competitors - why aren't they appearing in Adwords?
It looks like they've all been booted off by huge companies, several of which are in the USA - and it doesn't make any sense for a UK customer to buy from the States. The postage cost, long delivery time and import taxes mean that we in the UK go elsewhere when we see dollar prices on a shop page.
Crazy!
<edit: We have no problem with USA sites - your prices are generally cheaper than ours. But import taxes and crossing the pond make it nonsensical to have US shops showing up in UK adwords - unless it is something really specialist we can't obtain here. (I've bought some optical equipment from the USA in the past, and the odd sheep!) </end edit>
[edited by: superscript at 2:30 pm (utc) on Nov. 28, 2003]
Do you get found for more specific or secondary phrases?
Remember that this algo change is new, so users looking for commercial sites on Google might suddenly be taken aback by the broader range of topics now presented for their familiar "best/cheapest/latest widget" search terms. Over time - and perhaps this is what Google is calculating - users will be more specific about what they are looking for. In which case, for those sites which can be found on more specific terms, traffic would start to return.
Google may be training users to search more accurately!
It is the removal of this possibility to gradually refine a search that will damage Google.
Google may be training users to search more accurately!
One of the features that Google built its early reputation on was the "I'm feeling lucky" button... the whole point was you didn't have to enter a hyper-qualified search query... Google would 'magically' return something pretty close to what you needed even if you only typed in two words and hit the right-hand button.
Something AV couldn't come close to...
Is this the end of the Lucky Era?