
What The Early Research is Showing – Florida Update 2003

an analysis and summary of current post-Florida-update best practices

         

ryanallis1

9:14 am on Dec 3, 2003 (gmt 0)



I would welcome any comments and discussion on the following article (all URLs and specific keywords have been removed) that analyzes the current state of the Google update and suggests certain steps to take for both webmasters and Google...

Thank you,
Ryan Allis

On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every two months or so, this 'Dance' seems more like a drunken Mexican salsa than its usual conservative fox-trot.

Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While dropping a few positions would be understandable, since November 15 the sites that previously held these rankings are nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding it necessary to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.

What the Early Research is Showing

From what early research shows, it seems that Google has put into place what the industry has quickly termed an 'Over Optimization Penalty' (OOP) that takes into account both incoming link text and on-site keyword frequency. If too many of the sites linking to yours use link text containing a word that is also repeated more than a certain number of times on your home page, that page is assessed the penalty and either demoted to oblivion or removed from the rankings entirely. In a sense, Google is penalizing sites for being optimized for the search engines--without any forewarning of a change in policy.
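The hypothesized trigger can be sketched in code. This is speculation only: the two thresholds below are invented for illustration, and nothing here is known to be Google's actual logic.

```python
def triggers_oop(anchor_texts, page_words, keyword,
                 max_anchor_matches=20, max_onpage_repeats=10):
    """Hypothesized OOP trigger: the keyword appears in 'too many'
    inbound anchor texts AND is repeated 'too often' on the page.
    Both thresholds are guesses made up for this sketch; Google's
    real values (if any exist) are unknown.
    """
    kw = keyword.lower()
    anchor_hits = sum(1 for text in anchor_texts if kw in text.lower())
    onpage_hits = sum(1 for word in page_words if word.lower() == kw)
    return anchor_hits > max_anchor_matches and onpage_hits > max_onpage_repeats

# 30 inbound links all using the keyword, repeated 15 times on-page
anchors = ["cheap widgets store"] * 30
page = ["widgets"] * 15 + ["filler"] * 85
print(triggers_oop(anchors, page, "widgets"))  # True
```

Note that the condition is a conjunction: under this theory, either heavily-optimized anchor text or heavy on-page repetition alone would not trip the filter.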

Here is what else we know:

- The OOP is keyword specific, not site specific. Google has selected only certain keywords to which the OOP is applied.

- Certain highly competitive keywords have lost many of their previous listings.

How to Know if Your Site Has Been Penalized

There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you've likely been hit. Here are ways to be sure:

1. Go to google.com. Type in any search term you recall being well-ranked for (check your site logs to see which terms sent you search engine traffic). If your site is nowhere to be found, it has likely been penalized.

2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This appears to disable the OOP, so you can see what your results would be without it.

3. Or, simply go to www.**** to have this automated for you. Just type in the search term and quickly see what the results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5,000 most visited web sites on the Internet in a matter of days.
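The gibberish-exclusion check from step 2 can be scripted for repeated use. A minimal sketch that just builds the two query URLs for side-by-side comparison in a browser; the exclusion word is arbitrary, and the URL format assumes Google's standard public search endpoint.

```python
from urllib.parse import urlencode

def oop_check_urls(term, gibberish="dkjsahfdsaf"):
    """Build the normal and 'filter-bypass' Google query URLs for a term.

    Excluding a nonsense word (-gibberish) is reported in this thread to
    disable the suspected OOP filter, so comparing the two result pages
    shows whether a site has been filtered out for that term.
    """
    base = "http://www.google.com/search?"
    normal = base + urlencode({"q": term})
    bypass = base + urlencode({"q": f"{term} -{gibberish}"})
    return normal, bypass

normal, bypass = oop_check_urls("florida web designer")
print(normal)
print(bypass)
```

If your site appears in the second result set but not the first, that is (per this thread's hypothesis) the signature of the filter.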

The Basics of SEO Redefined. Should One De-Optimize?

Search engine optimization consultants such as myself have known for years that the basics of SEO are:

- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links

Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content whose owners have taken the time to properly build links.

So if you have been affected, what can you do? Should you de-optimize your site, or wait it out? Should you create one site for Google and one for the 'normal' engines? Is this a case of a filter turned on too tight that Google will fix in a matter of days, or something much more?

These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:

1. Contact a few of your link partners via email. Ask them to change their link text so that the keyword you have been penalized for is not in the link text, or so that the keyphrase appears in a different word order than the one you are penalized for.

2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep the count under 5 occurrences for every 100 words on the page.

3. If you are targeting a keyphrase (a multiple-word keyword), reduce the number of times your page uses the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer," change this text on your site to "web site designer in Florida" and "Florida-based web site design services."

It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
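The density ceiling in step 2 is easy to check with a short script. A minimal sketch: the 5-per-100-words threshold comes from the article; the tokenizing regex and everything else are illustrative.

```python
import re

def keyword_density(text, keyword):
    """Return occurrences of `keyword` per 100 words of `text`.

    The article's rule of thumb: keep this under 5 per 100 words.
    Matching is case-insensitive and on whole words only.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = "widgets " * 6 + "filler " * 94   # 6 'widgets' in 100 words
print(keyword_density(page, "widgets"))  # 6.0 -- over the suggested limit
```

Running this against your penalized page's visible text gives a quick read on whether you are above the suggested ceiling before you start editing copy.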

Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing from its results the many low-quality web sites that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results, however, point to another potential answer.

A second theory, which has gained credence within the industry in recent days, is that in preparation for its Initial Public Offering (possibly this spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines from major commercial search terms, thereby increasing the use of its AdWords paid search (cost-per-click) system. Is this the case? Maybe, maybe not.

Perhaps both of these reasons came into play. Perhaps Google execs thought they could

1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues, and
4) because of better results and more revenue, have a better chance at a successful IPO.

Sadly for Google, this plan had a fatal flaw.

What Google Should Do

While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:

1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;

2. Reduce the weight of OOP;

3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and

4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.

When this recent update broke on November 15, webmasters flocked in the thousands to industry forums such as webmasterworld.com. The mis-update was quickly titled the "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, so everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.

If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, AlltheWeb/FAST, and AltaVista, Yahoo will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.

Chndru

8:30 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Good point, rcjordan. I am still of the opinion that G looks at the web as a collection of pages rather than sites (except in very few cases). So Hissingsid's approach could very well be applied to web pages.

jimbeetle

8:33 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



filtered and unfiltered within the same (large) site

Gut feeling, but I'd go with RC on this. It would be the page popping up in the before-filtered SERPs that would trigger the filter.

Hissingsid

7:52 am on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Good site + any term = apply normal algo

Scratch that one. I have filtered and unfiltered within the same (large) site.

Inadvertently I've used the word site here. I agree with others here that page is correct. I may have used it elsewhere in my little essay but was not trying to make a distinction between pages and sites or suggest that this method would apply to sites rather than pages. If I am anywhere near right in this it would make sense that the BadRank flag would be stored by page. It would not fit the "democratic" theory of Google if it were transferred to the whole site because some pages within that site were flagged. Also it would take another basically unnecessary computation to work out which pages constituted a site and which level of BadRank to apply to the other pages. They would pick up some BadRank anyway by simple transference in a similar way to how PageRank is picked up.

I would say that it would be probable that some pages within a site would not pick up enough BadRank to be flagged even if some were flagged. This would encourage Google engineers by suggesting that the system was applied in a fair way.

If I change the suggestion of how the flag works to this

Bad page + filtered term searched for = apply null algo
Bad page + non-filtered term = apply normal algo
Good page + any term = apply normal algo

Now the assumptions are not broken and the hypothesis is not disproved by "I have filtered and unfiltered within the same (large) site."
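The revised rules above can be written out as a small decision function. This is purely a sketch of the hypothesis in this thread, not Google's actual implementation; the per-page "BadRank" flag and the filtered-term list are assumptions.

```python
def rank_page(page_is_flagged, term, filtered_terms, normal_algo, null_algo):
    """Sid's hypothesized per-page filter logic:

    Bad page  + filtered term searched for -> apply null algo (page vanishes)
    Bad page  + non-filtered term          -> apply normal algo
    Good page + any term                   -> apply normal algo
    """
    if page_is_flagged and term in filtered_terms:
        return null_algo(term)
    return normal_algo(term)

# Toy scoring functions standing in for Google's real algorithms.
normal = lambda term: 1.0   # page ranks normally
null = lambda term: 0.0     # page is dropped from the SERPs

filtered = {"florida web designer"}
print(rank_page(True, "florida web designer", filtered, normal, null))   # 0.0
print(rank_page(True, "obscure hobby widgets", filtered, normal, null))  # 1.0
print(rank_page(False, "florida web designer", filtered, normal, null))  # 1.0
```

Because both the flag and the term list are checked per page and per query, this version is consistent with seeing filtered and unfiltered pages within the same large site.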

Sid

kelleybelly

8:30 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



My site too disappeared last night from the Google index, but only my home page. All of my other 2,500+ pages are still indexed. My home page is gone. I have no idea why. It still shows a PR5, but when you search for it in Google it says no information about this site exists, yet there are still 2,500 pages from my website there - everything except the index page. Does anybody know why or how this can happen?

bekyed

10:33 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Send me a sticky with the URL if you like, kelley.

Bek.

mcavic

3:07 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't know why, but if it's a PR5, I bet it'll be back in a few days.

dazzlindonna

3:10 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



When searching for your domain, have you tried both yourdomain.com and www.yourdomain.com? Many have noticed that one shows the no-info message and the other shows the listing.

kelleybelly

4:10 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Hi. Yes, I tried that, and I am nowhere to be found. It's the strangest thing.

benc007

10:05 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



I have a client with the same problem. Thousands of pages indexed, but the home page and a couple of others do not appear in the SERPs.

In fact, they are nowhere in the SERPs for their keyword terms. AND they have PR6 and 500+ backlinks!

ILLstyle

10:33 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



This type of thing happened to one of my sites, but they have started to come back.

jim_w

10:42 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As near as I can tell (and there will be a bunch here who disagree with me because they have G's name printed on their underwear; of course I shouldn't talk, I have M$ printed on mine), G is AFU (all fouled up). They are trying to get pages from my site that haven't been there for 6 months. I redirected those pages with 301s, and literally every other bot has gotten the idea, except G.

Gbot has crawled pages that I don't have links to but do have AdSense ads on, so either the G toolbar, which I am getting ready to uninstall, has called home with the URL, or the MediaPartners bot has put the URL into the G database.

If it is the latter, then we have one bot using robots.txt like it should and one bot using it backwards. That’s just an accident waiting to happen. I have never seen such bull stuff from an alleged professional company. Poor, poor design.

Like all databases, put garbage in, get garbage out.

nmjudy

11:32 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



You would think that Joe Public, with a minimum of search skills, is scratching his head over the whereabouts of so many lost sites.

Just stumbled across another story about Google November changes and IPO speculation:

[nypost.com...]

nakulgoyal

3:54 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That was nice. Thanks for the link.

martinibuster

4:01 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Yeah, old news... that's from last week (dateline Nov. 27).

People with a subscription to the back room have discussed it and moved on.

:) Y

Superficial story... SEO dude quoted in the article was useless because he couldn't stop posturing long enough to say anything of substance about what was going on...

The BBC article from this morning (or yesterday?) has more substance and is better written.

Hey, deanril, it doesn't get more mainstream than a TV News piece on the BBC and a companion article online.

1milehgh80210

4:08 am on Dec 5, 2003 (gmt 0)

10+ Year Member



Maybe Google will spin this to their advantage, PR-wise (public relations).
I thought the title of the article was interesting:
G is just trying to get rid of the 'freeloaders.'

What's the old saying?
Gas, grass or _ss- nobody rides for free!

HughMungus

5:35 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Anybody else notice the contextual ads that came with this article? Irony.

sem4u

5:41 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Anybody else notice the contextual ads that came with this article?

100,000 potential customers for $139.99 Hmmmmm.....

Very ironic!

Chndru

5:42 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>scratching their heads over the whereabouts of so many lost sites.

To be honest, there are so many out there that the lost sites won't be missed at all. What's so unique about sites that sell similar products?

Kirby

6:00 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>To be honest, there are so many out there that the lost sites won't be missed at all. What's so unique about sites that sell similar products?

Interesting point and probably Google's attitude as well. It also justifies the authority/hub sites that have replaced them. Great cover to hide behind while selling Adwords to the lost.

superscript

3:26 pm on Nov 27, 2003 (gmt 0)



Has any one noticed strange regional effects in their logs (apart from apparently irrelevant search queries?)

My logs now contain a disproportionate number of referrals from continental Europe, and my actual sales outside the UK, for what was previously a high-ranking UK site (but sadly no longer), have gone up 50-fold since the Florida update.

The initial hypothesis is that although commercial results are less relevant, they are now being dished up in their charming and random way, all over the place!

dmorison

7:13 am on Nov 28, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Randomised results, particularly for commercial searches, have been predicted for months now.

There is no way Google should be serving up the same page in the #1 position time and time again when there are potentially thousands of other online businesses equally relevant to the query.

Given the "power" that Google has over where people go on the net, I think it is a perfectly responsible move.

Sure, the person that used to be a rock solid #1 for "expensive blue widgets online" will likely be a bit miffed, but at least it will save Google from numerous class action law suits from the owners of sites #2 - #999.

1milehgh80210

7:47 am on Nov 28, 2003 (gmt 0)

10+ Year Member



"Randomised results". Isn't that how G got to be the search leader? :)

Dave_Hawley

7:54 am on Nov 28, 2003 (gmt 0)



Now that Google has the continuous updates the SERPS will rarely stay the same for long.

Dave

MarkWolk

7:59 am on Nov 28, 2003 (gmt 0)

10+ Year Member



"Randomized results" - great! One of my friends owns a restaurant; I will suggest he serve randomized dinners: you'll never know the exact composition or the order in which you'll receive your courses, and they will be different day after day. Surely consumers will enjoy it. Lawyers even more.

Dave_Hawley

8:08 am on Nov 28, 2003 (gmt 0)



Comparing Google to a restaurant? Try as I might I cannot see the similarities. I'm sure someone will point them out though :o)

Dave

1milehgh80210

8:15 am on Nov 28, 2003 (gmt 0)

10+ Year Member



LOL, the results on the left side of the page may spin like a roulette wheel, but the ones on the right stay remarkably stable!

redzone

8:18 am on Nov 28, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



G wasn't the first to randomize results (If in fact, that's what they are doing)...

AV was doing it back in '99, long before Google made it into Webster's.... :)

merlin30

8:58 am on Nov 28, 2003 (gmt 0)

10+ Year Member



The idea that Google is randomizing results is totally absurd!

If that were the case, then each data centre would show a different set of results for any given query. Each server will run its own instance of the query algorithm, and if that included a strong random number generator, then each instance would almost certainly generate a different key with which to randomize the results.

curlykarl

9:17 am on Nov 28, 2003 (gmt 0)

10+ Year Member



The idea that Google is randomizing results is totally absurd!

I don't know about that; have you checked through your logs?

I have been checking referrals. I follow back the phrase and path used to find me in Google; sometimes my page is there, other times it isn't.

I'd say that's fairly random.

Karl :)

yetanotheruser

9:19 am on Nov 28, 2003 (gmt 0)

10+ Year Member



Randomised results, particularly for commercial searches have been predicted for months now.

Really? I don't see the results changing that much - they seem pretty stable, just odd!

superscript,
Sorry - haven't answered your post (blush), but very interesting - one of our sites gets quite high foreign traffic for foreign searches, but I'll have a look and see if it's shifted and get back... the only shift I've noticed is down!

This 526-message thread spans 18 pages.