What The Early Research is Showing – Florida Update 2003

an analysis and aggregate of the current post-Florida update best practices

         

ryanallis1

9:14 am on Dec 3, 2003 (gmt 0)



I would welcome any comments and discussion on the following article (all URLs and specific keywords have been removed) that analyzes the current state of the Google update and suggests certain steps to take for both webmasters and Google...

Thank you,
Ryan Allis

On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every two months or so, this 'Dance' seems more like a drunken Mexican salsa than its usual conservative fox-trot.

Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While one could understand dropping a few positions, since November 15 the sites that previously held top rankings are nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding a need to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.

What the Early Research is Showing

From what early research shows, it seems that Google has put into place what has quickly been termed in the industry an 'Over Optimization Penalty' (OOP) that takes into account incoming link text and on-site keyword frequency. If too many of the sites linking to yours use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense, Google is penalizing sites for being optimized for the search engines, without any forewarning of a change in policy.
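The hypothesized trigger can be sketched as a toy model. This is purely illustrative: the thresholds and the exact combination of signals are invented, since all that is claimed above is that some mix of on-page keyword frequency and matching inbound anchor text trips the filter.

```python
def triggers_oop(keyword, page_words, anchor_texts,
                 max_per_100=5.0, max_anchor_share=0.8):
    """Toy model of the hypothesized Over Optimization Penalty:
    flag a page when the keyword is both repeated heavily on the
    page AND dominates inbound link text. Thresholds are guesses."""
    kw = keyword.lower()
    words = [w.lower() for w in page_words]
    # On-page frequency, expressed per 100 words.
    density = 100.0 * words.count(kw) / max(len(words), 1)
    # Fraction of inbound links whose anchor text contains the keyword.
    share = sum(kw in a.lower() for a in anchor_texts) / max(len(anchor_texts), 1)
    return density > max_per_100 and share > max_anchor_share

page = ("widgets " * 8 + "filler " * 92).split()   # 8% keyword density
links = ["cheap widgets"] * 9 + ["click here"]     # 90% keyword anchors
print(triggers_oop("widgets", page, links))        # → True
```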

Here is what else we know:

- The OOP is keyword specific, not site specific. Google has selected only certain keywords to apply the OOP to.

- Certain highly competitive keywords have lost many of their previous listings.

How to Know if Your Site Has Been Penalized

There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you've likely been hit. Here are ways to be sure:

1. Go to google.com. Type in any search term you recall being well-ranked for. Check your site logs to see which terms brought you search engine traffic. If your site is nowhere to be found, it's likely been penalized.

2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This appears to disable the OOP, showing roughly what your results would be without it.

3. Or, simply go to www.**** to have this automated for you. Just type in the search term and quickly see what the results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained popularity, becoming one of the 5,000 most-visited web sites on the Internet in a matter of days.
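Methods 1 and 2 above amount to comparing the results of two query URLs. A minimal sketch of building them; the gibberish token is arbitrary and the URL format is the plain Google search endpoint:

```python
from urllib.parse import quote_plus

def diagnostic_urls(term, gibberish="dkjsahfdsaf"):
    """Build a normal Google query URL plus the 'filter-bypass'
    variant from method 2: the same term with a nonsense negative
    term appended, which is believed to disable the OOP filter."""
    base = "http://www.google.com/search?q="
    normal = base + quote_plus(term)
    bypass = base + quote_plus(f"{term} -{gibberish}")
    return normal, bypass

normal, bypass = diagnostic_urls("green widgets")
print(normal)   # query as a searcher would type it
print(bypass)   # same query with the filter presumed off
```

Comparing where your site ranks in the two result sets is the whole diagnostic: a large gap suggests the penalty is in effect.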

The Basics of SEO Redefined. Should One De-Optimize?

Search engine optimization consultants such as myself have known for years that the basics of SEO are:

- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links

Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to build links properly.

So if you have been affected, what can you do? Should you de-optimize your site, or wait it out? Should you create one site for Google and one for the 'normal engines'? Is this a case of a filter being turned on too tight that Google will fix in a matter of days, or something much more?

These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:

1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text, or so that the keyphrase appears in a different order than the one you are penalized for.

2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep it under 5 occurrences for every 100 words on the page.

3. If you are targeting a keyphrase (a multiple-word keyword), reduce the number of times your page uses the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer", change this text on your site to "web site designer in Florida" and "Florida-based web site design services."

It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
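Step 2's frequency rule is easy to check mechanically. A rough sketch, assuming simple word tokenization and a single-word keyword (punctuation and markup handling are glossed over):

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` per 100 words of `text`
    (case-insensitive; single-word keywords only)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page_text = "widgets " * 6 + "filler " * 94   # 6 hits in 100 words
print(round(keyword_density(page_text, "widgets"), 1))  # → 6.0
```

By step 2's rule of thumb, a result above 5.0 would be over the line and worth reducing.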

Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing from its results many of the low-quality web sites that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results, however, point to another potential answer.

A second theory, which has gained credence within the industry in recent days, is that in preparation for its Initial Public Offering (possibly this spring), Google has developed a way to increase its revenue. How? By removing, on major commercial search terms, many of the sites that are optimized for the search engines, thereby increasing use of its AdWords paid search (cost-per-click) system. Is this the case? Maybe, maybe not.

Perhaps both of these reasons came into play. Perhaps Google execs thought they could

1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues and
4) because of better results and more revenue have a better chance at a successful IPO.

Sadly for Google, this plan had a fatal flaw.

What Google Should Do

While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:

1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;

2. Reduce the weight of OOP;

3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and

4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.

When this update broke on November 15, webmasters flocked in the thousands to industry forums such as webmasterworld.com. The botched update was quickly dubbed the "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, so everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.

If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, AlltheWeb/FAST, and AltaVista, it will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.

Powdork

9:52 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's an excellent post, Claus, and it obviously took some time. I would disagree with "A better (or more efficient) handling of duplicate issues". I think the opposite is true, and combined with "Upping of inurl" it is producing some dramatic results.

And the filter/feature is being handled by a 900lb gorilla, when it should be handled by a surgeon.

claus

10:32 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Off-topic, but posted because a sticky just made me aware that a post i made in this thread had offended some people:

>>doing Google's job

I'm sorry about that... i didn't intend to sound arrogant. It was quite literal, meaning that Google should really be expected to explain such things for us webmasters so that we didn't have to go through all that effort to find out for ourselves.

I don't think Google is very good at communicating and explaining what they're doing - perhaps it's for fear of being exploited, and that's understandable, but with changes as large as these last ones they should really be posting press releases and/or rewriting their help section. Google should plainly inform me so that i didn't have to use my precious time trying to understand their business instead of my own.

For the SEO part of my business (which is only a part, believe it or not), i don't mind their secrecy. I don't want them to disclose information about what makes one site rank higher than another (sidenote: that would be the end of SEO), but i do want full information about how their engine works for me as a searcher, so that i don't think i'm doing one thing when in fact i'm doing another. Also, as a webmaster, i appreciate their "do and don't" guidelines, but i'd like to see them enforced as well as published.

So, to sum it up - i'm really sorry if i sounded arrogant, that was not my intention. I only intended to state that i was not very satisfied with the way Google had done their own job in explaining things.

/claus

Please Be Gentle

12:30 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Just a quick note. "Working Lunch" had an update on yesterday's Google segment and said they had received a lot of complaints from people saying they were too harsh on Google. They received a few negative comments from small businesses (a handbag business and a wedding business) that had experienced problems. However, the main feedback was either tips from surfers (e.g. use co.uk in search, use AdWords) or compliments from users saying they thought Google was great. This story may run and run...
Kindest Regards
PBG

superscript

2:07 pm on Dec 5, 2003 (gmt 0)



.

[edited by: superscript at 3:35 pm (utc) on Dec. 5, 2003]

Chndru

2:17 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Google should plainly inform me so that i didn't have to use my precious time on trying to understand their business in stead of my own.

As far as the algo goes, G's algo is one of the most exposed ones, with numerous papers and patents etc. And, simply, how they tinker with their algo is no one's business but theirs.

Miop

2:20 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



I like exactseek's approach: you can get an exact location in the serps, add useful descriptions, and even request your page to be indexed within 7 days (and they actually do it!)
If all search engines were like that, there would be no need to spam or spend hours mulling over seo. I guess that if exactseek ever became as big as google, one might not expect to see such an open and flexible approach.

MetropolisRobot

2:42 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Interesting point made by Claus on the possibly reduced emphasis on freshbot/deepfreshbot entries. My site is partially dynamic and partially static (generated dynamically on a rota). I have always done well with freshbot because the site is well "refreshed" on a daily basis. However, I have been concerned that Google and others might see this as a sign of over-optimizing, when in fact it is really just to change the featured widgets and the showcase widgets.

Making me think.

claus

2:50 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Chndru, i did state that i did not mind secrecy as far as SEO was concerned. A secret ranking algo and/or changes to it are no problem. Wearing the SEO hat, i only welcome that. I might not even have been speaking for myself there, as i think i personally have at least some understanding of these things now - yet a lot of others haven't... and of course, i may be entirely wrong myself.

Miop

3:23 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



One other thing I have noticed... I am in the UK and my traffic from UK users has increased. I don't know if this was an intended effect or has nothing to do with it.
My UK traffic is only 15%... I wish it were more but don't know what I can do about it.

superscript

3:32 pm on Dec 5, 2003 (gmt 0)



Miop,

I'm in the UK too. I am now getting so many hits on sub-pages and with longer search strings that my sales look much the same. They may even get better than before, because my competitors' index pages, like mine, are off the radar on a simple search - and my site has more content.

It still riles me though, that a simple, generic KW1 KW2 throws up so much junk, when what was listed before was really pretty good.

I've spotted a new type of spam though - which may have helped some sites through Florida. I think I can see what is behind the technique, so I'm testing it out in a legitimate and open way.

anime_otaku

4:20 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



What I notice is google gives additional weight to terms used not exactly but separately (if your target phrase is "green widgets", have links/text on the page with anything containing green OR widgets, but not just "green widgets". Worst of all is using exactly the link/text "green widgets".)

skirope

4:47 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



---------
only way to tell if you are a victim of florida or if you are really penalised is doing a google search with syntax
site:example.com keywords
replace 'keywords' with the terms you expect your 'penalised' pages to show up on. replace example.com with your domain. if you find them: OOP penalty. if not found at all, you're in trouble. ;-(

--------

But if I were penalized, wouldn't my homepage be taken out too? What do you think I did wrong on my inner pages? I am squeaky clean - no hidden text, nothing shady, just parsed the site. Could it be because of a missing 301 redirect?
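The `site:` diagnostic quoted above boils down to a single query URL. A minimal sketch; the domain and terms are placeholders:

```python
from urllib.parse import quote_plus

def site_check_url(domain, terms):
    """Build the `site:` diagnostic query: if pages appear under
    `site:domain terms` but not in the open results, the theory in
    this thread is that the OOP filter (not a full ban) is at work."""
    query = f"site:{domain} {terms}"
    return "http://www.google.com/search?q=" + quote_plus(query)

print(site_check_url("example.com", "green widgets"))
```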

skirope

4:53 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Can someone summarize the Florida Update in one paragraph? I've read dozens of pages but am even more confused. Please address:

1.) What Google is apparently penalizing for now.
2.) What has the most SEO weight now.

Thx

Chndru

4:56 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>someone summarize in one paragraph the Florida Update

This is what Peter Norvig, Google's Director of Quality, has said. Try:
[webmasterworld.com...] (msg #164 by James_Dale), which has a link to it.

jrokesmith

4:59 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Claus, excellent post. You might want to post that in a new thread so it doesn't get buried. The data I am seeing for many terms and sites I use as benchmarks seems to fit the points you mentioned. I haven't really looked at duplicates, though.

Kirby

5:06 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Excellent post, claus. Now, if you had been displaced by authorities/hubs (and I agree about the news sources as well), what would you do? Is it necessary to now appear to be one of these in the segments most displaced by these types of sites?

While I certainly intend to add more content, the fact of the matter is the primary thing my visitors want to see is housing inventory. That's where 98% of the leads are generated. Adding pages and pages of content on all aspects of housing is doable if that is what it takes to satisfy Google, but what happened to "build it for your user"?

killipso

5:17 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



GG, are you still taking spam reports? If so, drop me an e-mail. I'm not sure if it was you or my buddy at Google who took care of the issue I had with a spammer, but either way he is gone. So if it was you, thank you.
Now I have some guy who is spamming the worst I have ever seen on Google. It's so bad it's funny: Google did an update and he is #1 everywhere.
Here's the gist of it:
33 domains, all with hidden CSS linking keywords and text, all number 1.
All interlinked and spammed like crazy.
If anyone would like a list of these domains, e-mail me. You won't believe it. You have to open the domains up in FrontPage or whatever for the spam to appear, although most of the pages are covered with spam already.
Dan

superscript

5:19 pm on Dec 5, 2003 (gmt 0)



What's the current thinking

[i.e. in the last 5 minutes ;) ]

on inbound link "penalties"?

('penalties' in quotes because I don't believe they exist -just relative downgrading of importance)

The reason I ask is that many large sites are auto-generated to some extent - might a highly targeted inbound link coming from all 700 pages of a *single* site be considered a "penalty"?

Kirby

5:26 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>The reason I ask is that many mega sites are auto-generated to some extent - might a highly targeted inbound link coming from all 700 pages of a *single* site be considered a "penalty"?

Doubtful. I know of several sites (not mine) that link from every one of their pages to each other's home pages. Very targeted anchor text as well. They have all survived being displaced by the authority/hubs while their pre-Florida top-10 competitors are buried.

newwebster

5:28 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



"What I notice is google gives additional weight to terms used not exactly but separately (if your target phrase is "green widgets", have links/text on the page with anything containing green OR widgets, but not just "green widgets". Worst of all is using exactly the link/text "green widgets".)"

Hmmm, very good observation. I target a 3-keyword phrase, with the first 2 words as the most competitive phrase, and I have noticed a ranking boost for the last 2 keywords as separate terms. Both of those terms are competitive on their own, but I am targeting the first 2 as a phrase. All three terms are in my anchor text in what used to be the correct sequence for ranking. I have been dropped like everyone else on the 3-word phrase and the first-2-word phrase. An experiment in separating all of the terms as separate anchor text may work. I have also noticed a mixture of the first 2 keywords in the current serps (sites seem to contain info on one or the other, but rarely both together), as if Google is ranking both, which would support the broad matching. I just do not know for sure right now. I am going to be patient, wait until after the next deep crawl, and continue with observations before I make any decisions.

Miop

5:44 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



The number one site in my competitive field appears to have only 1 line of spiderable text, and that is the title. In the title, the keywords each appear once. That's it.

c1bernaught

6:40 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



rfgdxm1:

Interesting reply....

Do you suppose that the stories now hitting the press have simply materialized from thin air? They must simply be coming from disgruntled SEO professionals only. I mean THAT would be a BIG story... right?

Do you suppose that all of the small businesses that are affected have not voiced their opinion? Is it possible that the full effects of this have not been realized by joe user yet? Any possibility that joe wouldn't immediately turn to the mainstream press when they see poor serps? I think it's possible that many "joe users" are seeing poor serps and will respond by searching elsewhere... just my opinion of course.... I am allowed that... right?

Oh, I want to add that it seems that you are severely outnumbered in your thinking that Google has no problems....

But, of course, it's MY reality that is skewed.... right?

Back on topic...

I'm seeing a continuing slide in the serps for large cities. Checking yesterday, several large cities had what I have come to expect as "relevant" serps. Today the serps for those cities have changed. Apparently the cities are being hit by size, not alphabetically as was suggested earlier. It seems that the filter is rolling across the largest cities first, though only for certain KWs. The sites that were there yesterday have been replaced by directories, educational pages that mention the topic, sites that are mostly repeated KWs, and a few big players...

Very odd.... I wonder how far this will go? Anyone else seeing this?

Kirby

7:00 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes. US cities from Boston to Ventura have had their serps overtaken by directory-type sites, newspapers, etc., and now I'm seeing more .edu sites with just simple references to the kw.

Forget well written, clear and concise content. What works now for these searches is a free-for-all cornucopia of info and outbound links, all with no particular focus.

Trawler

7:02 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



c1bernaught>
Back on topic...

I'm seeing a continuing slide in the serps for large cities.

Very odd.... I wonder how far this will go? Anyone else seeing this?

____

I can confirm that. I watch the travel-related serps about 12 hrs a day. The big cities were hit first, and now the smaller ones are being taken out one by one.

The only pattern I can see is that the size of the city seems to matter.

Chalupee

7:06 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Some of you must have tried this!

Type in: search engine
at g=oogle....

and see what you get!

Cheers.

Chndru

7:07 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>size of city seems to matter

If only G had the power to see such things out of a simple webpage. Nope, you are vastly over-estimating things :)

c1bernaught

7:30 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Chndru:

Is it possible that Google knows what the biggest cities are and is rolling the filters through them?

Has nothing to do with finding them out from web pages....

Kirby

7:38 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Big cities may also relate to higher cost of adwords. It would be easy to use adwords as the filter.

I'm not advocating an Adwords conspiracy, just suggesting a simple way for Google to prioritize.

vbjaeger

7:48 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



hehehe... I actually searched for search engine, and the #1 site was AltaVista. Google was #5. Maybe Google is ranking on the quality of the serps?

quotations

8:04 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



yes, but try Search Engines -oewirysldskfjldajhf