Forum Moderators: open


What The Early Research is Showing – Florida Update 2003

an analysis and aggregate of the current post-Florida update best practices


ryanallis1

9:14 am on Dec 3, 2003 (gmt 0)



I would welcome any comments and discussion on the following article (all URLs and specific keywords have been removed) that analyzes the current state of the Google update and suggests certain steps to take for both webmasters and Google...

Thank you,
Ryan Allis

On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems to be more like a drunken salsa than its usual conservative fox-trot.

Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While one could understand dropping a few positions, since November 15 the sites that previously held these rankings have been nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding themselves forced to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.

What the Early Research is Showing

From what early research shows, it seems that Google has put into place what has quickly been termed in the industry an 'Over Optimization Penalty' (OOP) that takes into account both incoming link text and on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense, Google is penalizing sites for being optimized for the search engines, without any forewarning of a change in policy.

Here is what else we know:

- The OOP is keyword specific, not site specific. Google has selected only certain keywords to apply the OOP to.

- Certain highly competitive keywords have lost many of the listings.

How to Know if Your Site Has Been Penalized

There are a few ways to know if your site has been penalized. The first, mentioned earlier, is a significant drop in traffic around November 15; if you saw one, you have likely been hit. Here are ways to be sure:

1. Go to google.com and type in any search term you recall being well ranked for (check your site logs to see which terms sent you search engine traffic). If your site is nowhere to be found, it has likely been penalized.

2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any similar gibberish, without the quotes). This appears to switch off the OOP, so you can see roughly where your results would be without it.

3. Or, simply go to www.**** to have this automated for you. Just type in the search term and quickly see what the search engine results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5,000 most visited web sites on the Internet in a matter of days.
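For anyone checking many terms, the comparison in steps 1 and 2 can be scripted. A minimal sketch: the URL lists below are made-up placeholders (as is the domain `mysite.example`); in practice you would paste in the URLs you see on the two results pages by hand.

```python
# Hedged sketch: compare your site's rank in the normal SERP against the
# SERP for the same query plus a gibberish exclusion term (which, per the
# theory above, bypasses the OOP filter).

def rank_of(site, results):
    """Return the 1-based position of `site` in a results list, or None."""
    for pos, url in enumerate(results, start=1):
        if site in url:
            return pos
    return None

def oop_check(site, normal, filter_off):
    before = rank_of(site, filter_off)  # query plus "-dkjsahfdsaf"
    after = rank_of(site, normal)       # plain query
    if before is not None and after is None:
        return "likely penalized (rank %d without filter, absent with it)" % before
    return "no penalty signature (normal rank %s, filter-off rank %s)" % (after, before)

# Example with invented domains:
normal = ["http://directory.example/widgets", "http://news.example/widgets"]
filter_off = ["http://mysite.example/", "http://directory.example/widgets"]
print(oop_check("mysite.example", normal, filter_off))
```

If your site appears high in the filter-off list but nowhere in the normal one, that is the penalty signature this article describes.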

The Basics of SEO Redefined. Should One De-Optimize?

Search engine optimization consultants such as myself have known for years that the basics of SEO are:

- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links

Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to build links properly.

So if you have been affected, what can you do? Should you de-optimize your site, or wait it out? Should you create one site for Google and one for the 'normal' engines? Is this a case of a filter turned on too tight that Google will fix in a matter of days, or something much more?

These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:

1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text, or so that the keyphrase appears in a different order from the one you are penalized for.

2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep it under 5 occurrences for every 100 words on the page.

3. If you are targeting a keyphrase (a multiple-word keyword), reduce the number of times your page uses the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer", change this text on your site to "web site designer in Florida" and "Florida-based web site design services."

It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
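If you want to audit a page against these suggested limits, the counting is easy to script. A small sketch (note that the 5-per-100-words threshold and the exact-order counting are this article's guesses, not anything Google has published):

```python
import re

def density_per_100(text, keyword):
    """Occurrences of `keyword` per 100 words of `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / max(len(words), 1)

def exact_phrase_count(text, phrase):
    """Times the words of `phrase` appear in exactly that order."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    return sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)

# A deliberately over-optimized 100-word page:
page = "Florida web designer offering web site design services in Florida. " * 10
print(density_per_100(page, "florida"))                  # 20.0, well over the limit
print(exact_phrase_count(page, "florida web designer"))  # 10 exact-order hits
```

Run this against the penalized page before and after editing to confirm the counts actually dropped.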

Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.

A second theory, which has gained credence within the industry in recent days, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search (cost-per-click) system. Is this the case? Maybe, maybe not.

Perhaps both of these reasons came into play. Perhaps Google execs thought they could

1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues, and
4) because of better results and more revenue, have a better chance at a successful IPO.

Sadly for Google, this plan had a fatal flaw.

What Google Should Do

While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:

1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;

2. Reduce the weight of the OOP;

3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and

4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.

When this recent update broke on November 15, webmasters flocked in the thousands to industry forums such as webmasterworld.com. The mis-update was quickly titled "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, so everyone should stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.

If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, Alltheweb/FAST, and AltaVista, it will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.

guynouk

7:31 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



vbjaeger - thanks for the reply.
I probably didn't make myself clear. What I really wanted to know was the exact link anchor text (if any) of the sites that survived. Cross-linking and anchor text now seem to have similar thresholds: too much of the same anchor text, or too many links from the same 'C' class, and your site is disadvantaged. So if you could confirm the anchor text of the links to the surviving sites, I'd like to know.

vbjaeger

7:34 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



guynouk,

The anchor text is nothing more than the company, no keywords at all.

One interesting fact though, is that they are not listed on the so-called authority sites that have replaced the company listings.

I think it has to do more with the type of link you get.

[edited by: vbjaeger at 7:36 pm (utc) on Dec. 3, 2003]

coconutz

7:35 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>The reason it did good before, was the anchor text, the reason it is nowhere now, is the anchor text either got it banned or is discounted totally.

I don't think the site was banned for a specific term, but from looking at what was posted regarding on page factors it looks like the page is no longer considered relevant to the anchor text.

Do you think this page should rank well for this term?

guynouk

7:36 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



vbjaeger
One more question - I'm not sure if you can find this out but were the sites that were penalised reciprocal linking or were these links one way?

Newman

7:36 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



Change to Google ranking system irks merchants
By Lisa Baertlein

[forbes.com...]

A part of the article:

Wayne Rosing, vice president of engineering at Google, said the change is part of the Silicon Valley-based company's efforts to provide high-quality search results.
"This particular change affected more people, but our testing shows there was a significant quality improvement for our users," said Rosing.
Andrew Goodman, principal at search advertising consulting firm Page Zero Media, said the volume of the complaints has risen in the past week and that most appear to come from people who use or teach various techniques, such as repeating certain words on their Web pages, to boost rankings.
"It's part of an attempt to weed out sites that rank higher than they should," said Goodman, who added that Google has long played a game of "cops and robbers" with Web site operators and consultants looking for an unfair edge on competitors.
Rosing said that while it is impossible to design a perfect fix that works in all cases, Google has found a balance with its latest effort.
"I wouldn't say they botched it, I'd say it's affected a lot of good businesses as well as a lot of less reputable ones," Goodman said.

...Rosing said there is no truth in such charges.
Google's advertising and search businesses "are completely separated, there is no linkage between the two," Rosing said.

So it looks like everything's OK? :-(

vbjaeger

7:45 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



One more question - I'm not sure if you can find this out but were the sites that were penalised reciprocal linking or were these links one way?

Most of the filtered sites do have reciprocal links with the directories. None, however, link to each other. The sites that made it through the filter have FAR fewer reciprocal links.

squared

7:47 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



There wasn't a "significant quality improvement" in my main keyword searches. I have pages that haven't been changed since 1997 ranking above my relevant, current site for its main keyword. There are also noticeably more links above me pointing to pages that say "this page has moved."

-squared

vbjaeger

7:47 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



"This particular change affected more people, but our testing shows there was a significant quality improvement for our users," said Rosing.

Employees and lab monkeys don't count!

guynouk

7:48 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



vbjaeger - thanks again
So from what you're saying, sites that have been keyword1-keyword2 disadvantaged post-Florida had a disproportionately high number of keyword1-keyword2 anchor text links, whereas the surviving sites, although having some keyword1-keyword2 inbound anchor text links, also had enough non-keyword links to avoid being disadvantaged. Also, the disadvantaged sites with inbound keyword1-keyword2 anchor links had reciprocated the link, and could therefore be portrayed as having contrived the link, as opposed to its having been placed by a competitor.

If this is the case, then I suppose that to reverse the disadvantage, more links with varied/no anchor text would be necessary, unless the dates of the anchor links are taken into consideration, which would be a bit unfair as you'd need to scrap the site.

[edited by: guynouk at 8:08 pm (utc) on Dec. 3, 2003]

DRGather

7:51 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



Mental Note-

"when confronted with massive business fubar's immediately preceding an IPO, deny all points and report only the things that make you sound good, counting failures as successes"

~tucks pen back into pocket protector and quickly stashes little spiral notebook labeled *TOP SECRET KILLER BUSINESS NOTES*~

sorry, couldn't resist. Mods, go ahead and delete. LOL

jim_w

7:56 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Rosing said there is no truth in such charges<<

So did Enron’s CEO

anime_otaku

8:10 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



*sniff*
Smells like an anti-Google-bomb algo in the works. Anchor text on inbound/external links may be to blame for sites dropping out, no matter the KW density of the page in question.

Of course, this is only applied to 'AdWords' phrases, no less. Nothing but money-leeching at the most inopportune time, just before Yahoo kicks in with Inktomi.

[edited by: anime_otaku at 8:12 pm (utc) on Dec. 3, 2003]

davidpbrown

8:10 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



I've only read a fraction of the posts on Florida, so apologies if this is wide of the mark, but I'm surprised there's so little mention of stemming. I thought it was only introduced recently.

If you have a load of sites optimising for specific keywords because your engine is good enough to give exact results, and then you implement an amount of stemming, wouldn't it be mostly the artificially SEO'd sites that would 'suffer'/become more appropriately placed?

?

Jimbo

8:12 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



Although Google is said to deny it, it seems obvious to me that this whole situation is a simple attempt on their part to force commercial web site owners to buy into Adwords.

Imagine you were an Adwords advertiser and your ad box is stuck over on the right, not all that noticeable compared to your competitors' optimised sites in the main list. You'd complain to Google saying "How come I'm paying for an advert on your site when you are listing sites for free, in a better page location?". You might tell them to sort it out otherwise you'll take your ad budget to Overture.
Google might respond, "OK, we'll get rid of your competitors. We know which words are the most popular to zap because advertisers are buying them on the Adwords program, so those are the ones we'll dump." They then replace the top rankings with 'authority' sites or shopping directories. They keep directories in the results because Google's advertisers may also be paying to appear on them, and they would not want to offend their advertisers.

Thus, Google pleases its advertisers, it encourages people to buy into Adwords, and it can tell the world that if you want to find sites selling widgets and there aren't any in the Adwords section at the moment, there will almost certainly be some there very soon. They can say "why should we give people free advertising when people are willing to pay for it?".

So folks, this is the situation that we are facing. It is the end of free rankings on Google (ones that you can make money from, anyway). Unless you have factors on or about your site that allow you through the net and into the high ranks, you had better start thinking about what you're going to do.

Google might consider itself immune to adverse consequences of Florida, e.g. ending up the way that Altavista went a few years back. The only risk they seem to be running, in my eyes, is the possibility that thousands of aggrieved site owners start clicking away on their competitors' Adwords to waste their money, thus rendering Adwords unsustainable. They will have protection in place to try to stop this happening, and they will want to cut down on the risk of it ever starting, hence their denials that their real motive is to force people to run Adwords. They are currently hiding behind so-called honorable motives of improving results when IMHO it's a cynical tactic of making money at the expense of genuine businesses and the jobs of their employees. They don't even have the guts to admit why they have done it.

As a contributor said above, it's time to use a different search engine. IMO Google's latest strategy demonstrates that it has the morals of a cheap chiseler. I for one do not intend to pay a penny into Adwords, given the devastating results that Florida has had for me and my clients.

Is anyone willing to reward that sort of behavior? I hope not.
Jim

More Traffic Please

8:15 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



One simple thought I've had is that Google has either implemented or increased the value of a "site" PR as compared to just a page PR. IOW, a PR2 page that has one occurrence of a KW phrase on it within the site of cnn.com (a PR9 site) will carry much more weight in the SERPs than an optimized PR5 index page on a different site. The idea of a separate SiteRank variable in addition to PageRank would go a long way toward explaining the tons of directory and news sites showing up in searches. I'm sure there are other changes in the algo, but I'm looking at this idea real hard.
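The SiteRank idea can be expressed as a simple weighted blend. To be clear, both the blend and the 0.6 weight below are invented here purely for illustration; Google has published nothing of the sort:

```python
# Hypothetical blend of a page's own PageRank with a whole-site rank.
# `site_weight` controls how much the site-wide score dominates.
def blended_score(page_pr, site_pr, site_weight=0.6):
    return (1 - site_weight) * page_pr + site_weight * site_pr

# The scenario above: a PR2 page on a PR9 site (cnn.com-style) versus an
# optimized PR5 index page on a PR4 site.
big_site_page = blended_score(page_pr=2, site_pr=9)    # about 6.2
small_site_page = blended_score(page_pr=5, site_pr=4)  # about 4.4
print(big_site_page > small_site_page)  # True: the big-site page wins
```

With any site weight above roughly 0.43 in this toy model, the weak page on the strong site outranks the strong page on the weak site, which would match the observed flood of directory and news pages.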

MetropolisRobot

8:16 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



Love to believe it, but Adwords spending has no impact on placement of several sites banished by Florida. Unless there's some wait until the next indexing, but to me, there's definitely a filter in place.

keyword keyword, i'm nowhere
keyword keyword -foobar, i'm right back where I was

superscript

8:19 pm on Dec 3, 2003 (gmt 0)



davidpbrown

Perhaps stemming deserves more consideration. There is a simple logic to it, I guess. If there are 100,000 sites optimised for 'widget', and suddenly the KW 'widget' now includes all those sites optimised for 'widgeting', and there are 100,000 of those, now you have twice as many competitors. But the number of sites in the first hundred is fixed (obviously!). Ergo, sites face more competition and drop.

The question is though, how many variations on the hypothetical stem 'widget' are now in place? - are there sufficient versions to explain the massive drop in top ranking sites we've observed?
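That dilution argument can be put in rough numbers. The even-interleaving assumption below is mine, purely for illustration:

```python
# If stemming merges the pool for 'widget' (own_pool sites) with the pool
# for 'widgeting' (merged_in sites), and the two pools interleave evenly,
# a site at rank r in the old pool lands near r * (own_pool + merged_in) / own_pool.
def diluted_rank(old_rank, own_pool, merged_in):
    return round(old_rank * (own_pool + merged_in) / own_pool)

# A site ranked #50 for 'widget' among 100,000, after 100,000 'widgeting'
# sites are folded in:
print(diluted_rank(50, 100000, 100000))  # 100: pushed to the edge of the top 100
```

Under this model, one merged stem of comparable size halves everyone's effective position, so even a handful of stems could empty the old top ten without any explicit penalty.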

davidpbrown

8:23 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



plasma suggests in another thread [webmasterworld.com...] that

It's relatively new.
Since a short time (< 1 month?) Google now uses stemming to return better results.

DRGather

8:25 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



Take this thinking a step further (as I stated elsewhere)...

If they're using stemming and there is a "mystery spam threshold" for keyword density within pages OR within specific page elements.... does the new "mega-KW" (meaning all variations of the stemmed root) count CUMULATIVELY towards this limit?

Example:

I load my page with:

widgets, widgeting and widgeter

... and life is merry, I have the perfect weight/density for all my KW's and I'm at the top of the SERPS.

... fast forward ...

Now, because of stemming, all of these are seen as the same KW. So what Google sees is:

widget, widget, widget

... and bingo, you've "stuffed" your pages with VARIATIONS of the keyword in question, so you're plummeting down the SERPs like Icarus.

That was my original thought anyway.
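This cumulative-counting theory is easy to sketch. The suffix stripper below is a toy stand-in; whatever stemmer Google might actually use (if any) is unknown:

```python
import re

def crude_stem(word):
    """Toy suffix stripper standing in for a real stemmer."""
    for suffix in ("ers", "er", "ing", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 4:
            return word[:-len(suffix)]
    return word

def stemmed_density(text, root):
    """Count words that stem to `root`, treating all variants as one KW."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(1 for w in words if crude_stem(w) == root)

# The example above: three different-looking variants...
page = "widgets widgeting and widgeter"
print(stemmed_density(page, "widget"))  # 3: all variants count as 'widget'
```

A page carefully balanced across variants would thus show triple the apparent density once the variants collapse to one stem, which is exactly the over-the-threshold effect being hypothesized.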

superscript

8:38 pm on Dec 3, 2003 (gmt 0)



DRGather

I'm just not convinced by this 'penalty' or 'semi-penalty' concept. What appears to be a penalty is probably, as it has always been in the past, the result of a change of emphasis. It's occurred to me that you don't actually need that many new stems for large numbers of sites to lose a great deal of position. I'll give it more thought and get back.

jim_w

8:41 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>lose a great deal of position<<

Well, I lost at least 989 positions for 2 KWs. I stopped looking after the first 1000. That _is_ a great deal.

superscript

8:47 pm on Dec 3, 2003 (gmt 0)



jim_w

But it may take only a few new stems to cause such a big drop. It's complex, though, as it depends on the relative fraction of each stem in use on the Internet. E.g. references to Goose and Geese are likely to be roughly equal in number, but the inclusion of the possible but ungrammatical stems Gooses and Geeses is unlikely to shake the SERPs at all. Some words have numerous stems: GG himself referred to Australia, e.g. Australia, Australia's, Australias (as in Cup), Australian. Throw all these sites together into the SERPs, and a large proportion are going to plummet.

But...if you've used all these stems in your text, you might get a similar volume of visitors as before, perhaps on more precise searches, even though you don't appear to be ranking highly on a simple search any more. Is this what GG was getting at?...

[edited by: superscript at 8:51 pm (utc) on Dec. 3, 2003]

jim_w

8:50 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There are 404 URLs and splash pages in front of mine.

superscript

8:53 pm on Dec 3, 2003 (gmt 0)



Might be worth checking if there is any text in a typical 404 error message that could be mistaken as a stem for one of your KWs ;)

jim_w

8:57 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There is not. It is the typical UNIX Apache 404 error page.

MetropolisRobot

8:58 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



My experience is close to jim_w's. Definitely looks like either (a) an error or (b) a penalty.

BallochBD

9:00 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



regarding: davidpbrown - Perhaps stemming deserves more consideration.

I dropped from my valued number one position for my keyword, which is actually a four-letter acronym. I am now nowhere to be seen for it, and my traffic has been reduced by 80%. Surely this would eliminate stemming as a possible cause? How do you stem an acronym?

Additionally, while my site is commercial, it was founded on the provision of valuable, free information, and I have received many plaudits for this. I offer free white papers on my subject as a consultant, and I believe that my site provides a great free service (or perhaps that should be "provided", because this has just about put me out of business).

Regarding Adwords, I carried out a straw poll of several Google users, and all agreed that their eye goes immediately to the results at the top of the white space. The Adwords at the right and at the top are subconsciously ignored. This is why I have never considered using them.

[edited by: BallochBD at 9:11 pm (utc) on Dec. 3, 2003]

[edited by: ciml at 5:30 pm (utc) on Dec. 5, 2003]
[edit reason] fixing email notification [/edit]

crankin

9:02 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



>If they're using stemming and there is a "mystery spam threshold" for keyword density within pages OR within specific page elements.... does the new "mega-KW" (meaning all variations of the stemmed root) count CUMULATIVELY towards this limit?

DRGather, I've been thinking this may be part of what torpedoed my site. I had optimized as organically and low-key as possible, with a nice balance of 'widget, widgets, widgeting, widgeteers', to show up decently across a group of keywords and phrases. But if your theory is right, then Google is seeing all the variations as one KW, making it KW megaspamming coming from me, and rightfully zapping me to Siberia.

That sux.

MetropolisRobot

9:10 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



I just don't see the KW density issue as being solely the problem. Here I am looking at a competitor and they have a keyword density for the 2 word phrase of 38% on the home page and roughly 12 to 14% on all 20000+ sub pages....

jim_w

9:14 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>KW density issue <<

I agree. As I said somewhere, the first 2 positions have more KWs than me (I checked with software, of course), and at about the 50th down they have fewer than me. I'm gone, but they are not, and for the life of me I cannot figure out why, other than that I was making too much on AdSense; the top pages do not run AdSense at all.

This 526-message thread spans 18 pages.