Forum Moderators: open

Message Too Old, No Replies

What The Early Research is Showing – Florida Update 2003

an analysis and aggregate of the current post-Florida update best practices

         

ryanallis1

9:14 am on Dec 3, 2003 (gmt 0)



I would welcome any comments and discussion on the following article (all URLs and specific keywords have been removed) that analyzes the current state of the Google update and suggests certain steps to take for both webmasters and Google...

Thank you,
Ryan Allis

On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems more like a drunken Mexican salsa than its usual conservative fox-trot.

Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While one could understand dropping down a few positions, since November 15 the sites that previously held these rankings have been nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding themselves forced to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.

What the Early Research is Showing

From what early research shows, it seems that Google has put into place what has been quickly termed in the industry as an 'Over Optimization Penalty' (OOP) that takes into account the incoming link text and the on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense Google is penalizing sites for being optimized for the search engines--without any forewarning of a change in policy.
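Taken literally, the mechanism described above is a conjunction of two counts: the share of incoming links whose anchor text carries the keyword, and the keyword's on-page frequency. A sketch of that reading in Python; the function, both thresholds, and the example data are my own guesses, not anything Google has published:

```python
def oop_triggered(anchor_texts, page_words, keyword,
                  link_share=0.5, max_onpage=20):
    """Hypothetical OOP condition: the penalty fires when too many incoming
    links carry the keyword in their anchor text AND the same keyword
    repeats too often on the page itself. Thresholds are guesses."""
    share = sum(keyword in text.lower() for text in anchor_texts) / max(len(anchor_texts), 1)
    onpage = sum(word == keyword for word in page_words)
    return share > link_share and onpage > max_onpage

# 8 of 10 inbound links use the keyword; it appears 25 times on-page
links = ["cheap widgets here"] * 8 + ["click here"] * 2
page = ["widget"] * 25 + ["filler"] * 75
trips = oop_triggered(links, page, "widget")
```

Under these made-up thresholds the example page trips the filter; a page with the same content but neutral anchor text would not.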

Here is what else we know:

- The OOP is keyword specific, not site specific. Google has selected only certain keywords to which the OOP applies.

- Certain highly competitive keywords have lost many of their previous listings.

How to Know if Your Site Has Been Penalized

There are a few ways to know if your site has been penalized. The first, mentioned earlier, is a significant drop in traffic around the 15th of November: if you noticed one, you've likely been hit. Here are ways to be sure:

1. Go to google.com. Type in any search term you recall being well ranked for (check your site logs to see which terms sent you search engine traffic). If your site is nowhere to be found, it has likely been penalized.

2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This removes the OOP, and you should see what your results would otherwise be.

3. Or, simply go to www.**** to have this automated for you. Just type in the search term and quickly see what the search engine results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5,000 most visited web sites on the Internet in a matter of days.
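Once you have the two result lists side by side (the normal query and the '-gibberish' one), comparing them is mechanical. A minimal sketch; the domain names and function name are hypothetical, and the lists would be copied from the two SERPs by hand:

```python
def compare_serps(filtered, unfiltered, my_site):
    """Compare the normal (filtered) SERP against the '-gibberish'
    (unfiltered) SERP and report symptoms of the OOP."""
    filtered_set, unfiltered_set = set(filtered), set(unfiltered)
    return {
        # sites that rank without the filter but vanish with it
        "dropped": sorted(unfiltered_set - filtered_set),
        # how many results the two lists share
        "overlap": len(filtered_set & unfiltered_set),
        # present only in the unfiltered list => likely penalized
        "looks_penalized": my_site in unfiltered_set and my_site not in filtered_set,
    }

report = compare_serps(
    filtered=["a-directory.example", "b-portal.example"],
    unfiltered=["my-shop.example", "a-directory.example", "b-portal.example"],
    my_site="my-shop.example",
)
```

A large "dropped" list with a small overlap would be consistent with the filter being applied to that term.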

The Basics of SEO Redefined. Should One De-Optimize?

Search engine optimization consultants such as myself have known for years that the basics of SEO are:

- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links

Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to properly build links.

So if you have been affected, what can you do? Should one de-optimize one's site, or wait it out? Should one create one site for Google and one for the 'normal engines?' Is this a case of a filter being turned on too tight, which Google will fix in a matter of days, or something much more?

These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:

1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text or the keyphrase is in a different order than the order you are penalized for.

2. Open up the page that has been penalized (usually your home page) and reduce the number of times that you have the keyword on your site. Keep the number under 5 times for every 100 words you have on your page.

3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."

It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
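The 'under 5 per 100 words' rule in step 2 is easy to check mechanically. A rough sketch, assuming the plain page text has already been extracted from the HTML; the function name and the default threshold are mine, echoing the figure above:

```python
import re

def keyword_density(text, keyword, limit_per_100=5):
    """Count how often `keyword` appears per 100 words of page text
    and flag whether it stays under the suggested limit."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = sum(1 for word in words if word == keyword.lower())
    per_100 = 100.0 * hits / max(len(words), 1)
    return per_100, per_100 <= limit_per_100

# 100 words of page copy, 6 of them the target keyword
page = ("widget " * 6) + ("filler " * 94)
density, ok = keyword_density(page, "widget")
```

Here the density comes out at 6 per 100 words, over the suggested limit, so this page would be a candidate for trimming under step 2.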

Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.

A second theory, which has gained credence in the past days within the industry, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.

Perhaps both of these reasons came into play. Perhaps Google execs thought they could

1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues, and
4) because of better results and more revenue, have a better chance at a successful IPO.

Sadly for Google, this plan had a fatal flaw.

What Google Should Do

While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:

1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;

2. Reduce the weight of OOP;

3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and

4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.

When this recent update broke on November 15, webmasters flocked by the thousands to industry forums such as webmasterworld.com. The botched update was quickly titled "Florida Update 2003" and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, and that everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.

If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, Alltheweb/FAST, and Altavista, it will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.

Kirby

3:18 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>rfgdxm1, Let 'em complain for all they want to. But, they gotta understand that it's time to deal with it rather than venting empty threats/accusations and conspiracies at G.

Not complaining, just trying to find out what is happening. The challenge is that there seem to be so many discrepancies and contradictions.

For example, the search for 'big city real estate' returns only one of the pre-florida results still in top 10, with the rest primarily directory/authority sites. The next highest pre-florida result is #69, and a total of 88 of the pre-florida top 100 are gone.

Do a search for 'small city real estate' and results are back to pre-florida, with none of the directory/authority sites.

At first I attributed the number of directory and/or authority sites now showing for 'big city real estate' to the fact that as a big city it would have far more applicable directory/authority sites.

This theory gets blown out of the water when you do a search for 'big city2 real estate'. The results are pretty much the same as pre-florida, sans the directories.

After examining numerous city results, I'm looking for reasons that explain why this happens. Many of the sites that disappeared are real estate template sites, yet these same company template sites are untouched for other 'city real estate' searches.

I understand the noise level is always higher after updates, but if we could stick to analysis and problem solving instead of the 'serps are great, stop complaining' mantra, everyone would benefit.

MetropolisRobot

3:19 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Yeah, I'm slightly confused too. To me:

foo bar
and
foo bar -dwjkcbhhcf

should come up with the same results if some arcane operation / filter is not at work. I have checked the top 20 sites in the category I target and none of them have dwjkcbhhcf in their content, but they still show up under case II. The difference is that I don't show up under case I.

John_Creed

3:21 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Personally I like the new Google. I have the same legitimate complaints about Google that I had before this current update, but no new complaints.

None of my sites were adversely affected by the Florida update. I've even improved on a few terms.

For years/months/whatever, many of the webmasters here would complain and whine to Google about all the spam sites that were beating them in the serps, and in most of those circumstances they were claiming spam just because a site wasn't totally to their liking and was ahead of them.

When Google started that asinine practice of auto-banning sites just for the color of the text being the same as the background (instead of ignoring the text so as not to risk banning good sites), these same people cheered loudly and were happy they now had a little less competition, and couldn't care less about the quality of the serps.

Well, now the shoe is on the other foot. (I'm _not_ referring to all of you and it sucks that your sites are gone, but the whiners in question know who they are and are getting what they deserve.)

Don't you wish you had complained less about spam and instead focused on working on your own content? Don't you wish that you had advocated for a more tolerant Google instead of a strict Google that tosses sites at a whim?

That "relevant" site you own is probably considered spam by someone else. Maybe the quality of the serps is better without your sites?

Have a happy Christmas.

jchance

3:24 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



MetropolisRobot,

I think the consensus is that doing a search like
foo bar -dwjkcbhhcf
is using the Pre-Florida algorithm. That's why you show up for that search but not the other.

merlin30

3:28 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Superscript,

I think that claus asserts that if you use - operator to qualify your search you are also implying an exact match on your first term so the results are more like an exact match of the search rather than a broad match.

To test this I have done many searches using the exclusion operator and then a search using quotes (without the exclusion operator). Although in most cases the two results aren't exactly alike, they are pretty close.

I'm sure claus will correct me if I've put words into his mouth!

[edited by: merlin30 at 3:32 pm (utc) on Dec. 4, 2003]

MetropolisRobot

3:28 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



ah jchance, a good point. I'll get my first cup o joe.

superscript

3:38 pm on Dec 4, 2003 (gmt 0)



Merlin30

I think that claus asserts that if you use - operator to qualify your search you are also implying an exact match on your first term so the results are more like an exact match of the search rather than a broad match.

A very reasonable explanation, but the question remains: why are some terms broad matched and others not.

Please Be Gentle

3:56 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Claus: "to get more sensible results, the user should enter "shelving -waffle". One more jumped on the bandwagon. Shelving is just a word like any other word - one that can be used in a lot of combinations with a lot of intentions - if you want an exact search, then do an exact search. Please let's not go there again - it's dead, gone, buried."

Chndru: "It's time to deal with it rather than venting empty threats/accusations and conspiracies at G."

Just to clarify - I wasn't venting conspiracies at G. I did point out that, as I haven't done many searches recently, I couldn't say whether Google was working properly or not. I merely highlighted the programme, and tried to repeat as much of it as I could remember (I didn't realize it would be available on video), to make another point. I don't necessarily agree with what they say, but it is only the second time I have ever seen Google mentioned on TV in Europe. (The first time was in September: Canal Sur News in the south of Spain mentioned Google, as the local government had appeared in the top 10 queries in August - not a particularly interesting item if you ask me!)

Anyway, it is the perception of Google, rather than how it is actually performing, that was the point - sometimes "to be is to be perceived". The programme did mention the Advanced Search page and using search commands to refine searches.

(Personally, I don't find the Advanced Search page particularly easy to use or intuitive, and sometimes I sense a bit of ambivalence from Google with regard to refining queries. I know that they give tips, but in [google.com...] they say "DON'T bother with advanced search techniques, such as +,-,quotes, etc. unless the most obvious keywords don't work", which to me suggests that they want people to try unrefined queries first and then, if those fail, use operators, rather than using operators from the start.)

I just thought it was significant that a mainstream (not specifically technical) programme should devote so much time to a Google update - even if there isn't a problem, they made it sound as if one existed.

I hope that clarifies things.
With Kind Regards
PBG

MetropolisRobot

4:00 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



This kind of misses the point slightly. The vast majority of Google users, and hence search "writers", are not going to use the modifiers. They use a combination of words and move on from there. They are no more likely to search for

foo bar -****xxx

than the next person. That facility is provided for those of us who really want to do some serious googling.

The vast majority of searches are of the form

foo bar

and then people cycle through the first few pages looking for what they want. Therefore, if your results appear in response to a simple search you are made; otherwise, for an ecommerce site that relies on casual searchers, you are sunk.

Lovejoy

4:04 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



From my own experience, it appears that Google has gone for more specific search terms, three words plus. I have two sites that are very similar in structure: lots of content, keyword in URL and title, and links only to related sites specific to the keywords.

One site is number one for its two-word search terms "blue widgets" and "blue widget"; the other site is nowhere to be found using the same keywords, but is all over the first two pages for three-word-plus searches for "blue widgets greendots" or "new blue widgets".

The only problem, as we all know, is that most of our traffic comes from two-word search terms. Surfers are going to have to learn to use much more specific terms if they want good results.

Hissingsid

4:06 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Kirby recently said,

At first I attributed the number of directory and/or authority sites now showing for 'big city real estate' to the fact that as a big city it would have far more applicable directory/authority sites.

This theory gets blown out of the water when you do a search for 'big city2 real estate'. The results are pretty much the same as pre-florida, sans the directories.

Hi Kirby,

It is examples of conflicting results like this, together with the vast range of on/off-page criteria found by webmasters here, that convince me that a spam filter is being applied to certain keywords and not to others.

I'm proposing the theory that this filter works on the result set for a given search. Any sites that are deemed to be OK pass through the filter and are weighted by the standard Google algo, any that are not OK are stopped by the filter and no algo is applied to them.

When you think about it, this is an efficient way of penalising sites. By not giving them any weight for any of the normal SEO techniques, they drop out of the results, and you save on processing power because they can simply be disregarded.

If you do the search using the term that results in your site being dropped out of SERPs using one of these filters does it come back in?

intitle:
allintitle:
allintext:
allinanchor:
search term -site:www.google.com

Sid
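For anyone running Sid's checks across many terms, the operator variants can be generated in one go and pasted into the search box one at a time. A throwaway helper (the function name is mine; the operators are the ones listed above):

```python
def diagnostic_queries(term):
    """Build the operator variants suggested above for probing
    whether a filtered term comes back under a different query form."""
    return [
        f"intitle:{term}",
        f"allintitle:{term}",
        f"allintext:{term}",
        f"allinanchor:{term}",
        f"{term} -site:www.google.com",
    ]

queries = diagnostic_queries("big city real estate")
```

If the site reappears for some variants but not others, that narrows down which signals the filter is discounting.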

superscript

4:12 pm on Dec 4, 2003 (gmt 0)



The BBC report is now online, with the video of the programme. Go to the main BBC site, then the Technology section. There are a couple of errors regarding Google Dances still being in effect, but it's interesting to see the journalists' take on it.

BallochBD

4:23 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



I joined the forum yesterday and set a flag on this thread. I now want to stop being notified by email, and I have reset my flags, but I am still receiving notifications. Can anyone tell me how to turn this off? Please.

c1bernaught

4:50 pm on Dec 4, 2003 (gmt 0)

10+ Year Member




I have been busy conducting searches on just the top 50 American cities. It is curious to see that some have serps I have grown to expect, and others seem to have serps that are completely outside my expectations.

This leaves me wondering why "Chicago" would be treated differently than "Detroit" or "Houston"?

Also, why do "city_name term" and "term in city_name" return results that are so different?

vbjaeger

4:52 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Chicago could be referring to the movie/musical, or anything else that uses Chicago.

satanclaus

5:04 pm on Dec 4, 2003 (gmt 0)



I didn't read the entire thread - the initial posts, a few pages past that, and the last. Too much to sort out on the board nowadays.

Seems to me that over-optimization is exactly what sank people. Being over-analytical makes you jaded towards what constitutes a good site.
We definitely analyze the hell out of things nowadays at WebmasterWorld (as evidenced by this thread).

ronhollin

5:26 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



How in the heck does one unsubscribe to a thread?

claus

5:30 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> explain the strange effects of using a meaningless subtractive term on the SERPs?

yeah... I really thought I did answer that ;) The key is in the literal interpretation of the first sentence of the quote:

You can increase the accuracy of your searches by adding operators that fine-tune your keywords

...adding a space and a minus sign is "adding operators", no matter what you write after that minus sign. As a result of this, the search box reverts from the "broad match query" (post-Florida default) to the "specific match query" (pre-Florida default).

Pre-Florida Google did what any other SE does - tried to find an exact match. Post-Florida, Google does something that other SE's just can't do; that's probably why it's so hard to understand.

>> Why Chicago and not Houston... / why some, not others...

The easy answer would be that Chicago starts with C and Houston starts with H, and the "broad match feature" hasn't reached H yet.

I'm not sure it's the right answer, but I'm sure that the reason some search terms are affected more than others is the straightforward one: the "broad match database" (for lack of a better term) hasn't been expanded to all words yet.

>> google has gone for more specific search terms, three words plus

I've seen a move away from the broad match, toward the exact match (pre-Florida style), over the last couple of days as well.

I'm sure post-Florida will return full scale, as that is something other SE's just can't do. Remember the rumors of MS going into search? This is, if nothing else, a demonstration of superior technology.

/claus

superscript

5:37 pm on Dec 4, 2003 (gmt 0)



The easy answer would be that Chicago starts with C and Houston starts with H, and the "broad match feature" hasn't reached H yet.

Now if that isn't pure speculation - I'd like to know what is ;)

How are Alaskan and Zairean real estate terms faring? :)

Trawler

6:14 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Well here goes. Along these same lines.

I was hoping to provide more information along with this post but I cannot at this time. I am providing this in the hope the more experienced SEO's can run with it.

These are the example keywords. I have not tested these, but if you replace the example with your keywords it may surprise you.

Keyword 1: reno
Keyword 2: hotels

Search reno hotels - below the basement level.

Search re-no hotels - bingo, back to expected position.

Search reno hotel-s - bingo, back to expected position.

In reality, by actually making these changes to the pages in an organized fashion, I would think the filter can be tricked.

The smart way to go about this would be to totally de-optimise a page and see if it will rank using the hyphen solely in the title and one occurrence on the page.

If so, then continue with the SEO to determine exactly where the filter trips.

I am doing this now, but the time factor is two to three days per change to see the results.

Kirby

6:15 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



claus, thanks for the ongoing analysis. Your approach to this is always appreciated.

I looked at several cities:
Denver - more directory/authority sites
San Diego - more directory/authority sites
Las Vegas - 5 of 1st 6 are pre-florida, non-directory sites
Boston - more directory/authority sites
Orange County - mix of pre-florida and directory/authority sites
Naples, Florida - mostly directory sites w/ listings of pre-florida sites

Before the critiques of the serps begin: I'm not expressing an opinion on the quality, just trying to understand the results, since a weather site with a real estate link comes up a lot. I'm also seeing other pages come up simply because of a link to a page one might expect to get as a relevant result. I'm looking for a way to reconcile theming, broad match and these oddball pages.

Any thoughts?

caveman

6:17 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



...superior technology...not yet ready for prime time.

I don't care much for technology except technology that improves my life. Don't see *that* anywhere around here. Not yet anyway.

Sure do love my Blackberry though. Now that's cool technology. OK, so I'm simple minded. ;-)

< 'blackberry' ... now that's an interesting search term. (Mods, if I wasn't supposed to say that word, please delete.)>

KayENT

6:41 pm on Dec 4, 2003 (gmt 0)



Did anybody read the article in the Atlanta Journal-Constitution on Google? It talks about how Google really hurt small online businesses and brought them to their knees at a time when they really rely on internet traffic.

They also talk about the possibility of Google trying to get ad revenue as the cause of this algo change.

Nothing that hasn't been discussed here on the forums but it appears that the press is writing about it. I wouldn't want this kind of press if I were Google.

The title of the Article is:

Google Changes Rankle Merchants -
Thursday, December 4, Atlanta Journal-Constitution

I have no opinion either way but wanted to post this for everybody to see.

c1bernaught

6:43 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Kirby:

I guess that's why I'm confused about Siterank. It seems that a weather site should not be an authoritative site about Chicago. Perhaps it's an authoritative site about weather in Chicago, in which case I would expect it to be on the first page for a search on Chicago weather or weather in Chicago.

It seems if I were to search for "city_name mortgage" I would see banks and other large lenders as authoritative, but that's not the case.

The alphabet theory doesn't really work either, as searches for "Boston term" and "term in Boston" still return what I would expect to see.

Also there seems to be a city size issue. The largest cities seem to be affected while the smaller cities seem to return what I would expect to see.

Also, I definitely see a difference in KW usage in the new terms showing up in the big cities. KW density is far lower - so low as to be almost a non-issue. Whatever is floating these guys to the top in big cities has little to do with KW relevance.

I'm also seeing that many of these sites have incoming links from just as many non-industry or non-topic related sites as they do same industry or topic.

Powdork

6:54 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here's another article from a non-mainstream source. It needs to be in the WSJ.
[isedb.com...]

c1bernaught

7:10 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Powdork:

That really was a good and well-balanced story. It makes sense. I hope the writer is correct about Google.

Kirby

7:10 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sid wrote >If you do the search using the term that results in your site being dropped out of SERPs using one of these filters does it come back in?

intitle:
allintitle:
allintext:
allinanchor:
search term -site:www.google.com

The site comes back in, but few of the other pre-florida pages that are gone come back, with the exception of 'search term -site:www.google.com'. This returns the same results as 'search term -asdf-fdsa'.

The allintitle, allintext and allinanchor results are drastically different than pre-florida. Why would that be?

Powdork

7:19 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Okay, here are two searches that will give some information on several ideas running about.
tahoe wedding packages
lake tahoe wedding packages

1. The fact that 'tahoe wedding' trips the filter and 'tahoe wedding packages' does not, yet 'lake tahoe wedding packages' does would lead one to believe that having two of three poison words in a search will not trip the filter, but three out of four will.
2. There is no Google directory listed at the top of the 'lake tahoe wedding packages' search, but it is tripping the filter.
3. There is only one adword for the search that doesn't trip the filter, there are many for the rest.
4. Heavy cross linking is the way to go.
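If the poison-word reading above holds, the trigger would be the fraction of query words on the list rather than their raw count. A sketch of that hypothesis only - the word list and the 0.7 cutoff (anything between 2/3 and 3/4 fits these three examples) are pure guesswork:

```python
def poison_fraction(query, poison_words):
    """Fraction of the query's words that are on the hypothesised poison list."""
    words = query.lower().split()
    return sum(word in poison_words for word in words) / len(words)

def trips_filter(query, poison_words, threshold=0.7):
    """Guessed trigger: filter fires when the poison fraction exceeds the threshold."""
    return poison_fraction(query, poison_words) > threshold

POISON = {"lake", "tahoe", "wedding"}
# matches the observations: 2/2 and 3/4 trip the filter, 2/3 does not
results = [trips_filter(q, POISON) for q in (
    "tahoe wedding",                 # 2/2 = 1.00
    "tahoe wedding packages",        # 2/3 = 0.67
    "lake tahoe wedding packages",   # 3/4 = 0.75
)]
```

More term pairs would be needed to pin the threshold down, or to rule the ratio idea out entirely.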

joebra

7:21 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Would it be possible for some of the great minds here to view my two real estate sites to see where I went wrong in this recent drop from Google? I am not sure we can post sites here, but let me know. Thanks!

Trawler

7:26 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Powdork>

----

Try a search for tahoe weddin-g or ta-hoe wedding

If some of you will look at the effects of the hyphen, I am sure one of us will stumble upon something.

This 526-message thread spans 18 pages.