
What The Early Research is Showing – Florida Update 2003

an analysis and aggregation of current post-Florida-update best practices


ryanallis1

9:14 am on Dec 3, 2003 (gmt 0)



I would welcome any comments and discussion on the following article (all URLs and specific keywords have been removed) that analyzes the current state of the Google update and suggests certain steps to take for both webmasters and Google...

Thank you,
Ryan Allis

On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems to be more like a drunken Mexican salsa than its usual conservative fox-trot.

Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While one could understand dropping a few positions, since November 15 the sites that previously held these rankings have been nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding themselves forced to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.

What the Early Research is Showing

From what early research shows, it seems that Google has put into place what has quickly been termed in the industry an 'Over Optimization Penalty' (OOP), which takes into account both incoming link text and on-site keyword frequency. If too many sites linking to yours use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense, Google is penalizing sites for being optimized for the search engines, without any forewarning of a change in policy.
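If that theory is right, the filter might look something like this rough sketch. Every threshold here is invented purely for illustration; nobody outside Google knows the real values, or even whether the penalty works this way at all:

```python
def triggers_oop(on_page_count, anchor_text_matches, total_inbound_links,
                 repeat_threshold=10, anchor_ratio_threshold=0.8):
    """Hypothetical 'Over Optimization Penalty' check.

    Flags a page when a keyword is repeated heavily on the page AND
    most inbound links use that same keyword as anchor text. Both
    thresholds are pure guesses, used only to illustrate the theory.
    """
    if total_inbound_links == 0:
        return False
    anchor_ratio = anchor_text_matches / total_inbound_links
    return (on_page_count > repeat_threshold
            and anchor_ratio > anchor_ratio_threshold)

# A page repeating the keyword 15 times, with 90 of its 100 inbound
# links using it as anchor text, would be flagged under this sketch;
# a modest 4 repetitions would not.
print(triggers_oop(15, 90, 100))  # True
print(triggers_oop(4, 90, 100))   # False
```

Note that under this model the trigger is the *combination* of on-page repetition and uniform anchor text, which would explain why sites with varied inbound link text seem to escape.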

Here is what else we know:

- The OOP is keyword specific, not site specific. Google has selected only certain keywords to which the OOP applies.

- Certain highly competitive keywords have lost many of their former listings.

How to Know if Your Site Has Been Penalized

There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you've likely been hit. Here are ways to be sure:

1. Go to google.com and type in any search term you recall being well-ranked for. Check your site logs to see which terms brought you search engine traffic. If your site is nowhere to be found, it has likely been penalized.

2. Type in the search term you suspect you have been penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This appears to bypass the OOP, so you can see roughly where your site would rank without it.

3. Or, simply go to www.**** to have this automated for you. Just type in the search term and quickly see what the search engine results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5,000 most visited web sites on the Internet in a matter of days.
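For anyone who would rather script the comparison in step 2 than click around, here is a small sketch that builds the two query URLs. The google.com/search?q= pattern is the standard search URL; the gibberish token itself is arbitrary:

```python
from urllib.parse import urlencode

def florida_check_urls(term, gibberish="dkjsahfdsaf"):
    """Build the two queries used in the filter test above.

    Appending a negated nonsense token ('-gibberish') changes nothing
    semantically, but at the moment it appears to bypass the new filter,
    so comparing the two result sets shows whether a term is affected.
    """
    base = "http://www.google.com/search?"
    normal = base + urlencode({"q": term})
    unfiltered = base + urlencode({"q": f"{term} -{gibberish}"})
    return normal, unfiltered

normal, unfiltered = florida_check_urls("florida web designer")
print(normal)      # the query as users see it
print(unfiltered)  # the same query with the filter-bypassing token
```

If the two result sets differ wildly for a term, that term is presumably one of the "filtered" keywords.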

The Basics of SEO Redefined. Should One De-Optimize?

Search engine optimization consultants such as myself have known for years that the basics of SEO are:

- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links

Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between those sites and quality sites with dozens or hundreds of pages of well-written informational content whose owners have taken the time to build links properly.

So if you have been affected, what can you do? Should one de-optimize one's site, or wait it out? Should one create one site for Google and one for the 'normal' engines? Is this a case of a filter being turned on too tight, which Google will fix in a matter of days, or something much more?

These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:

1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text, or so that the keyphrase appears in a different order than the one you are penalized for.

2. Open up the page that has been penalized (usually your home page) and reduce the number of times that you have the keyword on your site. Keep the number under 5 times for every 100 words you have on your page.

3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."

It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
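The "under 5 times per 100 words" rule of thumb in step 2 is easy to check mechanically. Here is a small sketch of such a density check (my own helper for illustration, not any official tool, and the 5-per-100 ceiling is only the guideline suggested above):

```python
import re

def keyword_density(text, phrase):
    """Occurrences of `phrase` per 100 words of `text` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return hits * 100.0 / len(words)

# A page that repeats a 3-word keyphrase 10 times in 110 total words
# comes out at roughly 9.1 occurrences per 100 words: well over the
# suggested ceiling, so it would be a candidate for de-optimization.
page = "florida web designer " * 10 + "other filler words here " * 20
d = keyword_density(page, "florida web designer")
print(round(d, 1))  # 9.1
if d >= 5:
    print("over the suggested 5-per-100-words ceiling")
```

Run this against the penalized page's visible text for each keyphrase you suspect, and trim repetitions until the number drops below the ceiling.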

Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.

A second theory, which has gained credence in the past few days within the industry, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search (cost-per-click) system. Is this the case? Maybe, maybe not.

Perhaps both of these reasons came into play. Perhaps Google execs thought they could

1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues, and
4) because of better results and more revenue, have a better chance at a successful IPO.

Sadly for Google, this plan had a fatal flaw.

What Google Should Do

While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:

1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;

2. Reduce the weight of OOP;

3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and

4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.

When this update broke on November 15, webmasters flocked by the thousands to industry forums such as webmasterworld.com. The botched update was quickly dubbed the "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, so everyone should stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.

If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, Alltheweb/FAST, and Altavista, it most likely will soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.

jim_w

3:47 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Crisco
>>Simple - because I am expected to enter the bidding war and buy adwords!<<

That’s what I keep coming back to, and if you made too much money with adsense with high ranked KWs, you got had as well.

claus

4:05 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



dazzlindonna, jim_w:
I've only tried to describe what's going on, IMHO, AFAIK, etc. I do think the SERPs are generally better now, but I really don't think everything is fine and well; in some cases it's not even good. Also, I believe that not all "keywords" or "sectors/themes/industries/languages/whatever" have been affected yet. More is to come, as this can't be the finished version (it's a very, very good "public beta," though).

ronhollin, i hope it's not my post you are referring to as being a Google ad? If so, you're wrong.

/claus

[edited by: claus at 4:07 am (utc) on Dec. 4, 2003]

tim_m

4:06 am on Dec 4, 2003 (gmt 0)

10+ Year Member



I noticed a strange thing when comparing the search terms that were affected with the search terms that weren't. Only on the affected search terms do the directory categories show up at the top of the SERPs. And the SERPs for these affected keywords are the same SERPs that I get when I go to the Google Directory and search for that term. It seems to me that for certain search terms Google is trying to narrow the search down by guessing which Directory category is best, and then just using that Google Directory category to gather results. At first I thought that only sites in the dmoz would come up, but it seems that some sites are coming up that aren't in the dmoz. My site is not in the dmoz, so I thought that might explain why it totally disappeared. So it might not be something as simple as using a Directory search, but perhaps a result that relies heavily on one.

I really don't think it has anything to do with incoming links being exactly the same. In fact, my observations indicate that exact links might help. I noticed that the top results for Atlanta Homes are magazine-selling sites selling Atlanta Homes Magazine. I looked at the backlinks and found all of them with link text of exactly "Atlanta Homes," over and over. To me this would indicate that having a lot of links with the precise keyword phrase is not a problem. My strategy is to pursue variations of Atlanta Homes, such as "Atlanta homes for sale," which is still a pretty highly used search phrase but doesn't seem to bring up those directory category phrases at the top of the SERPs.

I'm new at this so I'm interested to see if these observations might help an experienced optimizer to figure out what's going on.

More Traffic Please

4:16 am on Dec 4, 2003 (gmt 0)

10+ Year Member



As for the "site rank" theory - nope, I don't buy it either. I'm seeing too many sites that contain fewer than a dozen pages, with PRs of 3 and a home page of PR 4, ranking in the top 10 for highly competitive terms.

I'm guessing there have been many variables tweaked and possibly filters added in the Florida update. However, the example you gave could be entirely consistent with some sort of SiteRank variable added to the algo. For example, if these low-PR sites with few pages happened to have incoming links from other low-PageRank pages on large authoritative sites (high-SiteRank sites), the SiteRank value would be passed to them and your Google toolbar would never indicate it. In fact, depending on the PageRank of the inbound links, a backward link check may not even list these pages linking to the site in question.

It was this scenario I was alluding to in msg #116.

defanjos

4:32 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One trap we seem to fall into is to look for that one silver bullet that got us when in fact it may have been a load of buckshot.

Sure, there are a lot of things at play here.

These are the main things I have noticed so far that will get you in trouble (applies to "filtered" keywords/keyphrases (KW) only):

- Exact KW in title
- KW in url (files or domain name)
- Exact KW in H tags
- Exact KW in body, especially if it is overly repeated
- Exact KW in anchor text pointing to page
- links to pages that look like siteinquestion.com/KW-something.htm, siteinquestion.com/KW-somethingelse.htm
- No Outbound links to other sites (critical)
(it seems if you link to other related sites, you are somewhat forgiven for over-optimizing)

Just a layman's observation

jim_w

4:38 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



defanjos

From what I have seen, I would do better for my 2 KWs if I just had a blank page with the KWs listed once, put the KWs in the title alongside 2 other unrelated words, changed the name of my company to something unrelated to the space, and changed the domain name to something as far from the space as possible. Bet I would be in the top 1000 then ;-)

But I don't think I will let G dictate to me how to run my business.

rfgdxm1

4:50 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>No Outbound links to other sites (critical)
(it seems if you link to other related sites, you are somewhat forgiven for over-optimizing)

I'm not saying I agree with your analysis. However, if the above is one of the factors, this would tend to explain a lot why non-commercial sites tend to be relatively unaffected. Non-commercial sites tend to want to link to related sites. Commercial sites don't want to link to competitors.

too much information

5:06 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Let me throw a new twist into the thread. (I apologise for posting a link, but I believe it's relevant and much too long to copy and paste.)

[cs.toronto.edu...]

I don't know if anyone has seen this before, but it is a paper written by a member of a Google research team, while in college, about an experiment with a search engine algorithm.

It is very similar to what was described in the post a month or so ago about the recent patent application filed by Google, which leads me to believe it may be an explanation of what the new algorithm is. (Print it fast; it may not stay online long.)

It seems that there are different weights placed on the combinations of keywords in a search with an exact match being weighed the highest.

What is also mentioned is the fact that some sites are determined to be 'Expert' sites, and the qualifications for being an expert site are explained. PR is also a factor in this algo, and it is explained as well.

Now, from what I understand, and if this algo is new, they would first have to determine which sites should be considered 'expert' sites. A directory site, links page, etc. would qualify as an expert site as long as the links on the page were not 'internal'.

Next you would have to calculate density values for sites based on their content, for single keywords through multiple keywords. Given that Google scans over 2 billion pages (I think that is right), this could take a while even for the fastest of processors. Maybe even a matter of weeks?

What I think we are seeing is just the next wave of this algo going through the SERPs at a level that we can actually notice. I think that the next step will move those directory sites (the 'experts') to another level and allow those sites that the experts link to take their positions in the SERPs based on their relationship to the 'experts'.

Right now my theory is, if you have a search that is showing all directory and link sites, get on them now if you are not already listed. This is a golden opportunity to see where good solid links can be found.

If this algo is being used, then it is its own built-in 'spam' filter, because nobody who would qualify as an expert would link to a site that does not fit their content. Therefore there is no penalty, just a new algo that washes you out of the mix if you don't fit in. What you are seeing is the middle of the changeover.

Now I don't have any proof to back this up, but this is my theory from what I have read and what I am seeing. Right now I'm working on inbound and outbound links, and holding tight on my keyword density. I don't think the next wave will be too far off.

bzprod

5:16 am on Dec 4, 2003 (gmt 0)

10+ Year Member



Here is something that I have not noticed until now.

Pre Florida, I was on the first page for my main keyphrase. Two of my competitors were also on the first page. They both had been there (#1, #2) for at least 2 years. They are both from Maine, and appear in the regional Maine category.

When you type in "black widgets", two categories come up at the top of the serps:

-The "black widget" shopping category (this is the one that I am in)
-The regional Maine category

The funny thing is, both my competitors have been dropped hundreds of spots. Why is Google showing the regional category at the top of the SERPs when it has nothing to do with "black widgets"? The only tie this category has to "black widgets" is my two competitors. PERIOD.

Why would they show the regional category? This category has absolutely nothing to do with the keyphrase except the fact that my competitors have "black widget" sites and are in the category.

This is very strange and a bit upsetting. Google is saying to users "Here are the two most relevant categories for your search." Yet they are not showing ANY sites that are in those categories!

When I do a search for "black widgets -qqwqwqwew", My two competitors and myself are #1, #2, #3. Also, most sites that are in the "black widget" shopping category are listed in the top 25.

When I do the same search without the "-qweqwewq" NONE of the sites in this category appear anywhere near the top.

Can someone please provide any detail?

willybfriendly

5:25 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hilltop is the first thing that has made sense in Florida.

Unfortunately, authority sites within the model (as I understand it) may well be linkfarms.

Hilltop helps to explain why the site "Company name has been sold (company and widgets) and no longer exists" can rank as #1. The designer has carefully interlinked every site they ever created, and many (most?) of those sites are in the same or closely related industries.

While Hilltop may not monetize Google directly, it will surely monetize the web. Suddenly a Yahoo Directory listing becomes important again. Business.com has value. Professional groups that charge dues, and in some cases additional fees for web listings, may well see their memberships swell.

How does Amazon get to the top in so many SERPs? "We define an expert page as a page that is about a certain topic and has links to many non-affiliated pages on that topic. Two pages are non-affiliated conceptually if they are authored by authors from non-affiliated organizations."

Picture the small time webmaster who tries to pick up a few bucks selling books off of their site. Under the Hilltop algo, these would be expert pages casting votes for Amazon.

The implications are immense, and deserve some serious consideration...

WBF

markis00

5:30 am on Dec 4, 2003 (gmt 0)

10+ Year Member



What I don't get is: if the OOP does exist, why have some sites which optimized normally been affected while others haven't? I have been monitoring a few sites which have stayed in the top 10 throughout this entire mess, and they haven't changed a thing. Why are some keywords penalized, while others are not?

Furthermore, if I decide to go ahead and "de-optimize": someone said in another post that it's basically comparing VHS to Beta. I de-optimize for Google, and every other engine looks upon my site with disgust. What can I do...

Jazzy

7:03 am on Dec 4, 2003 (gmt 0)

10+ Year Member



Here's a perfect example of exactly what Google's greed has done to millions of family businesses, entrepreneurs, etc.
Just one example of millions:
You type in "worms" and you get no difference in the results. You type in "buy worms"
and the tiny little businesses that have played by the rules and worked hard for years to build and exchange links are wiped out!
By the way, this is not my stuff at all; I'm just noticing tons of examples of this while trying to figure this mess out.

Look at the second site that would come up without this so-called filter and tell me that's not greed. GOOGLE SCROOGE. Just the word "buy" triggers this. It makes me sick.
Please do not remove this post; I want people to see a good example of exactly what this is, and this is a good one. I hope there are investigations into this, for the sake of people like that. Millions of pages like this: spammers, or Google money targets? You tell me. I'm OK; I have other business.

LateNight

8:05 am on Dec 4, 2003 (gmt 0)

10+ Year Member



Perfect. They hosed tiny businesses selling compost worms.
I guess Google is penalizing the notorious spammers of the compost industry. I hope the worm ranch does not have a mortgage.

[edited by: LateNight at 8:07 am (utc) on Dec. 4, 2003]

More Traffic Please

8:06 am on Dec 4, 2003 (gmt 0)

10+ Year Member



I read the Hilltop paper about a week ago for the first time and found it interesting, but I just don't think Google is using that technology in its algo. I believe the paper was written in 1999. As I read it, I could not help but think about how the interconnectivity of the Web has changed since then. I'm talking about the havoc that reciprocal link pages between sites of a similar nature, and keyword-rich anchor text, would cause in Hilltop today. It seems to me that this technology could easily mistake link pages for expert pages if the link pages had some external links pointing at them.

I also wonder how fast it would be, after reading this quote:

In response to a query, we first compute a list of the most relevant experts on the query topic. Then, we identify relevant links within the selected set of experts, and follow them to identify target web pages. The targets are then ranked according to the number and relevance of non-affiliated experts that point to them.
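That quoted ranking step can be sketched in a few lines. This is a toy model only; the real paper's expert selection and affiliation detection are far more involved, and every name and score below is invented for illustration:

```python
def hilltop_rank(experts, affiliation):
    """Toy sketch of the Hilltop target-ranking step quoted above.

    `experts` maps an expert page to (relevance_score, [target pages]).
    `affiliation` maps each expert to an organization; per the quote,
    a target is credited only by NON-affiliated experts, so each
    organization counts at most once per target.
    """
    scores = {}
    credited = {}  # target -> set of orgs already counted for it
    for expert, (relevance, targets) in experts.items():
        org = affiliation[expert]
        for target in targets:
            orgs = credited.setdefault(target, set())
            if org not in orgs:  # skip affiliated duplicates
                orgs.add(org)
                scores[target] = scores.get(target, 0.0) + relevance
    return sorted(scores.items(), key=lambda kv: -kv[1])

# dir1 and dir2 belong to the same org, so siteA gets credit only once.
experts = {
    "dir1": (2.0, ["siteA", "siteB"]),
    "dir2": (1.0, ["siteA"]),
    "dir3": (1.5, ["siteB"]),
}
affiliation = {"dir1": "orgX", "dir2": "orgX", "dir3": "orgY"}
print(hilltop_rank(experts, affiliation))
```

The non-affiliation rule is the interesting part: it is exactly what today's heavily interlinked reciprocal-link networks would stress, since affiliation is hard to detect from link structure alone.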

Miop

8:30 am on Dec 4, 2003 (gmt 0)

10+ Year Member



In my business it *is* link pages that are coming up before the sites. Pages and pages of them...

Jazzy

8:35 am on Dec 4, 2003 (gmt 0)

10+ Year Member



Could someone please tell me if these results are going to hit Yahoo? Also, I know from here that Yahoo is moving to Inktomi eventually. Is that paid inclusion? Can someone direct me on how I can get listed there?

steveb

8:43 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"It seems to me that this technology could easily mistake link pages as expert pages if the link pages had some external links pointing at them."

This is certainly occurring now. High-quality link pages, like the Yahoo/ODP/Google directory pages, are ranking well, while trashy links pages are ranking relatively poorly, but still often ranking where they probably should never be ranking at all.

Whether or not that is what is going on, Hilltop closely approximates the status quo. Even lines like the dated "The most virulent of spam techniques involves deliberately returning someone else's popular page to search engine robots instead of the actual page, to steal their traffic" suggest targeting duplicate content.

Update Hilltop to the link exchange reality of today's web, and you have a great concept for ranking sites.

Powdork

8:44 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Could someone please tell me if these results are going to hit Yahoo? Also, I know from here that Yahoo is moving to Inktomi eventually. Is that paid inclusion? Can someone direct me on how I can get listed there?

Do webmasterworld a favor and go to positiontech by clicking on the link at the top of the page when it's there.

BTW, it seems like you (WW) got caught with your pants down a little bit during Dominic in terms of handling the volume. Not so this time around. Great job, all. :)

Miop

8:50 am on Dec 4, 2003 (gmt 0)

10+ Year Member



Right, so if they are looking for an info 'hub', how should I place outbound links from my e-commerce site? Relevant ones on relevant pages? I used to have it like this (made sense to me), and then read somewhere that you could be penalised for having outbound links anywhere other than on a separate links page (linked to from each page - this is what I have now).
Any advice please?

Powdork

9:02 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You would never be penalised for outbound links. What they may have been referring to is that the number of links leaving a page reduces the amount of PR that the page can transfer back to the rest of your site through your internal navigation. A simplified example:
Your navigation consists of 20 links from a page, all to pages within your site. You are sending all your PR back to your site.
You add 5 links to the page that leave your site. Now your internal navigation is only transferring 80% (20/25) of the PR back to your site.

And looking at it this way is focusing too much on PR, IMO.
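The arithmetic above, in sketch form, using the common simplification that each outgoing link gets an equal share of a page's passable PR (the real PageRank formula also involves a damping factor, omitted here):

```python
def internal_pr_fraction(internal_links, external_links):
    """Fraction of a page's passable PR that flows back into the site,
    under the simple model that every outgoing link on the page
    receives an equal share."""
    total = internal_links + external_links
    return internal_links / total if total else 0.0

print(internal_pr_fraction(20, 0))  # 1.0: all PR stays internal
print(internal_pr_fraction(20, 5))  # 0.8: the 20/25 case from the post
```

So adding a handful of external links dilutes internal PR flow slightly; it is not a penalty, just a smaller share per link.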

mat

9:09 am on Dec 4, 2003 (gmt 0)

10+ Year Member



Do webmasterworld a favor and go to positiontech by clicking on the link at the top of the page when its there.

powdork - you might want to take a moment to actually see where that link (and indeed all links in that corner) leads. Not what you think.

More Traffic Please

9:16 am on Dec 4, 2003 (gmt 0)

10+ Year Member



This is certainly occurring now. High-quality link pages, like the Yahoo/ODP/Google directory pages, are ranking well, while trashy links pages are ranking relatively poorly, but still often ranking where they probably should never be ranking at all.

If I understand Hilltop correctly, the idea is not to return expert pages (directory pages) to the surfer; it is only supposed to use them in the first step of producing relevant results. IMO, if Hilltop were working correctly, there would be no reason for it to return directory sites. Even if it mistook the link exchange pages of today for expert sites, it would probably give a boost to the sites that participate in link exchanges, not return directory listings.

Powdork

9:28 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Oops, I was wondering what happened to the 'banner free' thing. I guess nothing happened to it. :)

What I meant to say, but was originally too lazy to, was: go here [webmasterworld.com].

[edited by: Powdork at 9:31 am (utc) on Dec. 4, 2003]

marin

9:31 am on Dec 4, 2003 (gmt 0)

10+ Year Member



<you would never be penalised for outbound links.>

All of the top 10 in my area are full of absolute links - even to internal pages; could this be a strong factor?

steveb

11:47 am on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Any advice please?"

In the future, don't just listen to what you "read somewhere". Linking to other sites is sensible, user-friendly, and a sign of a site that is confident in its own content (except in the case of garbage anchor-text/link-farm stuff that doesn't exist for any reason other than to help other sites).

Miop

11:49 am on Dec 4, 2003 (gmt 0)

10+ Year Member



I am happy to link to other sites, and do so - I'm just not sure where to place the links!

Hissingsid

12:39 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



you would never be penalised for outbound links.

This is what Google actually says on this

"In particular, avoid links to web spammers or "bad neighborhoods" on the web as your own ranking may be affected adversely by those links."

Quoted from [google.com...]

Sid

PS I know not to believe everything I read just thought it was appropriate to quote it.

Please Be Gentle

1:25 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Hi folks
I don't know where to put this, so please feel free to move it.
I haven't been searching online much recently, and I do not have a site, so I cannot comment on whether the results have degraded or not. But I did notice today that a primetime BBC business programme called "Working Lunch" dedicated its first segment to the negative impact of the Google update in November. They gave a number of examples of companies dropping off the radar and of odd results. For instance, when they entered a search term of "shelving," the first few results were sites comparing prices for shelving, and I think the eighth site was for the University of Iowa. They suggested that, to get more sensible results, the user should enter "shelving -waffle," and sure enough they got sites related to shelving. I don't know if anyone is interested in this or if it really means anything, but it caught my eye, as it was lunchtime BBC and you rarely see anything like this.
Kind Regards
PBG

nutsandbolts

1:32 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, I love that show. It's online here [news.bbc.co.uk] - the video will no doubt work in a few hours, as it's only just off air.

Stefan

1:35 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Miop, IMHO the outgoing links should occur naturally throughout the site, which often means in relevant text sections. If anything, putting them all on one links page might look more reciprocal and less valuable.

On a side note, I just noticed that Google has picked up changes to my pages and site this morning that it found a week ago... the update rolls on.

This 526-message thread spans 18 pages.