
What The Early Research is Showing – Florida Update 2003

An analysis and aggregation of current post-Florida-update best practices

         

ryanallis1

9:14 am on Dec 3, 2003 (gmt 0)



I would welcome any comments and discussion on the following article (all URLs and specific keywords have been removed) that analyzes the current state of the Google update and suggests certain steps to take for both webmasters and Google...

Thank you,
Ryan Allis

On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every two months or so, this 'Dance' seems more like a drunken Mexican salsa than its usual conservative fox-trot.

Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. One could understand dropping down a few positions, but since November 15 the sites that previously held these rankings are nowhere to be found in the top 10,000 results. Such radical repositioning has left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding it necessary to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.

What the Early Research is Showing

Early research suggests that Google has put in place what the industry has quickly termed an 'Over Optimization Penalty' (OOP), which takes into account incoming link text and on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense, Google is penalizing sites for being optimized for the search engines--without any forewarning of a change in policy.

Here is what else we know:

- The OOP is keyword specific, not site specific. Google has applied the OOP only to certain selected keywords.

- Certain highly competitive keywords have lost many of their former listings.

How to Know if Your Site Has Been Penalized

There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you've likely been hit. Here are ways to be sure:

1. Go to google.com. Type in any search term you recall being well-ranked for (check your site logs to see which terms sent you search engine traffic). If your site is nowhere to be found, it has likely been penalized.

2. Type in the search term you suspect being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This appears to bypass the OOP, showing roughly what your results would be without it.

3. Or, simply go to www.**** to have this automated for you. Just type in the search term and quickly see what the search engine results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained popularity, becoming one of the 5,000 most visited web sites on the Internet in a matter of days.
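For those who want to script the comparison themselves, here is a minimal Python sketch that builds the two query URLs. The gibberish token and the URL format are illustrative only; any nonsense string is reported to work.

```python
from urllib.parse import urlencode

def filter_check_urls(term, gibberish="dkjsahfdsaf"):
    """Build two Google query URLs: the normal search, and the same
    search with an excluded nonsense term appended, which reportedly
    bypasses the new filter so the two result sets can be compared."""
    base = "http://www.google.com/search?"
    normal = base + urlencode({"q": term})
    unfiltered = base + urlencode({"q": f"{term} -{gibberish}"})
    return normal, unfiltered

normal, unfiltered = filter_check_urls("florida web designer")
print(normal)
print(unfiltered)
```

Open both URLs in a browser and compare where your site lands in each result set.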

The Basics of SEO Redefined. Should One De-Optimize?

Search engine optimization consultants such as myself have known for years that the basics of SEO are:

- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links

Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to properly build links.

So if you have been affected, what can you do? Should one de-optimize one's site, or wait it out? Should one create one site for Google and one for the 'normal engines'? Is this a case of a filter being turned on too tight, which Google will fix in a matter of days, or something much more?

These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:

1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text or the keyphrase is in a different order than the order you are penalized for.

2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on that page. Keep it under five occurrences per 100 words on the page.

3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."

It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
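To check where a page stands against the rough "five per 100 words" threshold suggested above, a simple density counter like this Python sketch can help. The threshold itself is only this article's rule of thumb, not a known Google constant.

```python
import re

def keyword_density(text, phrase):
    """Occurrences of `phrase` per 100 words of `text`, case-insensitive.
    Works for single keywords and multi-word keyphrases alike."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window of n words across the page, counting exact-order matches.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits / max(len(words), 1)
```

For example, a 100-word page containing "widgets" three times scores 3.0, which would be under the suggested limit; note that reordered variants of a keyphrase (per step 3 above) are not counted as exact matches.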

Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.

A second theory, which has gained credence in the past days within the industry, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.

Perhaps both of these reasons came into play. Perhaps Google execs thought they could

1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues and
4) because of better results and more revenue have a better chance at a successful IPO.

Sadly, for Google, this plan had a fatal flaw.

What Google Should Do

While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:

1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;

2. Reduce the weight of OOP;

3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and

4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.

When this recent update broke on November 15, webmasters flocked by the thousands to industry forums such as webmasterworld.com. The update was quickly titled "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, and that everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.

If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With its recent acquisitions of Inktomi, Alltheweb/FAST, and Altavista, Yahoo will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.

MikeNoLastName

9:15 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



By Gosh, I think MoreTrafficPlease has touched on something. I was noticing something for which this is a great explanation. We have a number of inter-related domains (historically they came to be simply because we were overloading a single server, so we had to create new domains on separate servers to accommodate the traffic), and along the way a page or two on each had been optimized for terms like "large blue widget", "small green widget", etc. Many of them were PR5 and in the top 10 under their highly competitive (and mostly commercial) keywords. We also had a primary domain which had one heavily externally backlinked page (not the home page) ranked in the top 20 under just plain "widgets" (a 5+ million result term) with a PR6. This one page has not moved in the rankings for "widgets" in a year. The HOME page on the SAME domain is only a PR5.
With Florida, almost ALL those 3 word (small green widget) pages disappeared but were replaced, often very nearly to the same rank or a couple up or down, by the ever-popular PR6 "widgets" page as long as "small green" was mentioned somewhere on the page at least once. In most cases the "mention" turned out to be a link to one of the other pages, but not always.
However, we're also seeing quite a few high ranks on the PR5 HOME page mentioned above which seldom saw the light of day before, which further makes me think they may have slightly increased the weighting for root domain pages as well.
In one related case we had a relatively new page (previously PR4 and now PR5) optimized for "medium brown widgets". It links a bunch of PR3 pages for "size 1 medium brown widgets", "size 2 ... mbw", etc. ALL which link right back. It was floating around 19-20 rank. After Florida, that same ever-popular PR6 page mentioned 3 paragraphs back, is now #4 for "medium brown widgets" and another (never before popular) PR5 page on the same domain which merely mentions and links the relatively new page is indented at #5. Meanwhile the relatively new "OPTIMIZED" "medium brown widget" page has dropped to about #25.
I'm definitely looking more at a "site PR" theory, which I guess could also go with the "authoritative site" idea, since it seems more "authoritative" sites which just "mention" or "link" to certain compound terms are scoring way above ones optimized for the terms themselves.
Either that, or Google is now calculating rank for a compound keyword term as the weighted average of the results for each term, rather than as a combination (i.e. (PR(kw1) + PR(kw2) + PR(kw3))/3, rather than PR(kw1+kw2+kw3)).
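Mike's two guesses can be contrasted in a toy Python sketch. All scores here are made-up illustrative numbers, not real PR values, and neither scheme is known to be what Google actually does.

```python
def phrase_score_combined(page_scores, phrase):
    """Score the exact compound phrase as a single unit: PR(kw1+kw2+kw3)."""
    return page_scores.get(phrase, 0.0)

def phrase_score_averaged(page_scores, phrase):
    """Average the per-term scores: (PR(kw1) + PR(kw2) + PR(kw3)) / 3."""
    terms = phrase.split()
    return sum(page_scores.get(t, 0.0) for t in terms) / len(terms)

# Hypothetical pages: one optimized for the exact phrase, and one
# "authority" page that merely mentions each word somewhere.
optimized = {"medium brown widgets": 8.0, "medium": 1.0, "brown": 1.0, "widgets": 2.0}
authority = {"medium brown widgets": 0.5, "medium": 6.0, "brown": 5.0, "widgets": 9.0}
phrase = "medium brown widgets"
```

Under the combined scheme the optimized page wins (8.0 vs 0.5); under the averaged scheme the authority page wins (about 6.7 vs 1.3), which would match the post-Florida reshuffling described above.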

Mike

davidpbrown

9:22 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



superscript
The question is though, how many variations on the hypothetical stem 'widget' are now in place? - are there sufficient versions to explain the massive drop in top ranking sites we've observed?

I think stemming might well confuse the issue a little more than that. There might be, for instance, new weightings involving not only the keyword but factoring in how (ir)relevant a site is to the ~=words.

rfgdxm1

9:23 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Since it seems more "authoritative" sites which just "mention" or "link" to certain compound terms are scoring way above ones optimized for the terms themselves.

Perhaps the PR knob on the algo has been turned way up.

jim_w

9:33 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Perhaps the PR knob on the algo has been turned way up.<<

Well we are a PR5, I would think if that was it we would rank above PR3, but we are not.

More Traffic Please

10:08 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



rfgdxm1 said

>Since it seems more "authoritative" sites which just "mention" or "link" to certain compound terms are scoring way above ones optimized for the terms themselves.

Perhaps the PR knob on the algo has been turned way up.

The problem with just saying the PR knob has been turned up is that very low PR pages in sites that have an index with a very high PR seem to be doing extremely well with only one occurrence of a KW phrase on the page. This is in SERPS with higher PR pages for that phrase ranked below them. If we introduce the idea of a SiteRank as I mentioned in message #75, it could account for this quite well. It would not answer all questions by any means, but I think it's worth looking at as a possible piece of the puzzle.

Goanna1

10:13 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



>i found the major competitor in my category to have a KW density of 38%

Yes, I have seen affected sites with KW densities of less than 1% and greater than 10%. KW density is not a factor IMHO.

steveb

10:15 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"The reason it did good before, was the anchor text"

Which means the site could easily be dropped from the top 1000. Anchor text means next to nothing as a measure of quality. Sites that ranked highly in the anchor text algo strictly because of their anchor text will go down now. The algo now isn't 100% anchor text. If a site has nothing else going for it, it will go down.

Of course the value of PR has gone up... basically *everything* has gone up except anchor text (everything except maybe page title). Google now places much more value on sites with multiple "ranking streams" (think revenue streams). Much more than before, sites now need good linking coming and going, good titles, good headers, good on-page keyword use, etc.

The Google suggestions for making a good site are much more accurate now than they were before. They certainly aren't getting everything right, but the answer about "best practices" for optimizing is clear: play by their rules and optimize everything you can think of.

SEO just got a lot more important. Any idjit could make tons of anchor text. Now there is so much more to optimize... and of course it is all the stuff people should have been focusing on in the first place.
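The "multiple ranking streams" idea can be sketched as a weighted sum of independent signals. The signal names and weights below are pure guesses for illustration, not anything Google has published.

```python
def ranking_score(signals, weights):
    """Toy multi-stream score: a weighted sum of independent signals,
    each normalized to the range 0..1."""
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

# Hypothetical weights, with anchor text sharply devalued as the thread suggests.
weights = {"anchor_text": 0.1, "pagerank": 0.3,
           "title": 0.2, "headers": 0.2, "body_kw": 0.2}

anchor_only = {"anchor_text": 1.0}           # ranked purely on anchor text
well_rounded = {"anchor_text": 0.3, "pagerank": 0.7,
                "title": 0.8, "headers": 0.7, "body_kw": 0.6}
```

With anchor text devalued, a page strong on only that one stream scores far below a page that is merely decent across all of them, which is the dynamic being described here.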

jim_w

10:24 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here is what I am afraid of. If your site and company name is ‘used-cars-r-us’ and that is the anchor text in all inbounds, and then G decides that ‘used cars’ are the targeted KWs, you’re $crwed, and I mean without a kiss.

Kirby

10:30 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Here is what I am afraid of. If your site and company name is ‘used-cars-r-us’ and that is the anchor text in all inbounds, and then G decides that ‘used cars’ are the targeted KWs, you’re $crwed, and I mean without a kiss.

This is thinking in terms of a penalty, which I doubt Google would ever apply to anchor text.

More probable is that the value of anchor text is diminished - no added benefit for anchor text. Not $crwd, just ignored. ;)

BallochBD

10:30 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



I just noticed something which I think may be significant. While doing some research I noticed that when searching for my KW, most of the results were irrelevant to the point of being absolutely cr*p! I then noticed that the AdWords were all highly relevant (of course). I was about to click on one of the AdWords when the penny dropped. Does this not reinforce the theory that this whole exercise is all about boosting AdWords? If you cannot get decent results on a sensible search (other than in AdWords), will this not push people away from the white space and toward the AdWords? Subliminal or what?

SuperSport

10:34 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



Back to the original OOP issue, especially related to incoming links with kw text in them. I just checked incoming links that Google is recognizing currently and all of my external incoming links that had kw text in them are no longer returning results.

I have no reciprocal links whatsoever, and my link page only hooks up to huge information or commercial sites unrelated to my industry.

I also concur that keyword stuffing isn't the issue, at least not for all sites. A somewhat-competitor that passes itself off as a directory (an expensive paid-inclusion directory) is stuffed to the gills with KWs. An example from their index page (KP = keyword phrase):

"KP1 text KP2 text.
text text text text text text KP1, KP2, KP3, text KP4 text text text KP2 text text , text text text KP4 text text. text text text text text KP1 text text text KP3."

This site was, and is still dominating the SERPS after Florida. Another competitor, which tries to pass itself off as a forum has benefited as well, despite some quasi hidden text way at the bottom of the index page.

I have also found like others, that index pages seem to be getting ignored much more frequently, while deeper pages are being brought out increasingly.

guynouk

10:37 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



with reference to my message #69 - can anybody work out what the threshold might be?

B_walker

10:38 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



Speaking only for one site I manage, what I'm seeing is a heartbreaking drop in search position to almost oblivion over the past week.

From checking the keywords that were regularly placing our main page on the first Google page for the last 6 months, my conclusion at the moment is that the sites coming up on the first page are there due to PageRank.

These sites receive a lot of traffic for those keywords, and they are what Google is placing high. While I can understand why G would be doing this, I think G is too clever by half. Yes, it appears that the purely SEO sites are no longer on the top pages, but pages that happen to be excellent in terms of content, by virtue of also being small and infrequently visited, are essentially gone.

I'd like to think that our site (non-commercial, informational) is a good site. Relative to the keywords that are ours the content is good; however, it's not a popular site due to being a generalist site about a specific area (it just happens to be the way we understand the subject area).

For people seeking the kind of information the site provides, it's useful and enjoyable. But because the larger generalist sites also cover the subject area, and in turn have a high PR due to the wide range of information they offer across a broader area that our keywords also fall into, these sites have now jumped to the head of the list, while other worthy but poorly PageRanked sites are now so low that they don't exist in any practical sense.

I think this is a disservice to searchers who are looking for sites such as ours. Yes, the super-optimized sites designed to receive a top listing despite poor content have now been beaten and shown that Google is boss, but it seems to us they've thrown the baby out with the bath water.

For sites like ours that are not there to generate income, and in fact lose money, but which receive "payment" in the satisfaction of communicating with others, if PR is indeed the make-or-break criterion now, the sites will have to be redesigned precisely to try to optimize for the new Google. This is exactly what non-profits don't want or need. However, if the above is correct, it's back to the drawing board or bye-bye.

We would have preferred to spend time on content and not on "staying alive" vis-a-vis Google. I actually think that what we'd do to get back in the game is rework our content to make it very specific to one area, but there seems to be something strange about that concept. Our content would be designed to satisfy Google's black box. Do we really need this? Right now, we're really disappointed. Lots of work seems to have gone down the drain. Thanks Goog.

jim_w

10:40 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Not $crwd, just ignored<<

It’s semantics. If you’re ignored for 2 KWs that make up 50% or more of your hits, then believe me, if your food intake depends on those hits, you are $crwd

MetropolisRobot

10:44 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



PageRank is not the sole determining factor either. In my category there are PR5 sites on page 1, and we are a PR5 site that has been blown outta the water, so make of that what you will.

Miop

10:45 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



Pennies are dropping...some pages seem to be high ranking with no anchor text back links at all...just the keyword on the same page as a hyperlink to the ranking site...the keyword is in the title...it's going to click very soon I hope...
if I didn't have a terrible cold, I'd be thinking about something much more interesting!

SuperSport

10:48 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



I agree that PR doesn't seem to be the issue either. About 90% of our pages are PR6, with the other 10% being PR5. Competitors and other sites that are now there are no higher in PR, and some are even lower.

MetropolisRobot

10:51 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



For those just joining, a small summary:

(1) Seems that KW density "alone" is not the cause, as we have various reports of high-% KW density sites remaining on top

(2) Inbound link penalties. Seems that this would be open to abuse by parties out to sabotage good sites

(3) PR filters. Looks like this "alone" is a non-starter, as some people have had their moderate-PR sites blown out while other sites with moderate PR have been promoted.

rfgdxm1

10:52 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>This is in SERPS with higher PR pages for that phrase ranked below them. If we introduce the idea of a SiteRank as I mentioned in message #75, it could account for this quite well.

This is an interesting theory. Note also that the concept of SiteRank as a measure of authority could go beyond just home page PR. PR, now over half a decade old, was Google's original attempt at measuring the "authority" of a page. Google may have figured out other ways to establish the authoritativeness of a site: say, a lot of links from .edu domains. The core idea behind Google was that off-page factors which established a page as an authority would be considered in ranking. And to the public, a page from CNN or the BBC coming up high in a SERP will likely scream "credible" more than one of my amateur sites. Even if this SiteRank is based on home page PR, it's going to be rather difficult for a spammer to get the home page of his sites to a PR9 like cnn.com. It is MUCH easier to manipulate inbound anchor text than it is to get a huge PR for the home page of a site.

frup

10:57 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



I definitely think there is site rank at work. In one of the categories I follow it is the top-ranked sites that are coming up for many many searches.

jim_w

11:01 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yea, but what about sites that G doesn't have in their index? For example, my NASA site inbound link is not in the G database. I just looked. And there are lots of university and .gov pages that G does not have indexed. Are they so bold that they think they have every page in the world indexed correctly? I know they don't, and can provide examples.

MetropolisRobot

11:05 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



they (google) do not have to have all the sites in the world ranked correctly.

Google is a business and must act like a business. If the search results are good with the customers and customers return to search again and again then all is well in google-land.

Google would only be "failing" if customers WERE not getting what they wanted. And to be quite honest, (a) how would they know, and (b) who are we to say they are not getting the right results?

caveman

11:06 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



More probable is that the value of anchor text is diminished - no added benefit for anchor text. Not $crwd, just ignored. ;)

This assumes no filtering going on (I think).

But an abundance of identical or nearly identical anchor text, combined with a series of high scores for other SEO factors, might still be part of the index page problem.

If a site is all about green widgets, and hundreds of pages are all about green widgets of all sizes and kinds, and all those pages are showing in the SERP's, but the homepage is sunk (even though it has quite a few inbound links from related sites, good kw density, title and H1, etc.), just discounting the links would still leave the homepage as the homepage of a site all about green widgets, and therefore very relevant to a search for green widgets...

A drop from page one to page six as a result of discounting those links, maybe. A drop from page one to page 168...unlikely.

jim_w

11:12 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>they (google) do not have to have all the sites in the world ranked correctly<<

No they don't. I can give you a NASA link right now from NASA's hq, at least that is part of the url, that is not in G.

Kirby

11:14 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you start with SiteRank as the major algo tweak, followed closely by a devaluation of anchor text, it can explain why a lot of relevant results got initially pummeled.

The problem I had with keyword penalties was the exceptions I kept finding. I attributed the exceptions to Google mistaking these sites for "authority" sites, even though they really aren't. SiteRank provides a plausible explanation for their survival.

More Traffic Please

11:16 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



When thinking about the possibility of SiteRank, not only would a low-PR page from a high-SiteRank site show up well in the SERPs for a phrase on it, but a link from this same low-PR page could be much more important than an incoming link from a higher-PR page on a less authoritative site. The thing is, we will only see the toolbar PR of the page that receives the incoming link. This could account for a great deal of the confusion.
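The SiteRank idea being discussed here could be sketched as blending a page's own PR with an aggregate over its whole site. The blend factor and the simple mean are assumptions for illustration only; nobody outside Google knows the real formula, or whether SiteRank exists at all.

```python
def effective_rank(page_pr, site_pages_pr, blend=0.5):
    """Toy SiteRank blend: mix a page's own PR with the mean PR of
    every page on its site. blend=0 is pure page PR; blend=1 is pure
    site-level authority."""
    site_rank = sum(site_pages_pr) / len(site_pages_pr)
    return (1 - blend) * page_pr + blend * site_rank

# A PR3 page on a strong site can outweigh a PR5 page on a weak site,
# matching the SERP oddities described in the thread.
low_pr_strong_site = effective_rank(3.0, [8.0, 7.0, 9.0, 8.0])
high_pr_weak_site = effective_rank(5.0, [3.0, 2.0, 3.0])
```

It also illustrates the point about toolbar PR above: the toolbar shows only `page_pr`, so two links from pages with identical toolbar PR could carry very different weight if their sites differ.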

Kirby

11:18 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



caveman, I am assuming that the devaluation of anchor text is just one of many tweaks that were made. One trap we seem to fall into is to look for that one silver bullet that got us when in fact it may have been a load of buckshot.

More Traffic Please

11:25 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



"One trap we seem to fall into is to look for that one silver bullet that got us when in fact it may have been a load of buckshot."

Well said

rfgdxm1

11:26 pm on Dec 3, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>I attributed the exceptions to Google mistaking these sites for "authority" sites, even though they really aren't. SiteRank provides a plausible explanation for their survival.

With the original Google algo, PR was the *sole* measure of authority. They saw this as the democratic voting of the web. SiteRank would just be a refinement of this, as the original papers thought of PR as a page-by-page phenomenon. If Google is implementing SiteRank, the aggregate PR of a site may be just one element of it. For example, lots of .edu and .gov links to a site could also be seen as indicative of its authoritativeness.

Miop

11:29 pm on Dec 3, 2003 (gmt 0)

10+ Year Member



The only things I can see that look certain at the moment are:
1) Inbound links are pages with a hyperlink to the ranking site on them - there often isn't even any anchor text.
I have checked my backlinks on Google - I haven't checked all the internal links on my site (cos I've been fiddling about with them, so I'll have to check the cached links...) - all the ones with anchor text seem to have gone (when the backlinks were updated and PR went up).
2) The target page which is ranked sometimes doesn't even have the KW on it - it could only have been found via a hyperlink - with no anchor text, the site could only be considered relevant with regard to the page which it is linked from and the text thereon.

I'm going to check some of my former links to see if they have anchor text, if I can find them.

This 526-message thread spans 18 pages.