
What's the fuzz all about?

Sites that get boosted after Florida

         

DrDoc

6:28 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The last few weeks, I swear over half the posts across Webmaster World have been in the neighborhood of "Google sucks after the 'Florida' update". Is that true? Well, I did some research, and found a very interesting trend. Let's look at some of the different page categories and compare before vs. after 'Florida'.


Category One: Highly optimized sites

Before 'Florida', these sites usually had very high page rankings. They were all fighting over the top SERPs. Many were tweaked and tweaked to achieve the ultimate SE (in this case meaning 'Google') ranking. These pages ranked well on other search engines as well.

After 'Florida', however, these sites took a dive on Google. Now, the interesting thing is that they still rank well on most non-Google powered SEs!

What does that tell us? Is Google penalizing SEO? I wouldn't say that. It looks, however, like highly optimized sites -- sites targeting Google's algo -- no longer benefit from that tweaking. In other words, sites optimized for Google seem to have lost their advantage.

To better understand this, let's take a look at the next category.

Category Two: Plain optimized sites

First, what's the difference between a 'highly optimized' and a just plain 'optimized' site? Well, there are some basic steps to optimization that should be part of all site development, such as proper use of headings, paragraphs, and other HTML elements to mark up the content for what it is -- 'the right tool for the right job' -- along with clean, search engine friendly code.

Then, there are sites that go beyond this basic optimization -- sites tweaked specifically for Google's algo. These are the highly optimized sites, the ones that took a noticeable dive post-'Florida'.

What does this tell us about the post-'Florida' Google, and about Google's new algo? We'll come back to that after taking a look at the last category.

Category Three: Non-optimized sites

Sites in this category are usually home-brewed personal Web sites. They are more often than not built using one of the many WYSIWYG editors out there. But what sets these sites apart from the rest is their purpose: usually, whoever built them had no intention of getting high rankings. Most likely he/she is not a professional, or has a very limited user base, and would not benefit from SEO. These sites were pretty much unaffected by the 'Florida' update, as expected, and will therefore not be included in the analysis that follows.

Analysis: Trying to sort out the WHYs

Let us first look at a quick summary of the difference between search engine results before and after 'Florida'.

Before 'Florida'

Category One:
Ranks well on all search engines
Ranks extremely well on Google

Category Two:
Ranks well on non-Google search engines
Still ranks well on Google, but is not fighting over the top SERPs for more general keywords

After 'Florida'

Category One:
Ranks well on all search engines
Google results seem to be worse

Category Two:
Ranks well on all search engines
Google results seem to be better

What does this tell us? For some reason, there is no longer a difference between a 'highly optimized' site and a plain 'optimized' one -- they fight over the same SERPs, and are ranked on the same level. What did Google change? Well, there has been a lot of discussion on the board about whether you get penalized for what your incoming links look like, or whether Google penalizes you for this and that. Is that really it? Or is there more to it?

Google's FR (Fair Ranking) Algo

Imagine that there never were any 'highly optimized' sites, that no sites were targeting Google more than other engines... Everyone would just optimize their sites using basic techniques like proper use of HTML elements, and focus on content instead of keyword placement... What would the SERPs look like? I believe, contrary to most people's belief, that we would have the same results we see today.

To be frank and to the point -- I think there was a bug in Google's algo before. Now, Google has yet again tweaked their algo. Why? To return better and more accurate search results. And that's why some sites appear to have taken a dive, or as some would put it, why 'some sites have been penalized'. While that may be true for some sites, if your site still appears in Google's index at all, it is not penalized. Instead, it has lost its unfair advantage over other sites.

Google has always proclaimed that to rank well, you should do the following:
• mark up your HTML content using the HTML tags (headings, paragraphs, etc) the way they were intended
• use clean code, not too markup heavy
• focus on content, content, and more content
• write for humans, not spiders
• use descriptive page titles, file names, and link texts
• if using dynamic pages, make sure they can be spidered
• focus on the content some more
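
To illustrate the 'mark it up the way it was intended' point, here is a minimal sketch of a page that follows those guidelines. The widget-guide topic, file names, and link text are hypothetical examples, not taken from any real site:

```html
<!-- A plain, content-first page: headings used as headings,
     paragraphs as paragraphs, and a descriptive title,
     file name, and link text. -->
<html>
<head>
  <title>Caring for Blue Widgets: A Maintenance Guide</title>
</head>
<body>
  <h1>Caring for Blue Widgets</h1>
  <p>Blue widgets last longest when they are cleaned regularly
     and stored away from direct sunlight...</p>

  <h2>Cleaning Your Widget</h2>
  <p>Use a soft cloth and mild soap...</p>

  <!-- Descriptive link text, not "click here" -->
  <p>See also our <a href="widget-storage-tips.html">widget
     storage tips</a>.</p>
</body>
</html>
```

Nothing in that sketch targets any particular engine's algo; it is simply the content, marked up for what it is.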

For some reason, that seems to be more true today than yesterday. Pages that do all of the above -- nothing more, nothing less -- rank well. They ranked well before 'Florida', and they still do. And, they probably always will.

Is the competition gone? No, it's tighter than ever. But your tweaking skills will no longer determine your ranking. Instead, your content will.

willybfriendly

9:17 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Good post overall. But, as with everything related to Florida, there are major exceptions.

In my niche there are a large number of category 3 sites that have risen to the top 20 across a broad range of search terms. Also, there are at least two sites whose entire content is "This business is no longer in business".

So, the new algo can reward non-optimized, home-brewed sites, and content is not always king.

These anomalies intrigue me, as it would seem that they will do a better job of illuminating important aspects of the algo than the relevant sites that show up.

WBF

notawebmaster

9:24 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Regarding the option of placing the word "buy" in a KW search: it makes no difference for many searches. However, when I perform a KW1, KW2, KW3 search and slightly misspell KW3, I get what I consider more valid results. I realize that "valid" or "relevant" can be very subjective but I refuse to believe that someone doing a search for "specific" "type" "software" is really looking for directories that may or may not have a similar "specific" "type" "software".

Also, if I use the word "buy" before the same KW string, no real changes to the SERPs are apparent. And since when did the word "buy" become a prerequisite when looking for a particular product? What if I am not looking to buy at the moment but am in the information-gathering stage -- should I put the term "looking for info about" before my search string? This is ridiculous. If Google knows that KW1, KW2, KW3 searches are generally used to search for a product, as evidenced by the fact that it is able to display relevant AdWords, why not display results that include companies that supply that product?

Look, if I search for X on ten search engines and nine out of ten return results that I am happy with, while the lone holdout displays results that make me think it's 1997, I gotta think that the holdout is having problems. Unless, in the new SE paradigm, paid-inclusion spammy directories are indeed "authority sites".

</rant>

(edit- I'm sorry I have nothing to offer other than frustration.)

deanril

9:48 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Well, when I did "buy golf clubs" vs. "golf clubs", I liked the results -- I found places to buy them. Not quite "golf clubs -showmetherealresults", but a lot better than just "golf clubs".

So I'm inserting "buy" and "sell" all over my sites on the pages that got dinged -- every little bit helps.

mikeD

9:50 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



Still seeing big changes in the serps in the UK, anyone else?

drewls

10:01 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



> The man had said it all. Would you please leave the specifics/conspiracies to other florida-update-mess threads?

The man also posted it on what's known as a 'message board' or 'forum', where others would be free to comment on it. If he didn't want input, he should have stapled it to his basement wall instead. :D

ogletree

10:10 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It is much more complicated than that. We can all give you examples where your theory falls apart. Most of what you describe is a symptom of some other part of the algo, not the algo itself. People seem to have a hard time understanding that what you are seeing may be a side effect and not the effect. If what you are saying is true, then why are people still number one for terms where a lot of SEO was done? I saw sites that had no SEO get dumped.

m2c1r

10:15 pm on Dec 5, 2003 (gmt 0)

10+ Year Member



I wish content was key. My "super blue widgets" site has 50 pages of content, and only one page selling our version of a "super blue widget". The next closest relevant site, commercial or otherwise, has fewer than 10 pages of content.

So, whether a searcher was looking for general information (we don't just have 50 pages of sales copy -- our articles are of a general nature on the overall subject) or to buy a super blue widget, we offered by far the most info. And yet post-Florida I am out of the top 100.

If content were the key, then I should still be in the top 10 somewhere, no? Or adding even more content should bring me back. But I don't see this happening, because whatever caused the drop in the first place surely wasn't just a comparison of the value of the content, or even the quantity.

So, how do I show Google I have the best "super blue widgets" site, if high relevance and lots of content aren't enough?

Kirby

10:27 pm on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Chndru, thank you for actually putting it to the test. You are right about those listings. They are relevant, but they are also very different from pre-Florida. Now look at page two and notice the Tennessee and Mississippi pages of the same directory. This is what I'm talking about as well. Directory listings now dominate real estate searches, and once you get past page 1, the results are not very good. So even if you have good relevant directories on page one, why the inclusion of so many worthless directories further on? Isn't it a bit odd?

deanril, again I'm not saying that the majority of Google's searches are of excellent quality. I used that one to illustrate the fact that they do exist, though. My point is that Google can deliver these good results without a bunch of directory links to more backpacking gear sites. So why are some searches now weighted toward directory-type sites, many of dubious relevance, while others look pre-Florida?

The top 3 contenders for the answer to this question are:
theming
stemming
money word penalty

Any other ideas to help explain the results, not just rate them?

Dave_Hawley

1:32 am on Dec 6, 2003 (gmt 0)



Great sensible down to earth post DrDoc, well done!

Unfortunately you will be ignored/abused by most as you have not made things complicated enough.

I like your observation on FrontPage sites! These types of sites are normally set up by someone who knows their *subject* well, but NOT SEO, Web design, HTML, Java, etc. They keep their pages to the point and have lots of text. They do not try sneaky tricks, and often subscribe to the KISS (Keep It Simple, Stupid) principle.

Dave

Kirby

6:05 am on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I like your observation on FrontPage sites! These types of sites are normally set up by someone who knows their *subject* well, but NOT SEO, Web design, HTML, Java, etc. They keep their pages to the point and have lots of text. They do not try sneaky tricks, and often subscribe to the KISS (Keep It Simple, Stupid) principle.

Sorry, wrong answer. Several FrontPage real estate sites that ranked well pre-Florida for several "city real estate" searches are now no longer in the top 250. In fact, for many of these searches, you will not find any sites of this kind in the top 100, period.

Please try again -- or can you simply acknowledge that for these searches, Google has decided that only directory-type sites are acceptable?

markus007

6:58 am on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I optimized a real estate site in October and it's got a ton of #1 rankings for competitive keywords now. Did the same thing to a lesser extent on another site and it completely vanished. There is no pattern so far that I can see...

ogletree

7:09 am on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



What he and others are saying, Dave, is that it is easy to prove your theory wrong.

Powdork

7:15 am on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Disclaimer! The following is my opinion and should not be taken any other way.

I think there is still some testing going on. For instance, while following the cross-linking clearly evident on the sites at the top of the SERPs for my locale, I took a visit to the Palm Springs hotel scene, as well as the New York hotel scene and others. Here are the pertinent things I saw. Palm Springs seems to be the type of location this helped; I liked the results better, although I am not as intimately familiar with those sites and SERPs as I am in my own area. The New York hotels search seemed to be rather unaffected (and spammy).

I think a generalization would be that this new algo helped many of the very competitive searches, and hurt those that were competitive but not overly spammy. When you remove a lot of affiliate sites based on nothing but a URL and Google, it probably improves the results. When you start messing with SERPs that have a limited number of relevant results (limited by the actual number of brick-and-mortar businesses), Joe Surfer will notice, because there is nothing relevant to the search to replace them.

And back to my original point: I think the reason we are hearing about tweaks, and about sites returning, is that G is flipping the switch on certain keyphrases to see what the general public's feedback is.

claus

2:15 pm on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The past three or four days I have seen movement in the SERPs. I reported it last night in this thread (#304, p. 21) [webmasterworld.com], only to wake up today and find another 7 pages of previous posts from other threads added to the thread.

These 7 pages of earlier posts were by and large already factored into the discussion on page 21, so they add only a few significant findings. I would recommend reading up to page 21 of that thread, as there are some valuable insights there -- and certainly some burying of myths (some of which reappear in the pages following p. 21). Then, let's continue the discussion on page 28 and onwards.

/claus

Kirby

4:54 pm on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



<edited by Kirby - moving to pg 28 as claus suggested>

ciml

5:31 pm on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> What's the fuzz all about?

Well DrDoc, a lot of rankings changed, and those who do badly at updates tend to make a lot of noise. This is nothing new. The fuss is greater because there was more change than normal.

The distinction you make between "Highly optimized sites" and "Plain optimized sites" makes a lot of sense. A lot of highly optimised sites have been affected, but I think we do need to go further than the 'no longer given a boost for high optimisation' idea.

As layer8 points out, these changes are too drastic for that. Also, you can toggle the filter off as synergy started to uncover in part 2 of the Florida threads.

allanp73, I suspect you'd be a happier webmaster if you worried less about how you can topple Google's position as the top search engine, convince the world not to buy AdWords, or force Google to reconsider its latest update.

Greater Google happiness comes in part from determining how Google's filters work, as you point out. Publicising circumvention methods would make a lot of webmasters happier (mostly spammers, as they tend to put a lot of emphasis on escaping penalties), but the best thing that you and I can do to make ourselves happier is work on our own sites.

kirby:
> ... or can you simply just acknowledge that for these searches, Google has decided that only directory type sites are acceptable?

Could it be that some kinds of directory sites have attributes that the new Google likes, or that they don't have the attributes that trip the filters? The answers to what happened at Florida have a lot to do with large scale automatic filters, and very little if anything to do with someone at Google drawing up lists of sites (I think we had more of that over the previous months).

Powdork

5:47 pm on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Another fuzzy thing is the removal of local sites from the SERPs, and I think Chndru's example shows this quite well (somehow a local realtor squeaked through on the HER site). Those are not the sites I would be looking for. And page two is ridiculous.

OTOH
$kwrugel wrox!

Kirby

5:52 pm on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Could it be that some kinds of directory sites have attributes that the new Google likes, or that they don't have the attributes that trip the filters? The answers to what happened at Florida have a lot to do with large scale automatic filters, and very little if anything to do with someone at Google drawing up lists of sites

I believe so, ciml. It is a dramatic change though, and that was the point I was trying to get across.

I believe the knob was turned a bit too much. The sheer number of directory sites has knocked the majority (over 80%) of pre-Florida results out of the top 100, and most don't reappear until the mid-200s. Not a problem if all these directory sites were relevant, but after the first few, what remains is questionable. Many of these results are returned because of a single link to one of the pre-Florida results. Why the need for the middleman?

ciml

6:50 pm on Dec 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> I believe the knob was turned a bit too much.

Usually, a new filter/penalty is played with a little (this one is not as new as most people think, but so few sites were affected that very few people seemed to notice it), then it's cranked up very heavily, then it's relaxed. I don't see why this one will be any different.

The only big question I have about these filters is whether some of the highly optimized sites affected will come back to 'normal' positions, or whether they'll all come back to a half-way position.

DrDoc

9:54 pm on Dec 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's why I said there is a bug in the new filters. If there weren't, then one category would not be doing great while the other sucks :)

And, think what you want... but content is half the game! That will never change.

trillianjedi

10:04 pm on Dec 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



content is half the game! That will never change.

Amen!

TJ

Kirby

10:21 pm on Dec 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have to agree with you on both points DrDoc. The irony here is that it appears the bug is keying too much on content. Let me explain.

The results for several industries are still returning primarily directories/authorities/hubs, or sites Google thinks fit these categories. While on-page content seems to be driving these results, G is having a problem divining the importance of the content.

The SERPs are still loaded with results based only on content in, or in close proximity to, outbound links. I am seeing a Naples real estate site show up at #4 for a search for real estate in the NE part of the US, and a West Coast site sitting at #10 for a search for New Jersey real estate. I'm seeing half of the top 10 results for one "city + kw" search that are there only because of links to the sites that used to occupy the top 10 pre-Florida, while the pre-Florida sites themselves are still buried 200+ deep.

While Google is spewing out results based on content, for many searches the results' only relevance is their outbound links. This can't be what Google expected and considers acceptable.

caveman

10:21 pm on Dec 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It is much more complicated than that. We can all give you examples where your theory falls apart. Most of what you describe is a symptom of some other part of the algo, not the algo itself. People seem to have a hard time understanding that what you are seeing may be a side effect and not the effect. If what you are saying is true, then why are people still number one for terms where a lot of SEO was done? I saw sites that had no SEO get dumped.

Well said.

Like many of the better post-Florida posts, the opener to this thread was laced with interesting theories.

Clearly, anti-spam was a goal of this update. Clearly, the algo elements/filter elements were turned way up this time. Clearly, they were in testing previously, as a few noted months ago.

And clearly there are many clear exceptions, which, as we all know, disprove the theory.

Those assuming that if their sites are fine, then the victims must be spammers to one degree or another... well, it's a bit silly. The fact is, we're aware of all sorts of nice little sites and big money sites that are still lost. Many were spamming. Many were not.

To suggest that because you can't see it, it doesn't exist is unbecoming of an officer and a gentleman.

Let's have a bit more patience and respect for those who actually follow Brett's rules (knowingly or otherwise) and still feel the pain right now.

superscript

12:43 pm on Dec 9, 2003 (gmt 0)



unbecoming of an officer and a gentleman

You're one sophisticated caveman, Caveman ;)
