
What The Early Research is Showing – Florida Update 2003

An analysis and aggregation of current post-Florida-update best practices

         

ryanallis1

9:14 am on Dec 3, 2003 (gmt 0)



I would welcome any comments and discussion on the following article (all URLs and specific keywords have been removed) that analyzes the current state of the Google update and suggests certain steps to take for both webmasters and Google...

Thank you,
Ryan Allis

On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every two months or so, this 'Dance' seems more like a drunken Mexican salsa than its usual conservative fox-trot.

Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. One could understand dropping down a few positions, but since November 15 the sites that previously held these rankings are nowhere to be found in the top 10,000 rankings. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding themselves forced to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.

What the Early Research is Showing

From what early research shows, it seems that Google has put into place what has been quickly termed in the industry as an 'Over Optimization Penalty' (OOP) that takes into account the incoming link text and the on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense Google is penalizing sites for being optimized for the search engines--without any forewarning of a change in policy.
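As a thought experiment only, the hypothesized OOP could be modeled as a two-factor heuristic, as in the Python sketch below. The function, its thresholds, and its logic are all invented for illustration; nobody outside Google knows whether the filter works anything like this.

```python
from collections import Counter
import re

def oop_flagged(page_text, inbound_anchor_texts, keyword,
                on_page_limit=15, anchor_share_limit=0.6):
    """Toy sketch of the hypothesized Over Optimization Penalty.

    Flags a page when a keyword is both repeated heavily on the page
    AND dominates the anchor text of inbound links. The thresholds
    here are pure guesswork for illustration.
    """
    words = re.findall(r"[a-z']+", page_text.lower())
    on_page_count = Counter(words)[keyword.lower()]

    anchors_with_kw = sum(
        1 for a in inbound_anchor_texts if keyword.lower() in a.lower()
    )
    anchor_share = anchors_with_kw / max(len(inbound_anchor_texts), 1)

    # Penalize only when BOTH signals exceed their (invented) limits.
    return on_page_count > on_page_limit and anchor_share > anchor_share_limit
```

Under this toy model, a page mentioning a keyword twenty times with every inbound link using it as anchor text gets flagged, while a page mentioning it once does not.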

Here is what else we know:

- The OOP is keyword specific, not site specific. Google has selected only certain keywords to apply the OOP to.

- Certain highly competitive keywords have lost many of their listings.

How to Know if Your Site Has Been Penalized

There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you have likely been hit. Here are ways to be sure:

1. Go to google.com. Type in any search term you recall being well ranked for (check your site logs to see which terms brought you search engine traffic). If your site is nowhere to be found, it has likely been penalized.

2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This appears to bypass the OOP, so you can see roughly what your results would otherwise be.

3. Or, simply go to www.**** to have this automated for you. Just type in the search term and quickly see what the search engine results would be if the OOP were not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5,000 most visited web sites on the Internet in a matter of days.
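The comparisons in steps 2 and 3 amount to diffing two ranked result lists. A small sketch (all domains and data are hypothetical):

```python
def rank_delta(normal_results, gibberish_results, domain):
    """Compare a domain's position in the normal SERP vs. the SERP
    returned with a '-gibberish' term appended (which, per the theory
    above, bypasses the OOP filter).

    Returns (normal_rank, filtered_rank); None means the domain did
    not appear in that result list at all.
    """
    def rank_of(results):
        try:
            return results.index(domain) + 1  # 1-based position
        except ValueError:
            return None
    return rank_of(normal_results), rank_of(gibberish_results)

# Hypothetical data: the site is missing from the normal results
# but ranks #2 once the suspected filter is bypassed.
normal = ["a.com", "b.com", "c.com"]
bypassed = ["a.com", "mysite.com", "b.com", "c.com"]
print(rank_delta(normal, bypassed, "mysite.com"))  # (None, 2)
```

A (None, high-rank) pair like this is exactly the signature people in this thread are reporting.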

The Basics of SEO Redefined. Should One De-Optimize?

Search engine optimization consultants such as myself have known for years that the basics of SEO are:

- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links
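Several of these on-page basics can be checked mechanically. Below is a rough, regex-based Python sketch for a quick self-audit; it is not how any search engine actually parses pages, and a real audit would parse the DOM properly.

```python
import re

def onpage_checklist(html, keyword):
    """Check a few of the on-page basics listed above against raw HTML.

    Intentionally simplistic: regexes instead of a real HTML parser,
    and a plain substring count for keyword frequency.
    """
    kw = keyword.lower()
    lower = html.lower()
    title = re.search(r"<title>(.*?)</title>", lower, re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", lower, re.S)
    return {
        "keyword_in_title": bool(title and kw in title.group(1)),
        "keyword_in_h1": bool(h1 and kw in h1.group(1)),
        "keyword_count": lower.count(kw),
    }

page = "<html><title>Widget Shop</title><body><h1>Widgets</h1>widget</body></html>"
print(onpage_checklist(page, "widget"))
# {'keyword_in_title': True, 'keyword_in_h1': True, 'keyword_count': 3}
```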

Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to build links properly.

So if you have been affected, what can you do? Should one de-optimize one's site, or wait it out? Should one create one site for Google and one for the 'normal' engines? Is this a case of a filter being turned too tight that Google will fix in a matter of days, or something much more?

These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:

1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text or the keyphrase is in a different order than the order you are penalized for.

2. Open up the page that has been penalized (usually your home page) and reduce the number of times that you have the keyword on your site. Keep the number under 5 times for every 100 words you have on your page.

3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."

It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
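The "under 5 times for every 100 words" rule of thumb from step 2 is easy to measure. Here is a minimal single-word density checker; the threshold is the article's suggestion, not anything Google has published, and multi-word keyphrases would need extra handling.

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` per 100 words of `text`.

    The '5 per 100 words' ceiling discussed above is a rule of
    thumb from this article, not a documented Google limit.
    """
    words = re.findall(r"[A-Za-z']+", text.lower())
    if not words:
        return 0.0
    count = sum(1 for w in words if w == keyword.lower())
    return 100.0 * count / len(words)

# 8 occurrences of "widget" in a 100-word page: over the suggested limit.
page = "widget " * 8 + "word " * 92
print(round(keyword_density(page, "widget"), 1))  # 8.0
```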

Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.

A second theory, which has gained credence in recent days within the industry, is that in preparation for its Initial Public Offering (possibly this spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.

Perhaps both of these reasons came into play. Perhaps Google execs thought they could

1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues and
4) because of better results and more revenue have a better chance at a successful IPO.

Sadly for Google, this plan had a fatal flaw.

What Google Should Do

While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:

1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;

2. Reduce the weight of OOP;

3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and

4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.

When this recent update broke on November 15, webmasters flocked in the thousands to industry forums such as webmasterworld.com. The botched update was quickly dubbed "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, and that everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.

If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisitions of Inktomi, AlltheWeb/FAST, and AltaVista, it will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.

Chndru

7:32 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>the filter

I think the word "filter" is crudely used here, considering you are dealing with a database of billions of documents and that Google returns search results in microseconds. I am sure they wouldn't remotely use anything that ain't automated. And word filtering (from AdWords broad match, etc.) wouldn't stand the test of the nature of the search queries handled. I am more inclined towards sophisticated text analysis, considering Applied Semantics is under Google's roof now. Try this: http://azeem.azhar.co.uk/archives/000571.php (no, it's not my link and I ain't got anything to do with it)

[edited by: DaveAtIFG at 10:30 pm (utc) on Dec. 4, 2003]
[edit reason] DeLinked [/edit]

Kirby

7:44 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Heavy cross linking is the way to go.

Until a cross link filter is applied.

Kirby

7:58 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am more inclined towards sophisticated text analysis, considering Applied Semantics is under Google's roof now

I wish this were the case, because then the focus would be on relevant, written-for-the-user content, but I don't think so at the moment. It doesn't explain the weather sites, link directories, the SERPs for shelving mentioned earlier, or the page about house rabbits that shows up for 'city homes'.

But perhaps they're using unsophisticated text analysis.

steveb

8:19 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Anyone who advocates using -usrtsrse to get the spammy results might as well learn to just type in allinanchor:keyword as their modifier instead, and then start trying to justify how "volume of anchor text is the best algorithm."

After all these posts, nobody complaining about the quality of the serps (rather than a lost site) has had the courage to make the absurd case that the old pre-florida anchor-text-only method of ranking was a good one. No one makes that case because the idea is plainly absurd. Just because you make 1000 more anchor text links to one of your pages does not mean that this page has now magically gotten to be higher quality. allinanchor: or -ysystsr might coincidentally show a good site not ranked otherwise, but the idea that search results should be shown based on volume of anchor text is plainly indefensible.

vbjaeger

8:32 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



Steveb wrote
Anyone who advocates using -usrtsrse to get the spammy results might as well learn to just type in allinanchor:keyword as their modifier instead, and then start trying to justify how "volume of anchor text is the best algorithm."

We have a modest variety of anchor text. Using the -sdfds -sfsdf method shows us at #4, and using allinanchor shows us at around #70. It's not the same thing.

to get the spammy results

pretty biased opinion here.

steveb

8:51 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"pretty biased opinion here."

Sorry to hear that. Open your mind a little more.

The -systsrs qualifier is just similar to allinanchor, not exactly the same. Some sites with a great deal of other positive factors managed to rank well pre-florida without much anchor text, but there is no denying that for pre-florida competitive searches the results closely paralleled allinanchor.

vbjaeger

8:57 pm on Dec 4, 2003 (gmt 0)

10+ Year Member



I mean no disrespect, steveb; I just think your statement was a blanket statement that is not 100% true, and it is biased because you have unfortunately been subjected to spammy sites. The filter has worked to remove some spammy sites, but some good sites as well, and the -sdf -dsfs trick demonstrates this better than allinanchor does.

For a 3-keyword search I watch, none of the top 30 sites used any obviously deceptive techniques: no affiliate sites, just clean sites that no longer exist in the top 1000. They are right back in their former positions using -ewr -ewwef

edit: just read this

The -systsrs qualifier is just similar to allinanchor, not exactly the same. Some sites with a great deal of other positive factors managed to rank well pre-florida without much anchor text, but there is no denying that for pre-florida competitive searches the results closely paralleled allinanchor

I agree

Kirby

9:06 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>there is no denying that for pre-florida competitive searches the results closely paralleled allinanchor.

True, but I am seeing dramatically different allinanchor results now that I doubt are accurate. The same directory type pages that now show up high are also doing the same for allinanchor.

I have one site that had its backlinks updated, and the number doubled. Most had the same anchor text. However, now this site drops from #2 for allinanchor to #39? I seriously doubt the current accuracy of allinanchor.

steveb

9:23 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't see any evidence that allinanchor is either more or less accurate than before. Pre-Florida you could add a bunch of links and still inexplicably drop for allinanchor anyway. It seems to me that allinanchor is basically the same thing as pre-Florida -- meaning it is now as accurate or inaccurate as it was before.

"I just think your statement was a blanket statement that is not 100% true and is biased because you have unfortunately been subjected to spammy sites"

Sure my main hyper-competitive niche has been awash in spam, and the improvement post-florida is almost inspirational (although still very far from perfect). Also, there is no doubt many good sites have been affected by florida. Two sites of friends of mine are classic examples of how (totally benign) duplicate content can sink a site post-florida, temporarily at least.

However, I can't see how anybody can make the case that the post-Florida serps are not less spammy. The least relevant results for any term are now much more likely to be off-topicish (that Iowa shelving page) or directory-ish (link pages). These aren't spammy. These are just not great results. No-content spam still exists, but in volume it is less. If someone wants to make a case that "spam" has been replaced sometimes by "lightweight off-topic", fine. But I for one sure consider that a drastic improvement. Give me that Iowa shelving construction page any day over a doorway to a doorway of a doorway with 10,000 anchor text links pointing at it from domains owned by the same entity.

Florida, or Galen if you prefer, needs to just get a bit more niche-relevant, and solve the duplicate content problems, and maybe some other ones. The basic results though are less spammy and the root of the algorithm is a dramatic improvement.

Hissingsid

9:42 pm on Dec 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Kirby recently said

Allintitle, allintext and allinanchor results drastically different than pre-florida. Why would that be?

When I did this test on my "disaster" search term, the top ten were pretty much pre-Florida in each case. There is a bit of shuffling, but the pre-Florida top ten appear in the top ten when filtered with whichever filter I used.

Take a look at what the filters do to the search here
[google.com...]

and in this discussion [webmasterworld.com...]

Basically, each of these filters limits the search to one factor on the page; for example, allintitle: looks for the search term in the title. In my market I guess the top ten are all pretty well optimised for each of the main ranking factors for that particular search phrase. In your market it looks like things are different; possibly sites are not so "well" optimised.

Best wishes

Sid

PS I wonder how many of the folks who contributed to this thread are busy eating their words now [webmasterworld.com...]

superscript

10:00 pm on Dec 4, 2003 (gmt 0)



Filters - no - I think the idea should be dropped. It doesn't make sense. It's not Google's style and never has been. It's not an elegant solution, and hints at human intervention.

I honestly think we should be thinking along different lines.

lorenzinho2

12:00 am on Dec 5, 2003 (gmt 0)

10+ Year Member



a comment on outbound links:

pre-florida, there was the perception that outbound links should be used sparingly because they would "leak" pagerank.

post florida, if the directory / authority argument carries weight (which I believe it does - how else can you explain sites placing well in the SERPS solely on the basis of their outgoing anchor text?), it would appear that outbound links to quality, related sites can help your position in the SERPS.

if this holds up, this is a major shift - and one for the better IMHO. it had always seemed odd, and somewhat mean-spirited of google to discourage sites from linking to each other by brandishing the stick of leaking pagerank.

defanjos

12:45 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



it would appear that outbound links to quality, related sites can help your position in the SERPS.

Help? It does more than that, it can save you from being dropped to position 400 plus.

brina

12:52 am on Dec 5, 2003 (gmt 0)

10+ Year Member



Just my two cents:

I have 4 well ranked sites. Only one of my sites was affected by this latest update. All four sites are very similar in layout and design (we sell our products to 4 different markets).

The site that fell off the face of the earth has a PR 6. It has 176 quality backlinks. It is in a very under-optimized field with very little real competition.

I tried the -dsghhh trick and found that it showed only one of the two most used kw phrases was penalized although my site doesn't show up under either. It was #3 for both last month.

I had a lot of the "penalized" kw phrases repeated in one section of my site. My feeling is that my penalization for this particular site was because of that repeated phrase. None of my other sites have that component.

I have removed the repeated keywords. We'll see what happens next month!

namniboose

1:03 am on Dec 5, 2003 (gmt 0)

10+ Year Member



I think More Traffic Please is onto something in msg #158 on page 11 of this thread.

This is the only idea that has made sense to me in terms of what we know about Google: Google is full of genius and strives for the best possible results.

Google's apparent contentment with the current serps only makes sense if this is just step 1: finding the 'authority sites' (and tweaking the algorithm to get rid of sites that just look like authority sites).

This is the only way I can see for Google to eliminate the affiliate spammers while keeping the good sites. As Make Me Top says:

If this algo is being used then it is its own built in 'spam' filter because nobody that would qualify as an expert would have a link to a site that does not fit their content. Therefore there is no penalty, just a new algo that washes you out of the mix if you don't fit in. And what you are seeing is the middle of the change over.

I also think Google's silence and the timing of this 'update' is no accident: I'm sure they were well aware that a lot of sites will do damage control during this buying season and turn to Adwords (again, they aren't stupid!).

If all this is true, then Google is brilliant. And mean!

cyberprosper

1:22 am on Dec 5, 2003 (gmt 0)

10+ Year Member



I think Freshbot's recently added pages is throwing off those -garbage test searches now.

dazzlindonna

3:02 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Straight from the horse's mouth...From Google's own page of guidelines...

Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

It seems as if Google is contradicting itself with this update. The biggest problem right now is that the words users would type to find our pages, are exactly the words that Google is punishing us for having on our site. It is the words that users would NOT normally type to find our pages that Google is not filtering / punishing / ignoring / whatever phrase you want to use here.

Trawler

3:12 am on Dec 5, 2003 (gmt 0)

10+ Year Member



Another News Post on Google.

This one has major implications.

[story.news.yahoo.com...]

Sunset_Jim

3:40 am on Dec 5, 2003 (gmt 0)

10+ Year Member



All of this discussion about the Google Florida update reminds me of the time Coca-Cola tried changing the flavor of Coca-Cola. The backlash was so great they went back to the original formula.

Chndru

3:45 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>The backlash
Yeah... that backlash was from the Coke drinkers, but this Google backlash is from dozens of SEOs (though they are users themselves). If Google sees that the user's average time per query (or any other quality metric) is getting worse, then they will know they are not delivering what they're supposed to. Not the case, from where I sit.

JohnKing1

3:52 am on Dec 5, 2003 (gmt 0)

10+ Year Member



I am one of the webmasters who did well out of Florida.

My website is named "The Widget A and Widget B Society"

My 'Widget A' term has 3.7 million results in Google

My 'Widget B' term has 1.3 million results in Google

Previously I was ranked 78th for 'Widget A' term, and 31st for 'Widget B' term, which is reasonable.

After Florida I am ranked 34th for 'Widget A' term, and 80th for 'Widget B' term. I don't know why this has happened but am happy that I have moved higher in the rankings for a competitive term.

One possible explanation is that Google is using the ordering of the link text, meaning that the first term in the link text is more important than the second term, and the second term is more important than the third term, and so on.
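That ordering theory could be modeled as a position-weighted term score. The 1/(position+1) decay below is an arbitrary choice for illustration, not a known Google formula, and the anchor texts are made up.

```python
def positional_anchor_score(anchor_text, term):
    """Toy model of the link-text ordering theory above: earlier
    words in the anchor text count more toward the term's score.
    The 1/(position+1) decay is invented for illustration.
    """
    words = anchor_text.lower().split()
    return sum(1.0 / (i + 1) for i, w in enumerate(words) if w == term.lower())

# "widget" leads in the first anchor, trails in the second.
print(positional_anchor_score("Widget A Society", "widget"))     # 1.0
print(positional_anchor_score("Society of Widget A", "widget"))  # ~0.333
```

Under such a scheme, the same anchor text would push a site up for its first term and down for its later terms, which matches the swap in rankings described above.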

Also, I have completely deleted any copy from the front page of my site, including the introduction message and all text that might give Google the impression of 'keyword stuffing'. My front page is now simply a logo and a collection of links

Has anyone else had similar experiences?

oodlum

4:04 am on Dec 5, 2003 (gmt 0)

10+ Year Member



I think Freshbot's recently added pages is throwing off those -garbage test searches now.

Cyberprosper - do those SERPs you're referring to look any different?

c1bernaught

4:16 am on Dec 5, 2003 (gmt 0)

10+ Year Member



Chndru:

Hmm... must be that everyone else is wrong and you know what's going on.... not.

The tide is turning my friend. It's not just SEO companies that are complaining.

rfgdxm1

4:25 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>The tide is turning my friend. It's not just SEO companies that are complaining.

Dream on. I think your reality check just bounced.

skirope

6:47 am on Dec 5, 2003 (gmt 0)

10+ Year Member



I launched my site about two months ago. Within a week my homepage was indexed. In about three weeks Google had crawled all my pages (because when I searched for specific phrases on a page, it came up). Today (or maybe post-Florida) the only thing in Google is my homepage. Could I be penalized? Or is it because of Florida?

We do nothing shady. I have keywords in the title, the URLs include the keyword, and I have keywords in the header and then in the body. Here is how my links look:

[widgets.com...]

Can anyone offer any guidance as to why only my site's homepage is in the index now?

How can I tell if other pages are indexed besides searching for certain phrases within them?

Thanks so much!

anime_otaku

6:55 am on Dec 5, 2003 (gmt 0)

10+ Year Member



The only way to tell whether you are a victim of Florida or are really penalised is to do a Google search with the syntax
site:example.com keywords

Replace 'keywords' with the terms you expect your 'penalised' pages to show up for, and replace example.com with your domain. If you find them, it's the OOP penalty; if they're not found at all, you're in trouble. ;-(

darkroom

7:28 am on Dec 5, 2003 (gmt 0)

10+ Year Member



btw... just wanted to let you guys know. Google themselves stated in the email as follows:

"New sites, changes to existing sites, and dead links will all be noted in the course of the next crawl, which will be completed before the holiday season ends."

The detailed version can be found here:
groups.google.com/groups?dq=&hl=en&lr=&ie=UTF-8&group=google.public.support.general&selm=35849812.0312040716.360dd410%40posting.google.com

[edited by: DaveAtIFG at 4:24 pm (utc) on Dec. 5, 2003]
[edit reason] Fixed sidescrolling [/edit]

LateNight

8:18 am on Dec 5, 2003 (gmt 0)

10+ Year Member



>>>>the next crawl, which will be completed before the holiday season ends<<<<

We will see what happens to the ones de-optimizing. Place your bets.

Powdork

8:45 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



changes to existing sites, and dead links will all be noted in the course of the next crawl

We will see what happens to the ones de-optimizing. Place your bets.

I was thinking about this back in Esmerominic. What if Google determined who was trying to game the system by detecting changes to a site that coincide with and reflect changes in the algorithm? What if Florida is nothing but a giant SEO trap?
How's that for paranoid? ;)

claus

9:10 am on Dec 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yet another long and windy post...just kidding ;)
The alphabet-stuff i mentioned earlier was just a wild guess. I don't think the alphabet is a likely source for setting any priorities regarding search. Stuff like city size, "popularity" (number of queries, etc) or something like that would be better. This, in turn, is speculation (just to remove any doubt).

Back to the buckshot - it still seems to me that there is more than one component at work here, although i personally have been preoccupied with getting an understanding of the "broad match" as that is the new thing, IMHO (still, for lack of better words). As these threads do discuss all kinds of things, i have compiled a list.

These are my preliminary observations; they might not all be true, and i might have forgotten something as well. Also, some of these bullet points may be closely related (or even the same thing). Order does not reflect importance:

  • A different treatment of what is typed into the search box, a.k.a. the "broad match" vs. "exact match". Although my focus has been on the "broad", sometimes the search box does broad matches, sometimes it does very exact matches, e.g. when you enter some product serial number or other highly specific query. (also, it still seems to me that the famous knob has been turned a bit toward pre-Florida style now, and as there has been no new update [webmasterworld.com] this signals that it is clearly an independent factor)
  • A use of "stemming" on the words of the query. This is the one factor that we have confirmed as being new, as it's mentioned on the Google help pages. It is closely related to the "broad match" - in some cases it might even be the broad match, but i'm giving it a separate bullet here, as i'm not convinced that "stemming" would be able to do all that alone (although i recall a post from GoogleGuy saying "it's not your usual stemming", indicating that it can do more than stemming normally would do).
  • A spam-reducing algorithm/feature - steveb mentions "links from same entity", which points toward localrank, others have mentioned hilltop, which could also have some merit. Personally i'm not sure if it's either of these, a combination, or something else. Still, for an efficient spam filter to be in action, it would also have to reduce spam for "exact searches" - although not for that particular subset where you search for the exact spammy site(s) of course. I agree that there is still "spam", but some of it has gone, and in some areas the difference is bigger than in others.
  • Some odd tie-in with the google directory - which is strictly personal speculation and please don't jump on this one, as i know that GG said clearly in one of the "Update Florida" threads (i think it was part 1) that the directory wasn't affected by this update (which seems to be right in the literal sense, and i have no reason to question that he is telling the truth, albeit not always all of it). I do believe that this is coupled to the "broad match" in some odd way. IMHO, AFAIK, etc.
  • A better (or more efficient) handling of duplicate issues - which is related to the spam issue, but not only that, as perfectly "whitehat" sites also face these issues (some affiliate programs also tend to produce near-duplicate pages/sites). It seems to be better merging of domains, better identification of duplicate content across domains and so forth. Still, this issue is a difficult one, and i'd not expect a "dupe filter" to become perfect overnight, but something seems to have improved in efficiency. Specifically, the "vanity domain" issue (having more than one domain pointing at the same website) seems to be able to cause some problems if you're not using proper 30X redirects [webmasterworld.com] - it's not a new issue, it's just been highlighted a bit it seems.
  • A reduced emphasis on "fresh" content - with the arrival of deepfreshbot the SERPS were flooded with blogs, discussion forum entries, guestbooks, email-discussion lists, etc. These are now (mostly) gone. (*sigh of relief*)
  • An increased emphasis on "authorities" and "hubs" (and i might add "news sources", although that is somewhat more speculative.) The difference is that authorities have a lot of inbounds and hubs have a lot of outbounds (and usually also a lot of inbounds) - using the term "authorities" for both is understandable, but it's really two different kinds of sites (eg. directories vs. dictionaries).
  • An almost english-only update (as in language, not country) - not many reports of significant changes for non-english pages so far, and the non-english reports could be related to other factors (as could some part of the english ones). Not that all changes have taken place for english searches only - i do dual-language searches, and i definitely see changes outside the english SERPS, but it's not the same kind. This leads me to believe that the "broad match" and/or "stemming" is the primary reason for these changes (as Google can hardly implement this very advanced technology for a large number of languages [google.com] in a week. In fact, doing it for the english language alone would be a big task, which is why i'm suggesting that this is under development and hence hasn't hit all sectors yet)
  • The usual algo tweaks - these have hardly been mentioned, although normally they would be the focus of the debate. These are the technicalities like linking strategy [webmasterworld.com], factors that influence ranking [webmasterworld.com], and ranking of factors [webmasterworld.com]. Forget about AdWords - that shift in focus is about the most important gain Google has gotten here (although i doubt it will be permanent). Okay, here goes, only minor changes (afaik, imho, etc.): Upping of inurl, in directory description (perhaps directory keyword), in title, intext (including outbounds), first words matter more than last words, PR, anchor text a bit down <tongue-in-cheek>still anyone thinking i post Google ads?</tongue-in-cheek>. So, Brett's 26 steps [webmasterworld.com] still holds - don't let anyone convince you of anything else.

That was about everything i could think of at the moment. I hope you'll find it useful, although i wouldn't recommend you to charge $95 for it. Also, i hope that this might inspire some thoughts so that we can get the rest pinned down - any thoughts?

/claus
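The "stemming" bullet in claus's list can be illustrated with a crude suffix-stripper. Real stemmers (e.g. the Porter algorithm) are far more careful, and whatever Google deployed is certainly more sophisticated; this is only to show the broad-match idea of matching on stems rather than exact words.

```python
def crude_stem(word):
    """Naive suffix stripping, for illustration only."""
    for suffix in ("ing", "ers", "er", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: len(word) - len(suffix)]
    return word

def broad_match(query, document_words):
    """A query term matches if any document word shares its crude stem."""
    stems = {crude_stem(w.lower()) for w in document_words}
    return all(crude_stem(t.lower()) in stems for t in query.split())

print(crude_stem("designers"))                      # design
print(broad_match("designer", ["designs", "web"]))  # True
```

Even this toy version shows why exact-phrase optimization loses some of its grip once the engine matches stems: "designer", "designers", and "designs" all collapse to the same token.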

This 526 message thread spans 18 pages: 526