Google News Archive Forum

Google's Florida Update - a fresh look
We've been around the houses - why not technical difficulties?
superscript




msg:212181
 10:20 pm on Dec 12, 2003 (gmt 0)

For the past four or five weeks, some of the greatest (and leastest) Internet minds (I include myself in the latter) have been trying to figure out what has been going on with Google.

We have collectively lurched between one conspiracy theory and another - got ourselves into a few disagreements - but essentially found ourselves nowhere!

Theories have involved Adwords (does anyone remember the 'dictionary' concept? - now past history).

And Froogle...

A commercial filter, an OOP filter, a problem caused by mistaken duplicate content, theories based on the contents of the Directory (which is a mess), doorway pages (my fault mainly!) etc. etc.

Leading to the absurd concept that you might be forced to de-optimise, in order to optimise.

Which is a form of optimisation in itself.

But early on, someone posted a reference to Occam and his razor.

Perhaps - and this might sound too simple! - Google is experiencing difficulties.

Consider this: if Google is experiencing technical difficulties regarding the sheer number of pages to be indexed, then the affected searches will be the ones with many results to sort. And the searches with many results to sort are likely to be commercial ones - because there is so much competition.

So the proposal is this:

There is no commercial filter, there is no Adwords filter - Google is experiencing technical difficulties in a new algo due to the sheer number of pages to be considered in certain areas. On-page factors have suffered, and the result is Florida.

You are all welcome to shoot me down in flames - but at least it is a simple solution.


 

Bobby




msg:212211
 9:18 pm on Dec 13, 2003 (gmt 0)

Thanks Brett, I appreciate your well-thought-out answer, as do the others, undoubtedly.

The Florida update has actually been good in many ways (though it has hit ME hard personally in my very competitive category). It has forced me to think in other ways regarding how best to present my web site to the public. I have also turned to using other search engines where until late November I only used Google.

Google's depth is unmatched by any other engine (although ATW comes in a close second), and it has speed on its side. Unfortunately the quality of results has suffered, in my opinion, because of this new filter or algorithm or whatever we want to label it. I say this based on MY searches in the last month as a user and NOT as an SEO.

Google feels it is full speed ahead

Definitely, or else they would have remedied the situation when they had time. Now they are in it for the long haul.

While deep down in our hearts we all still love Google (even those of us who have suffered losses) I think we (SEOs) are in a similar position to parents whose children have grown up, and can no longer mold them into what we want them to be.

theitboy




msg:212212
 11:09 pm on Dec 13, 2003 (gmt 0)


Something I just realized, in relation to the semantic analysis theory: it could explain the seeming increase in importance of outbound link text.

Why? Because a logical algorithm would not attempt semantic analysis on outbound link text - outbound links are rarely properly formed sentences, subject to analysis - it's something that Google would expect to naturally be "keywordy".

Think about it.

kaled




msg:212213
 2:11 am on Dec 14, 2003 (gmt 0)

Why? Because a logical algorithm would not attempt semantic analysis on outbound link text - outbound links are rarely properly formed sentences, subject to analysis - it's something that Google would expect to naturally be "keywordy".

So the new spam will be keyword stuffing of links - that's fandabidozi.

Brett,

Either a page is spam or it is not - it does not depend on the search terms. Spam pages/sites should be removed from the index entirely, not removed from the SERPs by an ad hoc filter whose parameters vary with search terms.

Duplicate pages within a site should be ignored. Duplicate pages across sites need human intervention before one is arbitrarily removed (possibly leaving the original stolen copy in the index).

Duplicate domains can be detected by algo and be removed - SERPS would improve if Google did this.

One problem with dynamic-spam-filtering/OOP/Bayesian is that it necessarily introduces more discontinuities into otherwise clean (hopefully) algos. By discontinuities I mean lots of if..then..else instead of a*b + c(d + e)
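
A toy sketch of that difference, with every weight and threshold invented purely for illustration - it is not Google's algo, it only shows how hard cut-offs behave differently from a smooth formula:

```python
# Toy illustration of the "discontinuities" point above. All weights and
# thresholds here are made up - the contrast is between a smooth score
# and the same signals wrapped in if..then..else cut-offs.

def smooth_score(anchor, on_page, pagerank):
    """Continuous combination - the a*b + c*(d + e) style, no hard cut-offs."""
    return 0.5 * anchor + 0.3 * on_page + 0.2 * pagerank

def filtered_score(anchor, on_page, pagerank, keyword_density):
    """Same signals, wrapped in discontinuities (hard thresholds)."""
    if keyword_density > 0.08:              # hypothetical spam cut-off
        return 0.0                          # page vanishes from the results
    if anchor > 0.9 and on_page < 0.2:      # hypothetical "keywordy inbounds" rule
        return 0.1 * pagerank
    return smooth_score(anchor, on_page, pagerank)

# A tiny change in one input flips the outcome entirely:
print(filtered_score(0.7, 0.6, 0.5, 0.079))   # ranks normally
print(filtered_score(0.7, 0.6, 0.5, 0.081))   # filtered out completely
```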

It would appear that Google have gone down this path. This tells me that no one at the plex has a degree in Control Theory.

A few years ago, a minimum wage was introduced in the UK amid much fear it would cause unemployment. However, the value was such that the impact was small. Similarly, if Google set their discontinuities at silly levels they may do no harm but that is the very best that can be said of them.

If this is the future of Google then Google has a bleak future - a pity since I do believe their intentions are good. But it does go to show that dummies in high places are very dangerous.

Kaled.

claus




msg:212214
 2:12 am on Dec 14, 2003 (gmt 0)

caution, long post:
Thanks, DaveAtIFG, for post #21 and Brett_Tabke for post #25 - great to see some reason emerge out here

Now, technical difficulties...is Google broken, and if yes, where? Imho:

  • The Directory - that one's a mess in some places. Not totally, but some cats seem to think they are somewhere other than where they really are.
  • Google Search - nah, not really, it just works differently now than before this thing started. There are of course errors, but the majority of "deviations from pattern" seem to me more like, say, omissions ("we haven't been there yet") than real errors.

So, what's all the fuss about?

Apparently, and naturally, this "stemming" / "broad match" / "semantics" thing was started with the (American version of the) English language. And of course it would be naive to think that you would handle all possible queries in this language properly from the start. My best guess is that English English (ongoing, afaik), German, French, Spanish, etc. will follow sometime later on. Google is no longer just one search engine, as everyone using dual-language searches has known for a long time, and as everyone else really should know by now.

So, that was the thing we (well, i really should speak for myself... imho, fwiw, etc.) didn't see outside the US (except for english-language searches, but you get my point i guess) - as for changes in handling of duplicate pages - that was real, and it's gotten more efficient - still doesn't catch all though. I also noted less emphasis on all the "fresh results" - not all that many blogs, forums, email lists, etc. And some hard-to-define emphasis on "authorities" and/or "hubs" (as well as possibly "news sources", although this one is strange as it's not as in "current news stories").

So even without the "stemming" or whatever, there were a few other ingredients in the soup. The basic ranking criteria as per before Florida (mainly anchor text, and the various markup elements on-page) still seem sort of unchanged - a little up here, a little down there, but overall it's not dramatic, it's just harder to decode, as those other things do add some smoke. That's why i keep saying that the basic whitehat stuff still works nicely. In fact, with the added semantics/broad match/stemming i feel more comfortable now than ever before saying "if you simply build the best site for your topic, you will become #1" - then again, that's a lot more labour than running a link campaign. It also implies a slight shift in the work fields for the SEO community, albeit one we've had some good discussions on for some time - less quick fix and more long term.

All that is... provided Google can make it work across the board. It's second to none now for... well, for those searches where it works (no sarcasm intended); "laser like precision" describes my own experience very well for some searches - most of those i do "as a searcher" in fact. For other searches, it's just not good enough... yet.

Stemming, as well as semantics is rule-based, i believe. It's not like it's AI or something - at least i personally don't think so. If it was AI (as in "self learning systems", not as in "Hal"), i believe Google would be running a serious risk, and the results would be utterly useless for a decade or so. Still, when working with rule-based stuff you have to make sure you have a large set of rules, as there will always be special cases. So, this is a rollout, we're not seeing the full-blown version yet, but we seem to be very far in the process for the American subset of SERPS, and the missing items are probably being worked on. This "broad match" mode should be expected to propagate throughout the whole Google system eventually, imho.

Enough of that babbling... five things to consider or input for discussion or whatever. Everything is "afaik, fwiw, imho" as usual:

  1. theitboy is right about outbound links being "exact matched" and not "broad matched" - and the same goes for inbounds. At this moment. So, will that last? possibly maybe, and your guess is as good as mine... Now, as anchor text is still king, where does that leave your very optimized site when the query is "broad"?
  2. Also, Brett is absolutely right that some specific kinds of pages are simply not very suitable for an SE due to the absence of large amounts of text. Shopping carts are prime examples. This is true for any Search Engine, but earlier this could be taken care of by getting a truckload of inbounds, now we have the broad match.
  3. Google is not just one Search Engine. First, you treat the query in one way (trying to make some sense out of it) then you treat the index in the same way (trying to make sense of the pages), but you still need to be able to do exact matches on the index. So, that's two-in-one (let's omit parcels, airports, definitions, synonyms, calculations, etc. for the moment, but don't forget they're there). Multi-layered search, i'd say.
  4. Google is not just one Search Engine. Now, they're rolling out this broad match thingy, but of course they're not totally finished with that (if they'll ever be). So, what do they do when there's no broad match yet for some query? Use other methods to rank results, of course. The pre-florida stuff. Or, suggest a selection of alternatives to choose from. Multi-layered search, i'd say.
  5. Google is not just one search engine (i bet you guessed it). So, just like the above - when a page hasn't got sufficient textual content to pass successfully through the broad match matchmaker...what do you do? Use other methods to rank results, of course. The pre-florida stuff. Or, suggest a selection of alternatives to choose from. Multi-layered search, i'd say.
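
As a purely speculative sketch, the layered fallback in points 3-5 might look something like the snippet below - the synonym table, the index and the layer order are all invented for illustration, not anything Google has published:

```python
# Speculative sketch of a "multi-layered" lookup: broad match first, exact
# match as the fallback layer, suggestions as the last resort. All data here
# is made up.
from itertools import product

SYNONYMS = {"widget": ["widget", "widgets"],
            "tires":  ["tires", "tyres", "tire"]}

INDEX = {"widget tyres": ["tyre-site.example", "widget-shop.example"],
         "dallas condo": ["dallas-realty.example"]}

def broad_match_expand(query):
    """Layer 1: expand each term via a synonym/stemming table, if we have one."""
    words = query.lower().split()
    if not all(w in SYNONYMS for w in words):
        return []                       # no broad-match data for this query yet
    return [" ".join(c) for c in product(*(SYNONYMS[w] for w in words))]

def exact_match(query):
    """Layer 2: plain pre-Florida style exact lookup."""
    return INDEX.get(query.lower(), [])

def search(query):
    for variant in broad_match_expand(query) or [query]:
        hits = exact_match(variant)
        if hits:
            return hits
    # Layer 3: nothing matched - offer alternatives instead of results.
    return ["did you mean: " + q for q in INDEX]

print(search("widget tires"))   # broad match catches the "tyres" page
print(search("blue widgets"))   # falls through to suggestions
```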

I'm not going to bore you anymore with this right now. I don't really feel there's anything in this post that i haven't stated elsewhere, but perhaps the headline "a fresh look" or those two other posts gave me the energy to write it all (i hope) a bit clearer.

Just to remove any doubt; this is all "for what it's worth", "as far as i know", and "in my humble opinion".

/claus


Added, per kaled's post: I agree, you should be careful with such stuff. As a minimum by having backup systems for when the rules fail or cannot be employed and such. Perhaps they've underestimated the amount of "seemingly non-content" in competitive sectors?
theitboy




msg:212215
 3:19 am on Dec 14, 2003 (gmt 0)

OK. I'm going to take a run at what might be a nice, "unfiltered" page for the new Google, based on the ideas in this thread.

1. It will have at least a few paragraphs of good, clean, meaty prose. The prose should be written in clear English, with simple and clean sentences and paragraph breaks.

2. Active voice constructions should predominate; avoid the passive stuff. You should avoid complex sentence structures, lists of things, etc.

3. Content should predominate over anything that might be interpreted by a robot as "spammy", i.e. menu lists, keyword lists in image alt text, etc.

4. Outbound links should either be concise and to the point, or coherent sentences. Be careful where and how you link.

5. Have a good variety of inbound links, a natural-seeming spread.

In the end, it's all kind of old-school stuff, really.
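
As a crude illustration of the "content should predominate over menu lists" point, here is a speculative sketch of how a robot might measure it - the tags treated as "menu-ish" and the ratio itself are guesses, not anything Google has documented:

```python
# Crude, hypothetical measure: how much of a page's text is plain prose
# versus text inside links/list items. Only meant to make the idea concrete.
from html.parser import HTMLParser

class ProseRatio(HTMLParser):
    LINKISH = ("a", "li", "option")

    def __init__(self):
        super().__init__()
        self.depth = 0          # how deep we are inside "menu-ish" tags
        self.prose_chars = 0
        self.linkish_chars = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.LINKISH:
            self.depth += 1

    def handle_endtag(self, tag):
        if tag in self.LINKISH and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        n = len(data.strip())
        if self.depth:
            self.linkish_chars += n
        else:
            self.prose_chars += n

html = ("<p>Our widgets are hand made in small batches.</p>"
        "<ul><li><a href='/a'>blue widgets</a></li>"
        "<li><a href='/b'>red widgets</a></li></ul>")
p = ProseRatio()
p.feed(html)
ratio = p.prose_chars / max(1, p.prose_chars + p.linkish_chars)
print(round(ratio, 2))   # closer to 1.0 means content predominates over menus
```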

GranPops




msg:212216
 2:32 pm on Dec 14, 2003 (gmt 0)

There are occasional posts saying, "I am new, what do I have to do to get on the first page of Google?"

Over half a century ago, as students we were given the following answer to any question which asks for a one-sentence explanation of what has been learned after 5 years of study.

There are 3 possibilities:-

If you are asking me to explain it in 1 hour, I need a month to prepare it.

If you will allow me one full day to explain, I need a week to prepare it.

If you will allow me a month to explain it, I can start now.

How true with SEO and Google.

My penny worth…………
Follow Brett’s list, and read, and more importantly digest, what the senior guys like Claus and CIML write.

Then a year after practising that advice, come back and ask the same question, but this time omitting the “I am new” bit.

I am now 8 months into this hobby, and am like a dog with two d---s with the results of their advice.

137 new No.1's as a result of Florida.

GranPops

superscript




msg:212217
 3:02 pm on Dec 14, 2003 (gmt 0)

Hi Granpops,

A couple of points: my 'hobby' sites are also doing fine after Florida, but my commercial sites are suffering - even though all of my sites are constructed in much the same way. The issue, in case you hadn't noticed, is largely concerned with commercial sites.

As for feeling like 'a dog with two d**cks' - an unfortunate expression. Rather like owning several cars, you can only use one at a time ;)

Mardi_Gras




msg:212218
 3:05 pm on Dec 14, 2003 (gmt 0)

>I am now 8 months into this hobby, and am like a dog with two d---s with the results of their advice.

granpops - I am very happy for your success, and appreciate your sharing the lessons you have learned in nine months of webmastering. I will also share something I have learned over the years - the more I learn, the more I realize I don't know :)

Best wishes for continued good fortune in the Web world.

Don

borisbaloney




msg:212219
 3:05 pm on Dec 14, 2003 (gmt 0)

Google didn't buy Applied SEMANTICS because it tasted great and was less filling.

Hmmm, some people here might call that a conspiracy theory ;)

But seriously, for those of us who are not English majors - are the following reasonable assumptions when semantics are applied to search algorithms?

1. Because sentence structure analysis requires a complete sentence, your paragraphs and your description would probably be analysed instead of headings/titles.

2. Excessive use of adjectives/nouns would be a likely "flag" because they are normally keywords.

Any thoughts?
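
For what it's worth, a very rough sketch of assumption 2 - no part-of-speech tagger here, just a stopword list standing in for "function words", and a flag level picked out of thin air:

```python
# Crude illustration of assumption 2: approximate the share of "content"
# words (noun/adjective stand-ins) by removing function words. The stopword
# list and the 75% flag level are invented - only the shape of the idea matters.

STOPWORDS = {"the", "a", "an", "of", "in", "on", "and", "or", "to", "is",
             "are", "we", "our", "for", "with", "that", "this", "it",
             "them", "you"}

def content_word_share(text):
    words = [w.strip(".,!?").lower() for w in text.split()]
    content = [w for w in words if w and w not in STOPWORDS]
    return len(content) / max(1, len(words))

title = "Cheap blue widgets discount widgets widget sale"
prose = "We make blue widgets in our small workshop and ship them to you."

for text in (title, prose):
    share = content_word_share(text)
    print(round(share, 2), "flagged" if share > 0.75 else "ok")
```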

Hissingsid




msg:212220
 3:17 pm on Dec 14, 2003 (gmt 0)

But seriously, for those of us who are not English majors - are the following reasonable assumptions when semantics are applied to search algorithms?

Hi Boris,

Don't assume that semantics = semantics as you know it.

Go to the appliedsemantics.com web site and have a read about Google's domain park. Any technology that can "understand what a domain name means" and build pages of suitable advertisements on the fly based on a match between the search term and the meaning of a domain name is well capable of doing everything that we are seeing and more.

Think about how they are going to generate traffic for those parked domains.

How they are going to optimise the pages in those domains so that they appear high in SERPs.

Now try and convince me that this will lead to better search engine results in the long term. What is being planned might be seen as taking the concept of canned chopped ham to a whole new level.

Of course I wouldn't like to say that ;)

Best wishes

Sid

PS I wonder how long this thread will last now!

superscript




msg:212221
 3:29 pm on Dec 14, 2003 (gmt 0)

2. Excessive use of adjectives/nouns would be a likely "flag" because they are normally keywords.

Agreed, but Google's published advice to Webmasters still suggests that it is sensible to include your relevant keywords in your pages. The major advice is actually against using inappropriate words!

The problem with writing flowing English, which is obviously good practice in paragraphs, is how it applies to a meaningful <title>. A title is by definition condensed and to the point. In Google it has to be written within the constraints of a very limited number of characters. Few would argue that 'We sell furry blue widgets' would not be, within this context, perfectly appropriate use of English (indeed good English) if this is what the page is about.

e.g. A book about 'Unusual Canine Anatomy' is likely to be titled by the editor/publisher: Unusual Canine Anatomy: A Study of Unusual Anatomy in Dogs. I hope that the new algo isn't suggesting it would be good English to title it: Welcome to our book! Here is a book we wrote recently that is all about anatomy, of the unusual kind, with specific references to our furry canine friends, such as Fido, er... - it ain't appropriate English, and if this is the way we are expected to title, it doesn't make any sense.

p.s. Thanks to GranPops for inspiration regarding the imaginary book title ;)

[edited by: superscript at 3:44 pm (utc) on Dec. 14, 2003]

subway




msg:212222
 3:39 pm on Dec 14, 2003 (gmt 0)

I like the depth and speed of alltheweb, but they never had the algo to give the laser like precision that Google still gives. - BT

I agree, but I think ATW will accomplish "laser like precision" before Google accomplish their desired outcome of this illicit Google / Froogle marriage.

mil2k




msg:212223
 4:27 pm on Dec 14, 2003 (gmt 0)

I am now 8 months into this hobby, and am like a dog with two d---s with the results of their advice.

137 new No.1's as a result of Florida.

Great to hear that, GranPops. But just for a moment give the benefit of the doubt to the people asking these questions. While you might have seen some newbie posts, let me assure you webmasters with years of experience are also clueless about the current filtering of commercial results.

I_am_back - I'm glad that in your five days as a member here ...

Mardi_Gras, I think his previous nick was different.

I (and many others) have posted what is probably needed on this forum time and time again.

I don't think we have heard from you or any others any advice on getting a site's ranking back on competitive keywords. Yes, you have talked about writing good content and letting the search engines naturally rank your site.

As far as the commercial filter goes there is no doubt in my mind that Google has changed the way they rank commercial sites.

As to those who are looking forward to the new challenge, one quote (by DaveN) to draw inspiration from

It's only an Algo
;)

superscript




msg:212224
 4:53 pm on Dec 14, 2003 (gmt 0)

I called it a fresh look. Here's a fresher look:

Is Google broken? - not exactly
Is a filter in place? - yes
Is it a commercial filter? - surprisingly, no!
Is it a faulty filter? - yes

----------------------------

Why the change of heart?

It's not exactly a change of heart, because it's not Google that is faulty, it's the filter. Google has applied a new and sophisticated filter. It may be of the Bayesian-type. But what all these advanced filters have in common is that they need training. Humans need to feed the filter with data - with examples of good practice, and bad practice. If it is a spam filter, it needs to be given examples of spam, and also examples of non-spam.

As in all data analysis, whatever the quality of the algo, rubbish in = rubbish out.

Advanced filters fail when they are given an insufficiently large data set - 3+ billion pages requires a very large sample data set.

And what is fed into the filter still requires human judgement (make no mistake - Google doesn't like direct human input into the SERPs, but it is clear from their statement about spam reports that human input is entered to modify algos.)

Bayesian filters are also in danger of failing (recording false positives) if they are not biased 'against' false positives, rather than towards them (for example, in my own Bayesian spam e-mail filter, I have it biased against deleting an e-mail which could be potentially important).

But the initial data feed is crucial. It has to be both statistically significant, and the human judgement that goes into it must be unbiased. It is not difficult for a single user to decide fairly unambiguously what he/she regards as e-mail spam, and what is not e-mail spam.

But if such judgements have been made by a team regarding the vast content of the Internet - with its spam sites, sex sites, academic sites and commercial sites - monitoring the quality of these judgements would have to be done extremely carefully. Indeed, such a judgemental task may be impossible!

A poorly fed, and unintentionally biased Bayesian filter could explain a great deal (and a bias towards academic sites would be understandable given the intellectual quality of Google's employees.)

Commercial sites are not spam sites per se, but they have certain attributes in common. They are likely to share some characteristics such as word repetition - some a consequence of SEO, but some unavoidable if you sell many versions of the same product!

A poorly trained Bayesian filter could easily mistake one for the other - a commercial site for a spam site - based on unintentional bias in its training, and a small data sample.
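
To make that concrete, here is a minimal naive-Bayes-style sketch - the training documents, smoothing and threshold below are all invented; the only point is that a tiny, skewed "spam" sample plus an unbiased cut-off will happily flag an ordinary shop page:

```python
# Minimal naive-Bayes sketch. Training data and threshold are invented;
# the aim is only to show how a small, skewed sample plus an aggressive
# cut-off can flag a legitimate commercial page as spam.
import math
from collections import Counter

spam_docs = ["cheap widgets buy widgets discount widgets",
             "buy cheap pills cheap pills online"]
good_docs = ["history of widget manufacturing in yorkshire",
             "academic study of canine anatomy"]

def train(docs):
    words = Counter(w for d in docs for w in d.split())
    return words, sum(words.values())

spam_counts, spam_total = train(spam_docs)
good_counts, good_total = train(good_docs)
vocab = set(spam_counts) | set(good_counts)

def spam_probability(text):
    """Probability the text is spam, with add-one smoothing."""
    log_odds = 0.0
    for w in text.split():
        p_spam = (spam_counts[w] + 1) / (spam_total + len(vocab))
        p_good = (good_counts[w] + 1) / (good_total + len(vocab))
        log_odds += math.log(p_spam / p_good)
    return 1 / (1 + math.exp(-log_odds))

# A commercial page repeats its product name - as shops unavoidably do -
# and looks a lot like the spam sample when the training set is this small.
page = "buy blue widgets and red widgets cheap widgets shipped fast"
score = spam_probability(page)
threshold = 0.5            # not biased against false positives
print(round(score, 3), "filtered" if score > threshold else "kept")
```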

If such a filter is in place, dispassionately speaking (as someone who has lost his top positions) it probably hasn't done a bad job statistically as a filter. But it has ruined the SERPs for many significant search terms. As such, although it might work on paper - it appears to have failed.

caveman




msg:212225
 5:33 pm on Dec 14, 2003 (gmt 0)

In retrospect, it's clear to me that Dominic/Esmeralda (Domerlda?) was about AdSense. From the AdSense FAQ, "We go beyond simple keyword matching to understand the context and content of web pages." That sure sounds like "semantics" to me. Domerelda was not a "traditional dance." I'm convinced it was all one big update done primarily to facilitate AdSense and semantics.

We have reached the identical conclusion. If true, this explains precisely the shift we have seen recently in ROI, which has been worse overall, but not in all cases.

I'd be careful about concluding that all pages need to be written predominantly in plain English paragraphs.

First it's doubtful that G would implement something that only accepted one primary 'style' of site.

Second, there is an abundance of evidence that sites with alternate structures are doing exceedingly well; for example, directories with an abundance of related keywords offered as lists/links. And this seems to make sense relative to the semantics and broad matching speculations-suppositions-opinions-theories-conclusions-posits-ideas-statements-assertions... :-)

customdy




msg:212226
 7:19 pm on Dec 14, 2003 (gmt 0)

137 new No.1's as a result of Florida.

GranPops

How many of those 137 are commercial sites that have on-line shopping carts that are indexed by Google?

I find a very strong relationship: in our industry, the commercial sites that are still doing well do NOT have an on-line shopping cart, or if they do, the shopping cart page is not indexed. Of the new top 10, not one has a shopping cart that is indexed - they either have no on-line order form or the form page is not indexed by Google.

Interesting..........

Hissingsid




msg:212227
 8:41 pm on Dec 14, 2003 (gmt 0)

137 new No.1's as a result of Florida.

I would be very impressed if you could rationalise what had changed in the way Google assesses your pages that has given rise to such extraordinary luck.

I guess however that you have no more idea about what gave you your good luck than those of us who have suffered bad luck. We are all in the same little fleet of boats being blown at the whim of the Google wind. Just because you have a big sail this week and get to the front of the fleet, don't necessarily think that your boat will not be capsized if a squall blows up.

Best wishes

Sid

Remember even a dog can only lick one at a time!

Hissingsid




msg:212228
 9:27 pm on Dec 14, 2003 (gmt 0)

Hi Claus,

Re your last mega post.

I was following all of the stuff about AppliedSemantics, broad match etc and I was hoping that you were narrowing things down to a simple explanation and then BAM! we got 5 more things to take into consideration.

I feel a bit like I've got to the end of a good novel with a very bad ending, kind of dissatisfied.

Can we discuss what broad match searching means in terms of what has happened post-Florida, please?

My understanding of it is this. Take a two-word search term, let's say widget tires (a good example, maybe, because in the UK we spell it tyres). I'm sure that some of you will be trying it with and without allinanchor because it's just too tempting. Interesting, eh!

Semantics builds a list of synonyms, antonyms, plurals, brand names etc. associated with each of the terms, then a list of all of the possible combinations of those terms, and then uses those to do the search. I don't think that it is creating an exhaustive list at present, just one that suits its purposes. It might only be using broad terms in Adsense ads on sites that it therefore wants to point you towards.
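
A small sketch of that expansion step, with made-up word lists, just to show how quickly the combinations grow and where a cap would bite:

```python
# Sketch of the expansion Sid describes: variant lists per term, then every
# combination, then a cap. The word lists are invented for illustration.
from itertools import islice, product

VARIANTS = {
    "widget": ["widget", "widgets", "gadget", "gadgets"],
    "tires":  ["tires", "tyres", "tire", "tyre", "wheels"],
}

def expand(query, limit=1000):
    terms = query.lower().split()
    lists = [VARIANTS.get(t, [t]) for t in terms]
    combos = (" ".join(c) for c in product(*lists))
    return list(islice(combos, limit))   # cap it, as suggested below

queries = expand("widget tires")
print(len(queries))     # 4 * 5 = 20 variant queries for one two-word term
print(queries[:5])
```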

However limited that search list is, it is still going to make the process of ranking more complex than a simple search, so let's limit it to, say, 1000. Now let's drop all of the algo factors that have been open to abuse in the past and use a simpler algo.

Just suppose the purpose of this was to give Adsense and Domain Park a leg up. What would you do to give them a bit of help?

I'll tell you what I would do. I would base rank on just two factors: 1. body text, which just by chance contains Adsense ads that I'm now crawling and indexing, and 2. PageRank - well, work it out for yourself.

Those conspiracy theorists who said it was all linked to high dollar Adwords might have been right but it is just once removed.

Have a read of Robert Charlton's post here [webmasterworld.com...]

What I'm wondering is whether this is a bug or something Google is going to be continuing to do as part of whatever it's doing. It certainly does/did give high PR pages carrying AdSense ads a shot at boosting their rankings for those "filtered" phrases. Or, it might mess up some pages by increasing the exact repetitions of a phrase.

Also see the info about Google's domain parking on the Applied Semantics web site.

If I were the manager of Adsense and DomainPark and my revenue was being reduced by spam, duplicates and hyphenated-generic-domain sites, I think I would be delighted to find that implementing broad matching with the new algo meant that many of these became irrelevant. It's a double whammy!

Best wishes

Sid

PS even if I'm right I don't have a clue what to do about it.

Jakpot




msg:212229
 12:59 am on Dec 15, 2003 (gmt 0)

Remember even a dog can only lick one at a time!

Even a 3-legged dog.
Regarding Google and what to do - I feel like a one-legged man at a butt-kicking contest.

kaled




msg:212230
 2:26 am on Dec 15, 2003 (gmt 0)

Regards google and what to do - I feel like a one legged man at a butt kicking contest.

A thousand words of vaguely relevant waffle on each vital page will probably help. However, if you have 1000 pages that's an awful lot of waffle.

I can see the adverts now for waffle generators. Hmm, perhaps I should write one. Or perhaps Applied Semantics/Google will provide an online service for 10 cents a word.

Kaled.

markis00




msg:212231
 4:29 am on Dec 15, 2003 (gmt 0)

Like so many other webmasters I have become greatly annoyed with how unstable the G search engine really is. There are just too many times when "Google is broke," and this is the icing on the cake. I've begun to optimize for Inktomi - why? Because Yahoo is dropping Google and switching to Ink in June. Or so I saw on another thread.

I would suggest the same to any other webmaster. The problem is, I won't receive any traffic until then ;(

Rick_M




msg:212232
 5:49 am on Dec 15, 2003 (gmt 0)

You can't imagine how many threads since Florida started where I typed up a reply and then decided to cancel - I wonder if I'll make it through this time and submit -

1. This thread is very interesting to me, because it brought up a lot of theories as to what Google is doing. There are a few I've considered (based on my own readings here and my own results) that I thought I'd throw in to fan the flames

2. Is Google broke? Well, Brett did say that 2 previous updates were rolled back. I didn't realize that. I just remember a lot of discussions about "more data" being folded into the results. I thought they just needed to get the complete data set before the results started to look good. I myself did not get hurt much during dominic, but then I got some weird results in Esmerelda that made the whole semi-penalty thread of interest to me. I made some changes related to my theories on the semi-penalty, and designed my newer pages with those ideas in mind, and interestingly, I didn't get hurt much during Florida. In fact, the few specific pages where I did get hurt have given me some interesting food for thought.

3. One thought is the idea of "organic links". What did Googleguy mean? I know I originally thought of link swapping and guest book spam as inorganic. I think there may be some type of penalty for inorganic-looking links. I think that this filter may be picking up some types of common organic links and mistaking them for inorganic.

4. Applied semantics is probably a part of this update, as most people suspect - I base that conclusion purely on the fact that singular and plural forms of words have now been combined. Why did it take Google so long to do this? I dunno. Just seems an interesting coincidence.

5. Around Dominic there was an issue with domain ownership changing hands having an effect on incoming link value. I find it interesting that Google was looking at timing of when links were added (if my memory serves me correctly and I interpreted the information properly when I first read it). It would be interesting if Google used the number of new links added during a time period as some measure of "inorganic". This is purely my own theory and I have no reasons to believe this is being done or would even be that useful.

6. I myself think that Google is broken for many types of searches. Adwords fills the void for most people, which is a nice side effect for Google. I find that I have to add more specific keywords when searching, and then I occasionally come up with nothing. I think this is more a result of the way the web is developing and the pressure it has put on search engines to try to filter out the crap.

7. My listings were minimally affected by Florida, and I am in some areas where I know the adwords bids are high. I had been hurt in two areas after Esmerelda and made some de-optimization attempts, but who knows how much that mattered and helped me this update.

8. I have concerns about the web because of the way it is developing in commercial areas. I know the things I'm doing to make money (effectively for today, who knows for how long?) and while some of the things are definitely good content for people (which is where I started), some of the things I do are not really good content - yet they make money. I'd be crazy not to do it (does everyone have a price? the stuff I'm doing isn't really that bad - my price is pretty high).

9. The reason I've probably never made it to the submit button is because I know I've sent some search queries to Googleguy and mentioned my username here. Now I'm basically admitting I'm doing things that are not good for users - what am I, nuts?

10. My own theory is that Google started with a phenomenal concept. Webmasters have learned about it and started to "game" the system. As a result, Google needs a better mousetrap. As the traps get more advanced, and the mice get smarter, the system starts to get much more complex. People start trying to make "inorganic" things look "organic", they may succeed for a while, then might get caught. Someone's totally organic stuff may get caught as appearing inorganic. Enough rambling on this - I get a headache thinking about anything computer related as being "organic"...

11. The bottom line right now is money. Reading old posts about Google's founders thoughts on advertising is enlightening. Essentially the idea was to not include any type of advertising that compromises the search engine, either the appearance, or the integrity of the results. With an IPO around the corner...

12. Does everyone have a price where they are willing to sell out? At this point, would Google's founders be willing to take all of their adwords off the site for a day, to show they are not concerned about the money they are making? That'd be crazy (but could be put into a very positive spin - especially if instead they said they were donating all their money for the day - then it is good business because they get a tax write-off if the corporate laws are similar to individual laws - then it makes good press - rambling, rambling). If Google really wasn't concerned about the money, they'd forget the IPO completely - but then that'd be crazy. Just as I know that some of my techniques are only working temporarily until a better mousetrap is built, I also think it's hard to imagine Google maintaining the number 1 spot for more than a few years. If I were them, I'd cash in on it now too, before they become a netscape in the history of the internet

13. On that futile sentiment, I may as well hit submit - I still kept my day job (not computer related), and what's a few months of extra internet income give or take?

Sorry for going off topic - (debating if I should hit submit or the upper right corner and go to sleep?)

danny




msg:212233
 6:23 am on Dec 15, 2003 (gmt 0)

If it were not for all the uproar, I would not have known -- and still can not tell from simple usage -- that there even was an update. The general results are as good, if not better than ever.

I concur -- for the searches I do, Florida was a non-event. The results are far from "perfect" (in the eye of the beholder), but they're no worse than they were a couple of months ago.

allanp73




msg:212234
 11:23 am on Dec 15, 2003 (gmt 0)

I like hearing these theories.
So far Hilltop and semantics are the two best. I feel that over-optimization and the other theories lack any proof.
The one thing that I would like to see addressed is the comparison of filtered to non-filtered terms. I've noticed that some terms are definitely filtered where others are not.
For example:
1)Dallas condos (filtered)
2)Houston condos (not filtered)

1) serps are dominated by directories and non-relevant sites. It has four sponsored listings. Strangely "Dallas condo" is not filtered. Sites rank well because they link to relevant sites or have keywords in outbound links.

2) serps have relevant sites which look very pre-Florida. It has sponsored listings, too. Sites achieve rankings based on the basics of search engine marketing. Good content, keywords in title, pr, etc.

I would like to know if anyone has had any luck getting back into the serps after being filtered out. I did notice one of my sites rise to the #38 spot recently after adding outbound links, but still this isn't as good as the #3 spot it had before Florida.

kaled




msg:212235
 11:58 am on Dec 15, 2003 (gmt 0)

Perhaps Google would like to add these definitions to their dictionary.

Overt filter [Search Engine]
An algorithm that withholds results from users.
1) Automatic : Triggered automatically but a notice is displayed allowing the user to repeat the search without the filter.
2) Manual : The user must explicitly activate the filter using special characters or a selection of switches.

Covert Filter [Search Engine]
An algorithm that withholds relevant results from users.
Covert filters may be employed to cover up a poor underlying algorithm or to increase profits by inducing users to click on sponsored links/adverts.

Kaled.

Kennyh




msg:212236
 12:11 pm on Dec 15, 2003 (gmt 0)

superscript - A Bayesian filter biased against false positives would certainly explain a great deal of what I'm seeing, i.e. unbelievable spam ranking highly for competitive keywords.

However, in the time that Florida has been running I've seen no improvement in terms of the filter recognising spam and eliminating it. In fact, in some cases, more spam is getting in. In the one case where one of my pages has improved in the last few days, it's at the expense of a couple of Amazon store pages rather than the spam above it.

As for Brett's view that this is a good update, I guess that depends on your definition of a 'good update'. I see way too much spam and way too many shopping comparison sites clogging SERPs to consider what we have at the moment to be 'good'.

Could someone who disagrees please explain to me the value of having a first page of SERPs for a keyword which starts with sponsored results, followed by Froogle results, followed by five or six shopping comparison sites and a couple of spam results, and a full page of Adwords ads?

kaled




msg:212237
 3:17 pm on Dec 15, 2003 (gmt 0)

There are two major problems with learning algos.

1) They have to be taught by humans.
2) They make mistakes while they are being taught.

Google have stated that they prefer to filter out bad sites by algo but they will still have to teach learning algos how to work by providing them with examples of what is good and what is bad.

Learning algos have a place in this world, but they are not a panacea.

Kaled.

superscript




msg:212238
 3:25 pm on Dec 15, 2003 (gmt 0)

Hi Kaled,

My thoughts exactly - the filter needs to be given many examples of the 'good' as well as the 'bad.' If the number of examples is too small, or the sample sets are subject to (entirely unintentional) skew, some strange effects may occur.

I suspect many legitimate commercial sites are being mistaken for spam sites by a faulty* filter. It looks like a conspiracy, but it isn't - I posted as much earlier.

* the term 'faulty' needs clarifying - the workings of the filter, even the algo, may be fine. But if the data submitted to it is insufficient, or skewed, it will produce unsatisfactory results.

Brett_Tabke




msg:212239
 3:55 pm on Dec 15, 2003 (gmt 0)

> hilltop

Have you actually read that document? While the tag line is "based on expert documents", it is not a road map for a new algo. It is built on the premise that you can ID "expert documents" - that is the same house that PageRank lives in, and we see what direction that is going.

Chndru




msg:212240
 4:06 pm on Dec 15, 2003 (gmt 0)

>hilltop
And what is a hilltop? An animal? Can someone link to it?

<added> found it - http://www.webmasterworld.com/forum3/20314.htm - with Brett's collection of papers

Hissingsid




msg:212241
 4:23 pm on Dec 15, 2003 (gmt 0)

My thoughts exactly - the filter needs to be given many examples of the 'good' as well as the 'bad.' If the number of examples is too small, or the sample sets are subject to (entirely unintentional) skew, some strange effects may occur.

That sounds very sophisticated, but the effects that I've seen could be as easily explained by simply switching off allinanchor for sites that match a pre-determined search term.

Perhaps we are giving them (the Google engineers) too much credit.

I'm still floundering about trying to rationalise the motivations and the methods being employed. There are two main motivations that I think could be at the bottom of all of this.

1. Overcoming a problem which would otherwise affect the flotation.

2. Pushing advertising products into growth and profitability so that a major opportunity can be demonstrated prior to public flotation.

Or a mixture of the two.

Problem
The problem is the ease with which results can be manipulated by allinanchor link text. All of those articles about Google Bombing - the term "miserable failure" ranking George Bush's resume on the White House web site #1 - must make investors nervous. "If the Google bubble is being filled with mischievous blogging, when will it pop?" might be a question in an investor's mind.

Simple Solution
If Google wanted to overcome this problem in the commercial areas that really count, then they could simply remove allinanchor from the algorithm for "protected terms" and Google Bombing would be defused for those terms.

Opportunity
If Google wants to maximise the opportunities that it has bought with AppliedSemantics - broad matching of advertising, domain parks delivering Adsense ads - then it can only do this if it reduces all of the crap floating about by virtue of the ease with which sites can be pushed up SERPs by manipulating anchor text.

So if you remove allinanchor from the equation in search terms with commercial value, you address all of these things at the same time. You target this algo shift at the areas most open to abuse while at the same time opening up an opportunity for yourself in these same areas, which coincidentally are also the terms with the biggest ad spend.

It doesn’t have to be more complicated than that, but it probably is.
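
A minimal sketch of what that could look like, assuming (purely for illustration) a fixed weight scheme and a hand-picked list of protected terms - neither of which is anything Google has confirmed:

```python
# Sketch of the "simple solution": zero out the anchor-text signal when the
# query is one of the commercially sensitive ("protected") terms. Weights and
# the term list are invented for illustration.

PROTECTED_TERMS = {"dallas condos", "cheap flights", "widget tires"}

def score(query, anchor_text_score, on_page_score, pagerank):
    anchor_weight = 0.0 if query.lower() in PROTECTED_TERMS else 0.5
    return (anchor_weight * anchor_text_score
            + 0.3 * on_page_score
            + 0.2 * pagerank)

# Same page, same signals - but for a protected term the inbound anchor text
# no longer moves it, so a Google-bombed page falls down the results.
print(score("houston condos", anchor_text_score=0.9, on_page_score=0.2, pagerank=0.4))
print(score("dallas condos",  anchor_text_score=0.9, on_page_score=0.2, pagerank=0.4))
```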

Best wishes

Sid

Just closed the quote box

[edited by: Hissingsid at 6:07 pm (utc) on Dec. 15, 2003]
