claus - 2:12 am on Dec 14, 2003 (gmt 0)
caution, long post:
Thanks, DaveAtIFG, for post #21, and Brett_Tabke for post #25 - great to see some reason emerge out here.
Now, technical difficulties...is Google broken, and if yes, where? Imho:
So, what's all the fuss about?
Apparently, and naturally, this "stemming" / "broad match" / "semantics" thing was started with the (American version of the) English language. And of course it would be naive to think that all possible queries in this language could be handled properly from the start. My best guess is that British English (ongoing, afaik), German, French, Spanish, etc. will follow sometime later on. Google is no longer just one search engine, as everyone using dual-language searches has known for a long time, and as everyone else really should know by now.
So, that was the thing we (well, i really should speak for myself... imho, fwiw, etc.) didn't see outside the US (except for English-language searches, but you get my point, i guess). As for changes in the handling of duplicate pages: those were real, and detection has gotten more efficient, though it still doesn't catch everything. I also noted less emphasis on all the "fresh results" - not all that many blogs, forums, email lists, etc. And some hard-to-define emphasis on "authorities" and/or "hubs" (as well as possibly "news sources", although this one is strange, as it's not as in "current news stories").
So even without the "stemming" or whatever, there were a few other ingredients in the soup. The basic ranking criteria as per before Florida (mainly anchor text, and the various markup elements on-page) still seem largely unchanged - a little up here, a little down there, but overall it's not dramatic. It's just harder to decode, as those other things do add some smoke. That's why i keep saying that the basic whitehat stuff still works nicely. In fact, with the added semantics/broad match/stemming i feel more comfortable now than ever before saying "if you simply build the best site for your topic, you will become #1" - then again, that's a lot more labour than running a link campaign. It also implies a slight shift in the work fields for the SEO community, albeit one we've had some good discussions on for some time - less quick fix and more long term.
All that is... provided Google can make it work across the board. It's second to none now for... well, for those searches where it works (no sarcasm intended); "laser-like precision" describes my own experience very well for some searches - most of those i do "as a searcher", in fact. For other searches, it's just not good enough... yet.
Stemming, as well as semantics, is rule-based, i believe. It's not like it's AI or something - at least i personally don't think so. If it were AI (as in "self-learning systems", not as in "HAL"), i believe Google would be running a serious risk, and the results would be utterly useless for a decade or so. Still, when working with rule-based stuff you have to make sure you have a large set of rules, as there will always be special cases. So, this is a rollout; we're not seeing the full-blown version yet, but we seem to be very far along in the process for the American subset of SERPs, and the missing items are probably being worked on. This "broad match" mode should be expected to propagate throughout the whole Google system eventually, imho.
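To make the "rule-based" point concrete, here's a toy suffix-stripping stemmer - a hypothetical sketch in Python, nothing to do with Google's actual system. Even a handful of rules handles many common words correctly, but special cases show up immediately, which is exactly why a rule-based approach needs a large and ever-growing rule set:

```python
# Toy rule-based stemmer: each rule is (suffix, replacement),
# tried in order; the first match wins. Purely illustrative.
RULES = [
    ("sses", "ss"),  # caresses -> caress
    ("ies", "y"),    # ponies   -> pony
    ("ing", ""),     # walking  -> walk
    ("ed", ""),      # jumped   -> jump
    ("s", ""),       # cats     -> cat
]

def stem(word: str) -> str:
    """Strip the first matching suffix, keeping at least a 2-letter stem."""
    for suffix, replacement in RULES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 2:
            return word[: -len(suffix)] + replacement
    return word
```

Note the failure mode: stem("running") gives "runn", not "run" - the doubled consonant is a special case these five rules don't cover, so you'd add another rule, and then another, and so on.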
Enough of that babbling... five things to consider or input for discussion or whatever. Everything is "afaik, fwiw, imho" as usual:
I'm not going to bore you any more with this right now. I don't really feel there's anything in this post that i haven't stated elsewhere, but perhaps the headline "a fresh look", or those two other posts, gave me the energy to write it all (i hope) a bit more clearly.
Just to remove any doubt; this is all "for what it's worth", "as far as i know", and "in my humble opinion".