"Any clue as to the possible role greater reliance on semantics is playing in your never ending quest for more relevant results?"
I'd say that's inevitable over time. The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible. :) Better semantic understanding helps with both those prerequisites and makes the matching easier.
So a good example is stemming. Stemming is basically SEO-neutral, because spammers can create doorway pages with word variants almost as easily as they can to optimize for a single phrase (maybe it's a bit harder to fake realistic doorways now, come to think of it). But webmasters who never think about search engines don't bother to include word variants--they just write whatever natural text they would normally write. Stemming allows us to pull in more good documents that are near-matches. The example I like is [cert advisory]. We can give more weight to www.cert.org/advisories/ because the page has both "advisory" and "advisories" on the page, and "advisories" in the url. Standard stemming isn't necessarily a win for quality, so we took a while and found a way to do it better.
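To make the idea concrete, here is a minimal sketch (not Google's actual algorithm) of how stemming lets a query term match word variants in a document, using a crude suffix-stripping stemmer invented purely for illustration:

```python
def stem(word):
    """Crude suffix-stripping stemmer, for illustration only."""
    word = word.lower()
    for suffix in ("ies", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            # "advisories" -> "advisory"; "notices" -> "notic" (crude!)
            return word[: -len(suffix)] + ("y" if suffix == "ies" else "")
    return word

def match_score(query, document):
    """Count query terms whose stem appears among the document's stems."""
    doc_stems = {stem(w) for w in document.split()}
    return sum(1 for term in query.split() if stem(term) in doc_stems)

doc = "CERT advisories list current security advisory notices"
print(match_score("cert advisory", doc))  # both query terms match via stems
```

With exact matching, "advisory" in the query would miss "advisories" in the URL; after stemming, both forms collapse to the same token and the page gets credit for both.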
So yes, I think semantics and document/query understanding will be more important in the future. pavlin, I hope that partly answers the second of the two questions that you posted way up near the start of this thread. If not, please ask it again in case I didn't understand it correctly the first time. :)
Well, obviously, if some places are seeing 64 on www, then it hasn't moved zero; it has moved some, just not in your area.
Sorry, but that's not how it works at all. They've taken the datacenter dns away to lend credence to arguments like the one you just made, which any senior member of this forum will tell you hold no water.
Scroll back to msg. 147 in this thread (quoting GG from Brandy2), where the forecast was changed and the advice was to anticipate a rollout over multiple days.
Comparing 64 index to pre-Florida
For my fav kw1 kw2 combo, 64.x.x is most definitely 'not' pre-Florida SERPs, but that's fine. Looking around with a user hat on, I'm liking 64 just fine so far, so in the spirit of the month, laissez les bons temps rouler (let the good times roll).
The datacenter dns being removed, in a way, removes a lot of accountability for them. They can now say one thing and do another with most people being none the wiser.
Let's take off our soft shoes, folks; we're not doing the data center shuffle any more. We've been told it'll be 64. If and when we're told it'll be something other than 64., that's one thing, but as of right now things are shifting and we'll not be dancing back and forth and around and around from one datacenter to the other.
We have 64. to look at, when that's been on www all over for maybe 12 hours steady it'll be a wrap but until then we're not doing data centers.
Let's back off and give some air time to the people here who want to look at 64. and have some serious analysis discussion.
This brings up the old issue of "site" vs. "page". Historically, Google has ignored the concept of site and used PageRank. Is Google now embracing the concept of "site" directly, as opposed to indirectly via linking structures and PR? Do we have any solid evidence either way?
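For readers unfamiliar with the page-level mechanism being contrasted here, this is a minimal power-iteration sketch of PageRank over a toy link graph (a simplification of the published idea, not Google's production system), where rank attaches to individual pages, not sites:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Each page keeps a teleport share, then receives link shares.
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "C" accumulates the most rank
```

Note that nothing in this computation knows which site a page belongs to; any "site" effect would have to emerge indirectly from the linking structure, which is exactly the question being asked.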
Not sure I'd call it solid, but it seems a lot of shopping searches have been populated by pages from sites like Amazon, and I'd sure call those shopping "sites."
In the SERPs I watch, I am seeing quite a bit of reduction in sites which have two pages on the front page in this update.
The authority site would typically have SERP positions one and two, but now has #1 and #11 or #1 and #14.
I see a lot less of what I consider doorway pages but still see too many of those pages generated by searching at a random competitive search engine and publishing the results.
In the major technical area I monitor, things have improved greatly. The 89 pages which were missing from the top 100 during Florida have not all returned but about half of them are now showing up and several which truly are the authority sites have moved from about #50 pre-Florida to #25 post-Austin and now appear to be up around #12-15 in Brandy.
This still does not return all of the important technical information which an engineer or scientist will be looking for but it does give them pointers to a different way of finding that information.
Perhaps there has been a lot of work done in the area of the algo which deals with route optimization?
A couple of years ago there was a lot of talk about search engines moving to "themes" in order to deal with the load of doorway pages and spam that was arriving back then (AltaVista was nearly buried in it). Now it looks like Google has taken it up again and is using applied semantics to better understand the "theme" of the site.
For example, the "Systems Engineering Body of Knowledge" document (SEBOK) of the International Council on Systems Engineering (INCOSE) would rank very high on that scale due to the "perceived" fact that it contains pointers directly to the entire SE Body of Knowledge.
By providing the link to the SEBOK, all other important documents/web pages related to Systems Engineering could be found by the shortest possible route. A link to the Institute of Physics (IOP) page about Distributed Systems Engineering, on the other hand, would be expected to provide some useful information, but much of that would already be reflected in the SEBOK and the IOP page would therefore have a lower route optimization score.
The ability to recognize the optimal size and contents of a body of knowledge and to optimize the route or paths which must be followed to acquire the entire contents of that body of knowledge is a rudimentary exercise in modeling but not a trivial undertaking with a data set the size of the entire Internet.
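The "route optimization" idea described above can be sketched as a graph computation: score each page by how short its link paths are to every document in a body of knowledge. This is a purely illustrative model of the poster's theory, with a made-up scoring function (1 divided by 1 plus hop count) and a hypothetical link graph; it is not a known Google mechanism:

```python
from collections import deque

def shortest_hops(links, start):
    """BFS hop counts from `start` over a dict of page -> outlinks."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        p = queue.popleft()
        for q in links.get(p, []):
            if q not in dist:
                dist[q] = dist[p] + 1
                queue.append(q)
    return dist

def route_score(links, page, body_of_knowledge):
    """Sum of 1/(1+hops) over the body of knowledge; unreachable docs add 0."""
    dist = shortest_hops(links, page)
    return sum(1.0 / (1 + dist[doc]) for doc in body_of_knowledge if doc in dist)

links = {
    "SEBOK": ["req", "design", "test"],  # hub linking to the whole body
    "IOP":   ["design"],                 # covers only one topic
}
body = ["req", "design", "test"]
print(route_score(links, "SEBOK", body) > route_score(links, "IOP", body))  # True
```

Under this toy scoring, the hub that reaches the entire body of knowledge in one hop outscores the page that reaches only part of it, which matches the SEBOK-versus-IOP intuition in the post.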
It is almost like a handful of the old top ten sites (which would rank in the top if there was no filter) were mixed into the Austin filtered results.
I completely agree. I've done a number of "city" real estate type searches across the US, and it seems like there is a trend of higher-PR real estate sites with pretty good on-page optimization leading the pack, followed closely by the Florida/Austin-type directories. It appears that makemetop may have a good point that the threshold for being considered an authority site has been lowered.
The devoted real estate sites are far better optimized for the "city" real estate query. Now, assuming the authority threshold has been reduced, they can compete.
Route optimization is the mechanism by which pages and documents are ranked relative to their ability to provide the most efficient overall path to the definition of a complete body of knowledge.
But does the user want the "complete body of knowledge"?
Say I'm thinking of buying the Widgetco Widgetmaster 3000. Which of the following would be most relevant:
- A Widget superstore with hundreds of pages about all sorts of Widgetco products, except the Widgetmaster 3000, which they don't stock.
- A Mom & Pop outfit selling all sorts of Widgets, Gizmos and Ubiqs, but which has ten years' experience installing the Widgetmaster 3000.
- A blog that, amongst the soporific rubbish, has a detailed description of the author's bad experiences with the Widgetmaster 3000, explaining why they'd never buy one again and suggesting several alternatives.
If Google wants to drop all deep links and return only index pages, then site analysis makes sense. Otherwise it simply hides potentially useful stuff.
I still think this is the evolution of Florida and Austin, but I don't think stemming/semantics plays that much of an effective role. I see many results where pages, including my own, are ranking well based on <title> and anchor text, not content. I'll sticky you an example.
They may or may not. It is hard to tell from a single simple search (another hint) but route optimization, like PR and anchor text, and proximity, and stemming, and localrank and about 100 other things, only plays a small role overall in the algo but it can have a huge effect.
Under this theory, whoever is most efficient in providing direct or nearly direct access to the largest volume of the most important and most relevant information, without negatively impacting the other factors, should and does get a bump in rank.
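The claim that any one signal "only plays a small role overall" yet "can have a huge effect" can be illustrated with a toy weighted blend of signals. The signal names and weights below are entirely hypothetical, chosen only to show how a low-weight factor still reorders results that tie on the big factors:

```python
def blended_score(signals, weights):
    """Weighted sum of normalized signal values in [0, 1]."""
    return sum(weights[name] * value for name, value in signals.items())

# Hypothetical weights: the "small" signal gets only 10% of the blend.
weights = {"pagerank": 0.4, "anchor_text": 0.3, "proximity": 0.2, "route_opt": 0.1}

page_a = {"pagerank": 0.9, "anchor_text": 0.8, "proximity": 0.7, "route_opt": 0.1}
page_b = {"pagerank": 0.9, "anchor_text": 0.8, "proximity": 0.7, "route_opt": 0.9}

# Identical on the big signals; the low-weight factor still breaks the tie.
print(blended_score(page_b, weights) > blended_score(page_a, weights))  # True
```

That is the general shape of the argument: among pages that are already close on the dominant factors, even a lightly weighted signal determines who lands on top.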
Hm, yes, but to add to/clarify some of these points:
Route, optimization, mechanisms, cover, (by default) documentative abilities when ranked according to their relationship; (num root) with the efficient paths. This is the means by which PDF documents and their overall internet presence (sic) are established via semi-autocratic knowledge mechanisms. A prime example of this is that the systems engineering (INCOSE) rankings on the forefront, partially-peaked and indexed according to the traditional dampening factor, as yet perceived, focuses on the 64.x index, which is high enough on the paradigm scale for containment of relevancy pointers.
;)
The whole idea of LSI and applied semantics is in determining the meaning of a "document" (a term we've heard before in this very thread) which suggests an entire site.
I fail to see any logic whatsoever in that assertion. You're just spreading misinformation.
Google has always used the word "document" to refer to an independent file. If you don't believe me, search Google for "The Anatomy of a Large-Scale Hypertextual Web Search Engine".
I can find 7 different IP addresses coming out of 216 with three different search result possibilities, one of which was mentioned above. Three of those are identical to those from 64. I think this just means it's taking some time.
I don't think we will know it's over until all IP addresses from all datacenters show the same results. Until then, let's go on the assumption that Googleguy was being straight with us and discuss the effects of the update.
I was really starting to enjoy this thread again until we started comparing datacenter results. I think we should take the moderator's advice and wait until we see things settle down.
Exactly, and thank you. We'll not be comparing datacenter results.
All this '216 on www from the UK' and 'it's 64. from here' talk is meaningless white noise. That is just the normal cycling of the datacenters.
Exactly, and we'll not be doing any more reporting of data center results in this discussion, which is about the update.
The whole idea of LSI and applied semantics is in determining the meaning of a "document" (a term we've heard before in this very thread) which suggests an entire site.
One portion of the LSI paper that relates nicely to this concept is IDF, Inverse Document Frequency. While "document" refers to a single document, consider that some folks believe that increasing the breadth of a site, as well as the vocabulary used, can help with rankings.
Is there a possibility that the frequency factor could be taken into account across an entire site as well as on individual pages within the site?
If so, are the phrases that are downgraded more important from a search volume standpoint?
Yes, yes across many sites for me.
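To pin down the IDF question raised above, here is a small sketch of classic IDF alongside the speculative site-level variant, where each "document" in the collection is the concatenated text of a whole site. The toy pages and the grouping into sites are invented for illustration; whether any engine actually computes frequency at the site level is, as the posts note, an open question:

```python
import math

def idf(term, documents):
    """log(N / df): terms that are rarer across the collection score higher."""
    df = sum(1 for doc in documents if term in doc.split())
    return math.log(len(documents) / df) if df else 0.0

pages = ["widget prices", "widget reviews", "gizmo repair guide"]
sites = [" ".join(pages[:2]), pages[2]]  # group the first two pages into one "site"

print(idf("widget", pages))  # common at the page level -> lower IDF
print(idf("widget", sites))  # appears in only 1 of 2 "sites" -> higher IDF
```

The same term can carry different weight depending on whether the collection's unit is the page or the site, which is exactly why the choice of "document" matters to this theory.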