Robert_Charlton - 12:23 pm on Dec 15, 2012 (gmt 0)
I'm a little confused by this thread, as it seems to want to look at where the algo's going in broad conceptual terms, and I applaud that intention... but it's constantly drifting into minutiae, and I'm wondering why.
I'm not seeing core terms like "user engagement" in this discussion, and it should be there (engagement obviously influenced by intent). Is it because this factor is by now taken for granted, or, for some reason, has this discussion just not gotten to it yet? We've been alluding to user engagement fairly steadily in this forum for over two years now, and I take it as almost a given. It certainly explains a lot of rankings I've been seeing. Originality is another such factor.
Why, then, are we even mentioning trivial tidbits like meta keywords? IMO, those are ancient history, or at best minuscule and isolated points in a very big picture.
Perhaps a beat or two in the discussion has been missed....
The advent of Caffeine, coupled with the speed and power of phrase-based indexing, has enabled Google to identify semantic relationships among the content of pages much more closely than before, improving the assessment of links, and enabling the association of sites and pages with user behavior and query intent over time. Caffeine has enabled Google to bring together a mind-boggling number of factors in evaluating its results.
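To make the phrase-based indexing idea concrete, here's a toy sketch. This is not Google's actual implementation; the documents, scoring weights, and bigram-only phrases are all invented for illustration. The point is just the mechanism: index documents by phrases, use phrase co-occurrence to find related phrases, and let those relationships broaden matching beyond the literal query.

```python
from collections import defaultdict
from itertools import combinations

def bigrams(text):
    """Split text into two-word phrases (a crude stand-in for 'good phrases')."""
    words = text.lower().split()
    return {" ".join(p) for p in zip(words, words[1:])}

# Tiny invented corpus, purely for illustration.
docs = {
    "d1": "buying running shoes online guide",
    "d2": "running shoes review best running shoes",
    "d3": "marathon training plan running schedule",
}

index = defaultdict(set)    # phrase -> set of documents containing it
cooccur = defaultdict(int)  # (phrase, phrase) -> co-occurrence count
for doc_id, text in docs.items():
    phrases = bigrams(text)
    for ph in phrases:
        index[ph].add(doc_id)
    for a, b in combinations(sorted(phrases), 2):
        cooccur[(a, b)] += 1

def related(phrase):
    """Phrases that co-occur with `phrase` in at least one document."""
    return {b if a == phrase else a
            for (a, b), n in cooccur.items()
            if phrase in (a, b) and n > 0}

def score(query_phrase):
    """Toy scoring: 2 points for the query phrase itself, 1 per related phrase."""
    scores = defaultdict(int)
    for d in index.get(query_phrase, ()):
        scores[d] += 2
    for rel in related(query_phrase):
        for d in index.get(rel, ()):
            scores[d] += 1
    return dict(scores)
```

With this sketch, `score("running shoes")` surfaces d1 and d2 (which contain the phrase or phrases that co-occur with it) while d3, which merely shares the word "running", scores nothing. That's the rough shape of how phrase relationships can tighten topical matching.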
Very briefly, as it's late at night... of course the New York Times or Amazon are going to outrank a personal site on queries where they explore roughly the same subject matter. User intent isn't a static thing... In ecommerce, user intent is influenced by where a searcher might be in the buying cycle.
In relation to a product or a subject of inquiry, the user's needs evolve: as the user gains experience, new needs and curiosities about the topic will want to be satisfied. A good article ought to send a site visitor off on a deeper exploration. If a site has the depth, the visitor will stay on that site, and return.
In personalized results, users who have site preferences probably have them because the site answers their questions over time. Chances are that site size, diversity, and depth of content for many kinds of material does matter.
For non-personalized results... given a basic quality level... the likelihood a site will satisfy a query also increases with size. A single journalist or small group of writers, sufficiently talented, focused, and hard-working, can compete with a large organization. I've seen it happen, but not everyone can pull it off. Nate Silver, though, was so good he got bought by the NY Times.
Links most definitely help. I see their influence every day. And site performance does matter. But, for some sites and some queries, I think that Google is no further along than it was a few years ago. There's no one algo. It's a statistical model, attempting... from what I can tell... to satisfy a range of intentions for a given query, with those prioritized by user demand. Personalized results may be more focused.
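The "range of intentions... prioritized by user demand" idea can be sketched as a simple blended scoring model. Everything here is hypothetical: the intent categories, demand shares, and per-page relevance numbers are invented to illustrate how a single ranked list could serve a mix of intents at once.

```python
# Estimated share of searchers holding each intent for one query (invented).
intent_demand = {"buy": 0.5, "research": 0.3, "support": 0.2}

# How relevant each candidate page is to each intent, 0..1 (invented).
relevance = {
    "store.example":  {"buy": 0.9, "research": 0.2, "support": 0.1},
    "review.example": {"buy": 0.3, "research": 0.9, "support": 0.2},
    "forum.example":  {"buy": 0.1, "research": 0.4, "support": 0.9},
}

def blended_score(page):
    """Demand-weighted sum of per-intent relevance."""
    return sum(intent_demand[i] * relevance[page].get(i, 0.0)
               for i in intent_demand)

# Final list: pages that satisfy the dominant intent rise, but pages serving
# minority intents still earn a share of the ranking.
ranking = sorted(relevance, key=blended_score, reverse=True)
```

In this toy model the store page wins because "buy" demand dominates, yet the review and forum pages aren't shut out; shift the demand weights and the ordering shifts with them, which matches the sense that results are "prioritized by user demand" rather than produced by one fixed algo.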
Where Google had dropped host crowding in favor of "brand authority" for a while to test a different way of emphasizing pages, I see those results evolving over time as the data is accumulated. But those, too, are clearly prioritized by user demand. Why else would Google try such a radical experiment?
...many people... are in some ways crushing (or at least suppressing) their own rankings by constantly clicking on the competition for the phrases they want to rank for?
...I can't say that this hasn't crossed my mind, but I'd look at the other side of it... what happens if you just click your own site and leave too quickly too often? That might skew things more, though I doubt it would be a major influence... certainly not enough to explain why exact match links seem to be becoming a lesser factor. User engagement, probably segmented by niche, IMO is a better explanation.
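A quick sketch of what "user engagement, segmented by niche" might mean in practice. All the sites, niches, and dwell times below are invented; the point is only that a raw engagement number (here, average dwell time) means little until it's normalized against the norm for its niche, since a 40-second visit might be strong for a recipe site and weak for a research site.

```python
from statistics import mean

# site -> (niche, observed dwell times in seconds) — all data invented.
visits = {
    "recipes.example": ("recipes", [35, 50, 40]),
    "howto.example":   ("recipes", [20, 25, 15]),
    "papers.example":  ("research", [300, 240, 360]),
}

# Pool dwell times per niche to establish each niche's baseline.
pooled = {}
for site, (niche, times) in visits.items():
    pooled.setdefault(niche, []).extend(times)
niche_baseline = {niche: mean(times) for niche, times in pooled.items()}

def engagement(site):
    """Site's mean dwell time relative to its niche; >1.0 = above average."""
    niche, times = visits[site]
    return mean(times) / niche_baseline[niche]
```

Under this normalization, recipes.example scores above 1.0 and howto.example below it, even though both would look weak next to the research niche's raw numbers. That kind of per-niche segmentation is one plausible reason engagement could displace exact-match links without distorting across verticals.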