An information retrieval system includes a query revision architecture providing one or more query revisers, each of which implements a query revision strategy. A query rank reviser suggests known highly-ranked queries as revisions to a first query by initially assigning a rank to all queries, and identifying a set of known highly-ranked queries (KHRQ). Queries with a strong probability of being revised to a KHRQ are identified as nearby queries (NQ). Alternative queries that are KHRQs are provided as candidate revisions for a given query. For alternative queries that are NQs, the corresponding known highly-ranked queries are provided as candidate revisions.
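The abstract describes a pipeline: rank all queries, mark those above some bar as KHRQs, mark queries with a strong probability of being revised into a KHRQ as NQs, then build candidate revisions by keeping alternative queries that are themselves KHRQs and, for NQ alternatives, substituting their associated KHRQs. A minimal sketch of that logic, where every name, data value, and threshold is an illustrative assumption (the patent does not specify them):

```python
# Hypothetical sketch of the KHRQ/NQ candidate-revision logic from the
# abstract. Ranks, probabilities, and thresholds are all assumed values.

# Assumed query ranks (e.g. derived from behavioral statistics).
query_rank = {
    "python tutorial": 0.90,
    "python": 0.30,
    "learn python": 0.85,
}

KHRQ_THRESHOLD = 0.8  # assumed cutoff for "known highly-ranked"

# KHRQs: queries whose rank clears the threshold.
khrq = {q for q, r in query_rank.items() if r >= KHRQ_THRESHOLD}

# Assumed revision probabilities: P(user revises q1 -> q2).
revision_prob = {
    ("python", "python tutorial"): 0.4,
    ("python", "learn python"): 0.2,
}

NQ_THRESHOLD = 0.3  # assumed "strong probability" cutoff for nearby queries

def nearby_khrqs(query):
    """KHRQs that `query` is likely to be revised to (making it an NQ)."""
    return {q2 for (q1, q2), p in revision_prob.items()
            if q1 == query and q2 in khrq and p >= NQ_THRESHOLD}

def candidate_revisions(alternatives):
    """Per the abstract: alternatives that are KHRQs are kept as-is;
    for alternatives that are NQs, their KHRQs are substituted."""
    candidates = set()
    for alt in alternatives:
        if alt in khrq:
            candidates.add(alt)
        else:
            candidates |= nearby_khrqs(alt)
    return candidates

print(candidate_revisions(["python", "learn python"]))
# -> {'python tutorial', 'learn python'}
```

Here "python" is not a KHRQ, but it is an NQ of "python tutorial", so that KHRQ is offered in its place; "learn python" is already a KHRQ and is kept directly.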
So we can glean from the descriptions, especially the paragraphs in the 0040s and 0050s, the rationale behind offering further suggested (and refined) phrases right under the top ten query results, based on ambiguity, user-generated query revisions, and behavioral statistics.
But is there any reason why the same precepts can't be used to project better-focused results, returning a set of top pages that have been reranked according to what's been statistically determined? Like the top 10 results? Or maybe even the top 6?
Msg#: 3566362 posted 4:13 am on Feb 5, 2008 (gmt 0)
Here's what that suggests to me, especially keeping in mind the recent position #6 episode.
The phenomenon certainly seemed to affect KHRQs (known highly-ranked queries) - even those that triggered suggestions for NQs (nearby queries). Suppose you have a high-ranking URL that ALSO ranks near the top for one or more NQs. Is Google giving an extra boost to those URLs that would also tend to rank well on one or more revised queries, and demoting those that would not?
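This is purely the poster's hypothesis, not anything the patent states, but the proposed reranking can be sketched: boost a URL's score for the original query when it also ranks near the top for one or more revised (nearby) queries. All data, the top-10 cutoff, and the flat boost are assumptions made up for illustration:

```python
# Speculative sketch of the hypothesized boost: URLs that also rank
# well on a revised query get nudged up. Scores, ranks, and the boost
# value are assumed; this is not a description of Google's actual system.

base_score = {  # assumed base relevance scores for the original query
    "example.com/a": 0.70,
    "example.com/b": 0.68,
}

# Assumed best rank of each URL on any revised (nearby) query;
# None means the URL does not rank for any revised query.
revised_rank = {
    "example.com/a": 3,
    "example.com/b": None,
}

BOOST = 0.05       # assumed flat boost for cross-query presence
TOP_CUTOFF = 10    # assumed "near the top" cutoff

def rerank(urls):
    """Order URLs by base score plus a boost when the URL also ranks
    in the top N of a revised query."""
    def score(u):
        r = revised_rank[u]
        return base_score[u] + (BOOST if r is not None and r <= TOP_CUTOFF else 0)
    return sorted(urls, key=score, reverse=True)

print(rerank(["example.com/b", "example.com/a"]))
# -> ['example.com/a', 'example.com/b']
```

Under these assumptions, a URL absent from all revised queries keeps only its base score, so it can be passed by a slightly weaker page that does fit the revision pattern, which matches the demotion pattern the post speculates about.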
Msg#: 3566362 posted 12:38 pm on Feb 5, 2008 (gmt 0)
this is pretty much what I've been describing ( #6 ) as the possible intent
if you don't fit the pattern of ANY likely 2nd searches ( revised queries? ok... revised queries ), you don't rank for overly generic phrases that would lead there anyway. but I stick to my observation that Google only cares about and closely monitors queries that are *also* AdWords-sensitive ( incl. trust thresholds, reranking, etc. )
dunno if I got it right this time around, but I liked my idea and will likely do SEO based on this concept *grin*
only makes sense not to rank niche specialty sites above the most popular themes for a generic query. not even if they're SEO'd well
[edited by: Miamacs at 12:44 pm (utc) on Feb. 5, 2008]