Forum Moderators: Robert Charlton & goodroi
Here's another example. I put up a website last week and threw a single link to it from another related site. A search for the two keywords in its domain plus the TLD (keyword1keyword2 TLD) shows nothing in Google except whois-type sites. Bing shows similar results. Yahoo, however, ranks it number one, displaying a snippet from my meta tag.
I think the Google algorithm's traditional dependence on link counts has prevented Google from displaying deep, obscure content, effectively capping how much content appears in the SERPs. Even if a page is the only matching content, Google seems to ignore it and choose not to display it. Maybe this is related to the article you read? It's something that has probably been going on since day one, if the "random surfer" scenario from the original Page/Brin Stanford paper is taken into account.
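To illustrate the point about the random-surfer model: here's a minimal Python sketch of the PageRank power iteration described in the Page/Brin paper. The graph, page names, and damping value are all hypothetical, and real Google ranking involves far more than this, but it shows how a new page with a single inbound link ends up barely above the teleportation floor, consistent with the "obscure content gets buried" behavior described above.

```python
DAMPING = 0.85  # probability the random surfer follows a link vs. teleporting

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        # every page gets the teleportation share...
        new = {p: (1 - DAMPING) / n for p in pages}
        # ...plus a damped share of the rank of each page linking to it
        for page, outs in links.items():
            share = rank[page] / len(outs) if outs else 0.0
            for target in outs:
                new[target] += DAMPING * share
        rank = new
    return rank

# Hypothetical graph: three well-interlinked "hub" pages, and a new
# site ("newsite") that receives just one link from a related page.
graph = {
    "hub1": ["hub2", "hub3"],
    "hub2": ["hub1", "hub3"],
    "hub3": ["hub1", "hub2"],
    "related": ["hub1", "newsite"],
    "newsite": ["related"],
}
ranks = pagerank(graph)
```

Running this, "newsite" settles at the lowest rank in the graph despite being the only page that might match its exact query, which is essentially the scenario in the anecdote above: one inbound link just isn't enough rank mass to surface it.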