Forum Moderators: Robert Charlton & goodroi
1. Analyze the user's search terms to find out whether they are looking for information or for a website that offers a particular service.
2. Apply different ranking criteria for different types of websites.
2.1 Informational websites
For informational websites (the original content of the web), the current Google algorithm can still be used. This algorithm involves links from other websites, keywords, titles/meta tags, and generally keyword density and the position of relevant keywords within the content.
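Just to make the keyword-density part concrete, here is a toy sketch of that one signal (illustrative only - a real informational ranking combines this with links, titles, and many other factors):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words in `text` that match `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# e.g. keyword_density("cheap hotels and more hotels", "hotels") -> 0.4
```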
2.2 Service websites
If the search query indicates someone is looking for a service on the internet - e.g. they want to buy or sell stuff, or are looking for a house, car, or date - mainly the Alexa rank or similar factors should be taken into account for the first results.
In case 2.2 it is irrelevant what the titles and meta tags look like and who is linking to the website... Google could even ignore all that for those queries - and not even care about duplicate content or whether anyone is trying to fool search engines - the busiest website is usually the most popular one, in contrast to informational websites. The most accurate and scientific information is often hidden somewhere, as scientists usually don't care about search engine rankings; they are too busy with more important stuff.
If you want to sell a car, for instance, you usually want the website with the most traffic and not the best-optimized website... just to give an example. Therefore I find it ridiculous that the title and anchor texts, as well as the alt tags of images, still make such a big difference.
I think the Alexa rank is the most important factor for those types of queries. But Google probably has its own "Alexa rank" algorithm and knows itself which websites are busy and which ones aren't.
Additionally, Google could also take social bookmarking results into account (not blindly, though).
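The two-step idea above could be sketched roughly like this (the term list, signal names, and weights are all made up for illustration - this is obviously not Google's actual algorithm):

```python
# Hypothetical terms that hint the searcher wants a service, not info.
SERVICE_TERMS = {"buy", "sell", "rent", "book", "cheap", "dating", "hotels"}

def classify_query(query: str) -> str:
    """Step 1: guess whether the searcher wants info or a service."""
    words = set(query.lower().split())
    return "service" if words & SERVICE_TERMS else "informational"

def rank(results, query):
    """Step 2: order results by traffic for service queries,
    by classic keyword/link relevance for informational ones."""
    if classify_query(query) == "service":
        # busiest site first (an Alexa-style traffic estimate)
        return sorted(results, key=lambda r: r["traffic"], reverse=True)
    # classic relevance: keywords, titles, inbound links
    return sorted(results, key=lambda r: r["relevance"], reverse=True)
```

A real classifier would of course need far more than a word list, but the point is the switch: once intent is known, different signals take over.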
Thanks for listening!
And in the fairly short term, this should kick the bovine excrement (which is to say, the entire content) out of those blasted sites that optimize for, say, "Anniston Hotels"/"Boston Hotels"/"Charleston Hotels"/"Denver Hotels"/"Etc. Hotels" (as fronts for HotelNow.com) and direct people to the hotels themselves.
However, only if you look for a specific hotel. This is similar to trademark issues. I think names and trademarks should always be shown first in searches in general.
However, if you look for "hotel townname", the search results should show what is most relevant to the user:
In this case:
(a) the most popular hotel
(b) the busiest hotel website from that area
(c) the most popular hotel booking system (e.g. one that offers huge discounts and gets you the hotel cheapest).
(d) a review website covering hotels from that area - the one with the most reviews, not just a dummy website that has placeholders for all hotels in all towns.
And all of those factors are independent of keywords and linking - just from analyzing website traffic and user behaviour, as well as social bookmarking features and user reviews.
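To show what "independent of keywords and linking" might mean in practice, here is a hypothetical score built only from usage signals - the field names and weights are invented for the example:

```python
def usage_score(site):
    """Combine traffic, user reviews, and social bookmarks into one
    score; no keywords, titles, or link text involved at all."""
    return (0.5 * site["daily_visits"]
            + 0.3 * site["review_count"] * site["avg_rating"]
            + 0.2 * site["bookmarks"])

sites = [
    {"name": "busy-booking-site", "daily_visits": 9000,
     "review_count": 400, "avg_rating": 4.2, "bookmarks": 1200},
    {"name": "placeholder-directory", "daily_visits": 300,
     "review_count": 0, "avg_rating": 0.0, "bookmarks": 10},
]

# The busy site with real reviews outranks the empty placeholder
# directory, no matter how well the latter is keyword-optimized.
ranked = sorted(sites, key=usage_score, reverse=True)
```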
Once again: the primary search engine algorithms are old-fashioned. None of the results above have ANYTHING to do with keywords, title, meta-tags and link structures.
You always need to ask yourself "what does a user want to see" and not "what keywords fit best" or "what is linked best".
At the moment keywords and their positions as well as links still count too much - which makes the whole web go insane.
Google cannot make the semantic leap to doing what you'd like; even a human being, faced with a search term alone, would often have no clue whether the searcher wanted info or a service or both... let alone match the appropriate websites offering info, services, or both.
SEs have got much better at 'intuiting', and if rumors prove true, this year's developments in personal search will be a giant leap forward in the process.
But maybe you are making suggestions in the wrong place; Google already does these things better than its rivals - maybe they need the suggestions, to keep competition alive ;)