Forum Moderators: Robert Charlton & goodroi
If so, how would they rank sites?
What do you think the advantages and disadvantages of doing such a thing would be?
When do you think this may happen? 1 year, 10 years?
I personally feel that links have gotten Google where they are today, and they are still perceived as the best, most accurate search engine around. So if they stop using links to rank sites, they will have a serious hurdle to overcome, and it will not happen for quite some time.
1. PR - and the mere fact that a link exists
2. Anchor text
3. Surrounding text
4. Page Title on linking page
5. Age of links
6. How rapidly links appear
7. Whether linking sites are "affiliated"
...and more, I'm sure. That's a lot of factors to just go away. However, if computerized analysis of language matures significantly over the next 5 years or so, then link analysis may be downplayed somewhat, at least beginning with English-language pages.
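For factor 1 above, the core idea behind PR is that a link is a "vote" whose weight depends on the rank of the voting page. Here is a minimal sketch of PageRank-style power iteration over a toy link graph; the damping factor of 0.85 and the three-page graph are illustrative assumptions, not Google's actual parameters:

```python
# Minimal PageRank sketch (power iteration) over a toy link graph.
# The 0.85 damping factor and the example graph are illustrative only.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page shares its rank equally among its outlinks.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

In this toy graph, page "c" ends up with the highest rank simply because two pages link to it while the others get one inbound link each, which is the "mere fact that a link exists" effect in miniature.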
No, like tedster said, probably 90% of the algo is based on links--one way or another.
The huge success of Google was largely based on their incoming links algorithm,
one which depended on those 'votes' as a best measure of the value of a web page.
This being a clear 'known', it obviously led to abuses, and I presume G and Y have acted accordingly.
Half of the howling pains on these forums comes from site owners who didn't play fair.
I am reminded of a jaywalker who wants to know why all the other jaywalkers
didn't get struck by a bus. Or worse yet, why their Ciagra Farmacia doesn't get into DMOZ.
-Larry
Perhaps we are just starting to see the new approach, with personalisation and more advanced AI guessing at the actual meaning and context of what you are searching for... but for sure, what we see today is the very tip of the iceberg in terms of what tomorrow will bring...
What do you think the advantages and disadvantages of doing such a thing would be?
User data is more democratic and less open to manipulation. Right now, only webmasters can vote.
When do you think this may happen? 1 year, 10 years?
Hard to say ;-)
Google started from ideas like PageRank and anchor text. Now they have more factors. Not only inbound links but also outbound links are used in ranking, and not all the advantages of using outbounds have been exploited - I can still imagine many ways of interpreting outbound links to detect spammy sites that Google obviously does not use yet.
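As a purely hypothetical illustration of the outbound-link idea, one simple signal would be the fraction of a page's outbound links that point into a known set of suspect domains; the function name, the suspect set, and any threshold are my own assumptions, not anything Google has documented:

```python
# Hypothetical sketch: score a page by how much of its outbound linking
# targets known-suspect domains. The suspect set and any demotion
# threshold are illustrative assumptions, not a documented Google signal.

def outbound_spam_score(outbound_domains, suspect_domains):
    """Fraction of outbound links pointing at known-suspect domains."""
    if not outbound_domains:
        return 0.0
    hits = sum(1 for d in outbound_domains if d in suspect_domains)
    return hits / len(outbound_domains)

suspect = {"pills-cheap.example", "casino-farm.example"}
score = outbound_spam_score(
    ["pills-cheap.example", "casino-farm.example", "news.example"],
    suspect,
)
```

A real system would of course look at far richer structure (who the suspect domains link back to, anchor text patterns, and so on); this only shows that outbound links carry exploitable signal at all.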
There will be more sophisticated factors used in the future, but links will always matter, exactly as on-page factors still matter, though less than in the old AltaVista days, when simple keyword stuffing was enough to get to #1 for competitive keyphrases.
And there is little point in devising overly complicated ranking factors, as what the user searches for is content, and content is obviously an on-page factor. Sometimes adding too many extraneous factors only increases the noise.
The third generation of search engines will be based on user data.
Why do you think G is pushing their toolbar so hard?