http://news.com.com/2100-1032_3-5145549.html?tag=nefd_top
I simply cannot see how an algorithm that factors in linking can survive long term. I understand the concept: since Google cannot measure the popularity of a site via human "clicks" on a search result, they're trying to approximate the same "popularity" signal using links. More links suggest higher popularity, and so on.
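To make that concrete, here's a rough sketch of the kind of link-based scoring I mean - a simplified PageRank-style iteration where a page's score comes from the scores of the pages linking to it. The toy graph is made up, and the 0.85 damping factor is the number from the original PageRank paper, not Google's live algorithm:

```python
# Minimal sketch of link-based popularity scoring (simplified
# PageRank-style iteration). Toy graph and damping factor are
# illustrative assumptions, not Google's actual formula.

def link_scores(graph, iterations=20, damping=0.85):
    """graph: {page: [pages it links to]}; every target is also a key."""
    n = len(graph)
    scores = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new = {page: (1.0 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            if not outlinks:  # dangling page: spread its score evenly
                for other in graph:
                    new[other] += damping * scores[page] / n
                continue
            share = damping * scores[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        scores = new
    return scores

# A tenured site with inbound links outranks a new one with few -
# exactly the long-term competition problem in point 1 below.
toy_web = {
    "tenured.com": ["new-site.com", "directory.com"],
    "directory.com": ["tenured.com"],
    "new-site.com": ["tenured.com"],
}
print(sorted(link_scores(toy_web).items(), key=lambda kv: -kv[1]))
```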
This has so many long-term problems, though:
1. No long-term competition. As the web grows and the number of links to existing sites grows, it will be difficult, if not impossible, to see *new* sites in search results - the new site will be competing against tenured sites with thousands of links. Entire keywords and search phrases could be closed off to new sites.
2. Link farms are pointless. The notion of adding my URL to a page that is already made up of 100 other URLs doesn't seem appealing or useful to my users.
3. Results for many searches are being dominated by directories (link farms) or affiliate sites (the sites that have high PR because they've traded 20,000 links). Google shouldn't be a directory of directories. With the emphasis on PR link counts, eventually all searches will return directory pages, affiliate pages, or other pages that require a reciprocal link (I suppose this ties into my #2 point).
4. Finally, a computer algorithm will never be the best determinant of site popularity - people will. I liked the DirectHit idea of measuring click-throughs and adjusting search results based on them (see the sketch after this list). I can see fraud in that model too. But essentially, PR is a measure of the popularity of a site amongst web programmers... the people who know how to write an "A HREF" tag. Is the measurement of site popularity amongst us the true measure of relevancy to the *average* public?
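Here's the kind of click-through re-ranking I have in mind for point 4 - blend the link score with observed CTR from the result pages. The blend weight and the counters are invented for illustration; this isn't DirectHit's actual formula:

```python
# Sketch of DirectHit-style re-ranking: mix link-based score with
# observed click-through rate. The 50/50 blend is an assumption.

def rerank_by_clicks(results, clicks, impressions, blend=0.5):
    """results: [(url, link_score)]; clicks/impressions: per-URL counts."""
    def ctr(url):
        shown = impressions.get(url, 0)
        return clicks.get(url, 0) / shown if shown else 0.0
    return sorted(
        results,
        key=lambda r: (1 - blend) * r[1] + blend * ctr(r[0]),
        reverse=True,
    )

# A new site that users actually click can overtake a tenured one.
results = [("tenured.com", 0.9), ("useful-new-site.com", 0.2)]
clicks = {"useful-new-site.com": 80, "tenured.com": 5}
impressions = {"useful-new-site.com": 100, "tenured.com": 100}
print(rerank_by_clicks(results, clicks, impressions))
```

Of course, the same fraud risk applies: repeated fake clicks on one URL inflate its CTR just as link farms inflate link counts.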
I can see PR and links going the way of the keyword meta tag unless its weight is dropped or someone comes up with a better way. Google link bombing is a big red flag of things to come :P
Just my $.01.
Google wasn't the first to factor outside links into its ranking algorithm. AV/Inktomi/FAST all give some weight to outside links, because it's the easiest measurement of how "popular" a site is.
It's an inherent imperfection. I thought Google had a better take on site popularity because it gave more weight to more authoritative sources: Microsoft, Yahoo, etc. But people found methods to accumulate enough less-popular links to gain an artificial boost.
As Google continues to filter sources that are known for artificial link development, innocent sites also pay the penalty.
I think the only answer is a "moving" algorithm, where the weighting changes over time, bringing fresh sites to the top. Combined with this, the engine tracks click-throughs from the SERPs and historically tracks and tunes its own popularity data into the algorithm, so that over time the truly "popular" sites rise to the top. I also envision "anonymous" crawlers built into the system.
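A rough sketch of what I mean by a "moving" algorithm - decay the weight of old link profiles over time and fold historical click-through data back in. The 180-day half-life and the 50/50 blend are invented numbers, just to show the shape of it:

```python
# Sketch of a "moving" algorithm: link score decays with link age,
# while accumulated real-user CTR is tuned back into the ranking.
# Half-life and blend weights are illustrative assumptions.

import math

HALF_LIFE_DAYS = 180  # assumed decay rate for the link-based score

def moving_score(link_score, ctr_history, link_age_days):
    # Older link profiles count for less, giving fresh sites a window.
    freshness = math.exp(-math.log(2) * link_age_days / HALF_LIFE_DAYS)
    # Historical CTR (0..1) lets truly popular sites hold their rank.
    avg_ctr = sum(ctr_history) / len(ctr_history) if ctr_history else 0.0
    return 0.5 * link_score * freshness + 0.5 * avg_ctr

# Old site: stale links, but steady real-user clicks keep it up.
print(moving_score(link_score=0.9, ctr_history=[0.6, 0.7], link_age_days=720))
# New site: small link score, but nearly full freshness weight.
print(moving_score(link_score=0.3, ctr_history=[], link_age_days=10))
```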
The engines' biggest problem is publicly releasing the basic weighting/workings of their algorithms. Google has been the easiest algorithm to beat of any SE that has ever existed - no wonder so many say it is the best engine... :)