londrum - 8:38 pm on Oct 18, 2011 (gmt 0)
i actually think that creating an algo should be getting easier, not harder, because half the stuff that they used to rate sites on in the old days no longer applies.
e.g. on-page stuff is practically dead now, apart from obvious things like titles and maybe pictures.
even links are going to fall by the wayside soon, because 99% of users aren't in a position to drop links. it's a webmaster thing, so it doesn't make sense to use it as a ranking signal.
it's all going to be about social signals, and they are a lot easier to count up. how many people talk about this page compared to that page? how many people bookmark this page compared to that page?
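just to show how simple that kind of counting would be compared to link analysis, here's a toy sketch. the signal names, weights and numbers are all made up for illustration, this is obviously not how google actually does it:

```python
# toy sketch of ranking pages purely on counted social signals.
# signal names, weights and counts are invented for illustration.
pages = {
    "page-a": {"mentions": 120, "bookmarks": 30},
    "page-b": {"mentions": 45, "bookmarks": 90},
}

WEIGHTS = {"mentions": 1.0, "bookmarks": 2.0}

def social_score(signals):
    # a page's score is just a weighted sum of each signal's count
    return sum(WEIGHTS.get(name, 0.0) * count for name, count in signals.items())

# rank pages from highest score to lowest
ranking = sorted(pages, key=lambda p: social_score(pages[p]), reverse=True)
print(ranking)
```

no crawling, no link graph, just tallying up who talked about what. that's the whole appeal of the idea.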
given that google follows a sizeable chunk of the web through its browser, analytics, ads, +1 buttons and everything else, the data should be falling into their laps.
other than the fact that the web has got so big now that it's impossible to crawl in its entirety (which is presumably why google has been so hot on getting us to remove dud pages and speed the rest up), why should it be any harder to rank pages than it was 5 years ago?
what has actually changed? web pages are still the same as they ever were. it's still pages, text and pictures.
if the algo is getting worse then i reckon it's because they're overcomplicating it.