Martin_Ice_Web - 8:32 am on Nov 13, 2012 (gmt 0)
taberstruths, because if you measure something you need a reference point to compare the results against — we call that normalizing. If you have 3 sites that are "preferred" for a keyword, then a test can only be done by serving those pages from the same starting point.
But what I see is that the garbage pages which replace the "good" sites have very, very thin content, little HTML code, and in most cases have nothing to do with the search itself. These sites have no real interest in giving the reader information; they look like "kid" pages, where someone picked something up in small talk and now writes it down in one sentence. Google ranks these level with good content sites (except for brand and authority sites).
This, plus the preference for fresh content that backdraft mentions, makes the high-performance AI Penguin/Panda algorithm produce these silly results.