Sgt_Kickaxe - 10:07 am on Oct 24, 2011 (gmt 0)
This is a major design flaw (one which I assume is due to the sheer size of the data that needs to be calculated). But until this is fixed, any high-quality content we have is getting superseded by the fact that there's some low-quality content or structure "somewhere" on our sites.
I couldn't agree more! Let me explain why.
As Google breaks away from evaluating on a page level in favor of slightly adjusting page A's rankings based on the content of page B, it is ignoring the fact that not every page is a home run. I do see the wisdom in grading a source as a whole; in this case, however, it works against the desired goal.
e.g. My website has a lot of pages; some of those pages are fantastic, and some are boring but necessary to support the others. Each page, however, is a complete work, and even the worst of sites is capable of having a diamond in the rough; my site is no exception. While it may seem like a good idea to come up with an overall score for my website, as if judging a book, it's a terrible idea if your goal is to find diamonds in the rough, something Google used to be good at. It has the effect of burying good pages in the rankings, and that's not good for anyone.
~ Do you read every page of a newspaper? No, you pick your favorite section first, and some sections are junk to you. Google has seemingly forgotten that. The argument "but what if the newspaper is trashy with a bad rep, would you search for that page?" doesn't hold water, because I have Google do the searching for me. I want diamonds, and currently they aren't turning up as frequently as they should, or used to.