Ralph_Slate - 4:04 pm on Aug 29, 2013 (gmt 0)
I run an information site that consistently gets beat out by Wikipedia. It is particularly painful because the Wikipedia editors have often used the corresponding page on my site as the seed for their page. However, I have to say, in many cases, due to the additional information added, my site deserves to be beaten by Wikipedia (though it does bug me that my page was used as the authority source for a chunk of information on their page). However, there are many cases where the Wikipedia article is thinner than my corresponding page, yet Wikipedia still trumps me.
I think that is what webmasters have an issue with. Yes, it is reasonable for a thin page on a consistently fat site to rank well. However, when a better page exists, that better page should get the extra edge. That is a harder problem to solve, though, because "better" can be too easily gamed on a page-by-page basis. It is safer for Google to return Wikipedia as #1 because their overall quality score trumps just about anyone else's. I would imagine that the only way to beat it is via valid links, because those are a strong indicator of quality - people voting. Not always, but usually.
I don't really see this as "brand bias". It is more like "reputation bias". Amazon is a really good site for shopping. They have an impeccable reputation. If I had to guess which site would offer the better shopping experience for a widget, I would pick Amazon over Widgets-R-Us.com, unless I had information that said otherwise.