SevenCubed - 10:32 pm on Oct 10, 2012 (gmt 0)
I was thinking more along the lines of a site being built to something like the 1000 signals that Bing says it parses, compared to the 200 that Google says they use. In other words, the site's technical quality (mostly) exceeds their ability to capture it within the confines of, say, their PR algorithm. I've seen new sites launched that then caused a cascade of PR updates for other sites in that niche that shared the SERPs with the new quality site. That's probably why toolbar PR updates are so infrequent. That's a whole other topic though. A rough sketch of that cascade effect follows below.
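To illustrate the cascade idea, here's a minimal toy sketch of classic PageRank power iteration (not Google's actual implementation, and the site names are hypothetical). The point is only that PR is a global computation over the link graph, so introducing one new well-linked page shifts the scores of the existing pages in that niche, not just the newcomer's.

```python
# Toy sketch: why one new, well-linked page can shift scores across a niche.
# Not Google's real algorithm; just textbook PageRank power iteration.

def pagerank(graph, damping=0.85, iterations=50):
    """graph: dict mapping page -> list of pages it links to."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                # Dangling page: spread its rank evenly (one common convention).
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A tiny hypothetical niche of three sites linking to each other.
niche = {
    "siteA": ["siteB"],
    "siteB": ["siteA", "siteC"],
    "siteC": ["siteA"],
}
print(pagerank(niche))

# A new site earns links from A and B; rerunning the computation changes
# every existing site's score, not just the newcomer's.
niche_after = {
    "siteA": ["siteB", "newSite"],
    "siteB": ["siteA", "siteC", "newSite"],
    "siteC": ["siteA"],
    "newSite": ["siteA"],
}
print(pagerank(niche_after))
```

Running both calls shows siteA, siteB and siteC all end up with different scores once newSite enters the graph, which is the "cascade" I mean, and also why recomputing and publishing toolbar PR for everything is expensive enough to do only occasionally.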