Lots of discussion on whether and how Google might be using user metrics as part of their ranking algos. It's been driving me nuts.
The short: IF Google is using visitor metrics as a significant part of its algo, it might explain why the SERPs seem to have gotten worse, because visitor metrics are an exceedingly poor and circular way of evaluating sites.
User behavior on a site is complex. But basically, it comes down to whether the page yields what the user is looking for. In effect, it's a result of the match between want and find.
But here's the thing. What affects that match? Certainly there's on-page stuff -- junk is not going to fit a match. BUT the major determinant of the match is up to Google.
Google chooses both the title and the SERP description, so if it changes things (which it has started doing, and does a poor job of), the match gets worse.
More important, the degree to which Google can get inside the searcher's head determines whether the results match the user's intentions. To the degree Google succeeds, there's a match (all things being equal). If it doesn't get it right, then the sites it shows (and that get clicked) are going to have high bounce/exit rates.
In effect, GOOGLE determines the fit.
Other factors also enter into it, of course, for various types of visitors. Inbound links, for example, affect the match (even if Google is left out of the equation). Good descriptions and links allow good matches, and good matches mean good user metrics. Bad descriptions and you get more unmatched traffic.
The point being that while user behavior certainly has something to do with on-page factors, it's NOT under the control of the webmaster. You can have an amazingly engaging, high-quality website, but if Google pushes the wrong visitors to it, you'll get poor user metrics.
So user metrics have much less to do with quality, or even whether people like a page, and everything to do with the match, which is heavily controlled by Google.
So, if Google relies on user metrics, while it may be evaluating some on-page aspects, what it's really doing is evaluating ITS OWN ABILITY TO MATCH USER INTENT to page content.
So it goes round and round. Google has trouble with certain pages and sends irrelevant traffic. High bounce rates. Google sees the high bounce rates and then drops the page in the SERPs REGARDLESS of quality.
You can't have an algorithm assess something when, in fact, you're evaluating the success of the algorithm itself. It doesn't work. It's a logical error. It's, in effect, a weird loop.
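To make the circularity concrete, here's a toy simulation. Everything in it -- the bounce model, the "ranking score," the update rule, the numbers -- is my own invention for illustration, not anything Google actually does. Two identical pages diverge in rank based purely on how well the engine matched visitor intent:

```python
def bounce_rate(match_accuracy):
    """Toy assumption: visitors bounce when they were sent to the wrong page.

    Bounce depends on how well the engine matched intent to page,
    NOT on the page's intrinsic quality.
    """
    return 1.0 - match_accuracy

def update_rank_score(score, br, weight=0.5):
    """Hypothetical ranking update that 'trusts' user metrics:
    high bounce rate -> demotion."""
    return score * (1.0 - weight * br)

# Two IDENTICAL high-quality pages. The engine matches intent well
# for one (90% of visitors are relevant) and badly for the other (30%).
well_matched = 1.0
badly_matched = 1.0
for _ in range(5):  # five rounds of the feedback loop
    well_matched = update_rank_score(well_matched, bounce_rate(0.9))
    badly_matched = update_rank_score(badly_matched, bounce_rate(0.3))

print(round(well_matched, 3))   # 0.774 -- stays high
print(round(badly_matched, 3))  # 0.116 -- tanks, though the page is identical
```

The demotion of the second page measures nothing about the page; it measures the engine's own matching error, which is exactly the loop described above.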
PS. One could argue that it's the webmaster's fault if Google can't figure out what the page is about, or the searcher's fault if s/he can't search well. Both could be true, but as for the first, given the last 18 months, I don't believe anyone who says they know how to tell Google what their pages are about so they get the "right" visitors.
Ok. Oversimplified a bit. And if Google is NOT using user metrics much (and it shouldn't), then all bets are off. But if it IS, it might explain why absolutely dreadful sites are showing up.