I'm afraid that once again Google is unintentionally making the Web worse. This has happened many times before: for years people were afraid to use anything that might make Google think they had hidden text on a page, even when it was useful for visitors, like a "Spoiler" button for movie spoilers.
There is a huge problem with Google's crude attempt to use page loading time for ranking long pages (ones that require scrolling to view fully): it most likely measures total page loading time, without accounting for the fact that in the user's browser the page can be visible and usable long before that point if the user doesn't scroll down. We own very popular websites with long pages, and we have always tried to optimize the experience by showing the user as much as possible as soon as possible. That meant splitting images and Javascript into small parts that load only when the user actually reaches that part of the page, so the first screen of content appears as quickly as possible. None of the current tools, such as YSlow, webpagetest.org (recommended by Matt Cutts), or Google's very own PageSpeed, understands this, so there is absolutely no reason to think that Googlebot understands it either.
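For the record, the "load only what the user can see" approach described above is not exotic; it reduces to a viewport check plus deferred fetching. Here is a minimal sketch in plain JavaScript (function names and the image-record shape are hypothetical, not the author's actual code), with the viewport math separated out so the idea is clear:

```javascript
// Sketch of lazy loading: decide whether an element intersects the
// current viewport, and only start loading assets that do.
// (Helper names here are illustrative, not from any real codebase.)

// True if the element [elemTop, elemTop + elemHeight) overlaps the
// visible window [scrollY, scrollY + viewportHeight).
function isInViewport(elemTop, elemHeight, scrollY, viewportHeight) {
  const elemBottom = elemTop + elemHeight;
  const viewBottom = scrollY + viewportHeight;
  return elemBottom > scrollY && elemTop < viewBottom;
}

// Given placeholder records ({top, height, src}), return the URLs that
// should start downloading now. In a browser this would run on the
// scroll event and swap a data-src attribute into src.
function lazyLoadVisible(images, scrollY, viewportHeight) {
  return images
    .filter(img => isInViewport(img.top, img.height, scrollY, viewportHeight))
    .map(img => img.src);
}
```

On a real page the same check would be attached to a (throttled) scroll handler, or today handled by IntersectionObserver; the point is simply that below-the-fold bytes never block what the user sees first.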
Traffic from Google rankings is important to us, so we did what we believe Google wants: we followed the recommendations of these tools and combined our images and Javascript to shorten the total page loading time, which made our pages feel slower to actual users. This is what happens when Google implements crude measures with a lot of secrecy about its methods: the Web becomes worse.