Google wins, you win, and your users win. I can't see anyone who loses unless you decide to remove or degrade real content.
Since this is going to be a ranking factor, and ranking is a competitive issue, it is up to you to decide whether you want to compete on this one factor. If you don't, that's fine. If you'd rather not take a few simple steps to make your site faster, you can still compete by doing more on the other 199 factors.
I wonder how many of us, before seeing Matt's comments, knew how to enable gzip on our servers, or how easy it is to add a Cache-Control line to our .htaccess file. I'm sure some did, but I for one had never even considered these things, and they make a massive sitewide difference for very little effort. I hadn't realised that HTTP has the negotiation built in: the browser announces in its Accept-Encoding request header that it can handle gzip, and the server sends the page components compressed only if the browser can unzip them.
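For anyone else who hadn't looked at this before, here is a rough sketch of what those two tweaks can look like in an .htaccess file. This assumes an Apache server with mod_deflate, mod_expires and mod_headers available; hosts vary, so check which modules yours has enabled, and treat the file types and cache lifetimes as placeholder choices, not recommendations.

```apache
# Compress text-based responses. Apache only sends the gzipped
# version when the browser's Accept-Encoding header says it can
# handle it, which is the negotiation described above.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>

# Let browsers cache static assets for a month (example lifetime).
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
</IfModule>

# The equivalent Cache-Control header, set explicitly.
<IfModule mod_headers.c>
  <FilesMatch "\.(png|jpe?g|gif|css|js)$">
    Header set Cache-Control "max-age=2592000, public"
  </FilesMatch>
</IfModule>
```

Two directives, and every repeat visitor stops re-downloading your images and stylesheets while every first-time visitor gets smaller pages.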
Basically, there is a shedload of easy-to-implement technology already available to us that the vast majority of us are not using, and Google has data that shows this. Google also knows that most of us have made a few stupid errors in our coding or image compression at some point in the lives of our sites, errors that sit there costing bandwidth every day while we remain blissfully unaware. On my main $ site I've made one pass with the Firebug speed tool and found enough little inefficiencies and unimplemented speed technologies to make a massive difference to my users and, hopefully, to help cement that top slot on Google. This is one Google algo change that I applaud loudly. Whoever suggested this deserves a medal. Well done Google!