I haven't been hands-on with all of them - so I really can't give an informed opinion.
Another point I wanted to make earlier in the thread but forgot - site speed is NOT measured by Googlebot, it comes from real user data (Toolbar installs, Chrome, and the like). Googlebot only measures server response times. It's important to keep those two factors separate. So you want the browser to fire its "page loaded" signal as early as possible - if it does, you've improved your measured site speed.
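You can see the gap between those two numbers yourself with curl's timing variables - time to first byte roughly corresponds to what Googlebot sees, while total time is closer to (though still well short of) what a real browser experiences. The URL here is just a placeholder:

```shell
# Compare server response time (roughly what Googlebot measures) with
# total transfer time. Replace the URL with your own page.
curl -s -o /dev/null \
  -w 'TTFB (server response): %{time_starttransfer}s\ntotal transfer: %{time_total}s\n' \
  'https://www.example.com/'
```

Neither number captures rendering or script execution in the browser, which is exactly why the user-side "page loaded" signal can look so different from what your server logs suggest.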
These areas are where I see a lot of sites blowing it with Site Speed, even as they worry about a kb here or there on their html or scripts. Do the actual file compression well first, for both text assets and images.
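For the text-asset side, a quick local check shows how much gzip alone buys you before you start sweating individual kilobytes - the file here is a made-up stand-in for a real script or stylesheet:

```shell
# Rough demo of gzip savings on a repetitive text asset.
# app.js is a synthetic stand-in, not a real file from any site.
yes 'var x = 1;' | head -n 200 > app.js
orig=$(wc -c < app.js)
gzip -c -9 app.js > app.js.gz     # -c writes to stdout, keeping the original
comp=$(wc -c < app.js.gz)
echo "original: ${orig} bytes, gzipped: ${comp} bytes"
```

Real HTML/CSS/JS won't compress quite that dramatically, but 60-80% reductions are common - far more than you'll ever claw back by hand-trimming markup.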
If you're on Apache, the mod_pagespeed module does a nice job of automatically sending only as much image data as is actually needed - no matter what the original file looks like. That can be a big savings when you have user-contributed image content.
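A minimal vhost snippet for that looks something like the following - the filter name comes from the mod_pagespeed documentation, but double-check it against the version you actually install:

```apache
<IfModule pagespeed_module>
    ModPagespeed on
    # rewrite_images is a composite filter: it recompresses, resizes,
    # and inlines images as appropriate for each request.
    ModPagespeedEnableFilters rewrite_images
</IfModule>
```

Because the rewriting happens on the fly, you get the benefit even for images your users upload, without any batch re-processing step on your end.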