Who says that valid HTML always results in a faster-loading page?
A browser's error-correction routines might add a few milliseconds to page load time, depending on what those errors are. Latency in DNS look-ups will add more than that, and will essentially swamp the error-correction time beyond detectability.
Test it for yourself with two versions of a page hosted on the same server - one valid, one with errors. Then run one of the page speed tools several times against each version. You won't be able to isolate the effect of invalid mark-up in those results.
Here's our coverage of the speed issue from when the tools first became news:
The Need for Speed - Google shares help and research [webmasterworld.com]. There is a world of information available by following the reference links in that thread.
But none of the page speed tools, the underlying research, or the books published from that research ever measure - or even mention - coding errors as a speed factor. Google Webmaster Tools does not mention it either. Even Google's own pages do not all use valid code; their home page currently triggers 36 errors on the W3C validator.
In fact, you could probably shave a few bytes off page weight by removing the quotation marks from around your attribute values - and under an XHTML doctype, that would actually turn valid mark-up into invalid mark-up.
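For illustration, here's what that looks like - a hypothetical fragment, and the byte savings are tiny:

```html
<!-- Quoted attribute values: valid under any doctype -->
<img src="logo.png" alt="Logo" width="120" height="40">

<!-- Unquoted values: a few bytes smaller; invalid under an XHTML
     doctype, though plain HTML permits unquoted values as long as
     they contain no spaces or other special characters -->
<img src=logo.png alt=Logo width=120 height=40>
```

Both versions render identically in every mainstream browser, which is the point: the validator's verdict and the rendering speed are separate questions.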
The engineer who did the bulk of the research on page speed is Steve Souders. He has two books in print on the topic, and I recommend both. The original research was done while he was at Yahoo and resulted in the YSlow tool. He was then hired by Google, became their "Performance Evangelist," and helped push out Google's own Page Speed tool.