Whilst the two bots serve different primary purposes, I don't see any reason why they shouldn't share page source code once it's retrieved (they can, of course, process it entirely differently).
Seems a bit of a waste of effort on Google's part; there is absolutely no reason why a page retrieved by Mediapartners-Google shouldn't feed content to the index, subject to Googlebot's robots.txt restrictions. Bizarre!
While it would save Google some bandwidth
It would also save _me_ some bandwidth, which was more to the point!
Participating in Google AdSense does not affect your site's rank in Google search results. Google AdSense will not affect the search results we deliver. Google believes strongly in freedom of expression and therefore offers broad access to content across the web. Our search results are unbiased by our relationships with paying advertisers and publishers. We will continue to show search results according to our PageRank technology.
So it could be that mediabot does get specific pages spidered and into the SERPs, but that it makes no difference which bot gathers the page information for the SERPs.
I'm 100% sure mediabot uses data from googlebot, maybe from the cache.
I have a site with huge sections excluded via robots.txt, but there are AdSense ads on the excluded pages, and I have never, not even once, seen mediabot touch the non-excluded pages. Mediabot fetches only the excluded pages.
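The per-user-agent behaviour being discussed can be sketched with Python's standard-library robots.txt parser. This is a minimal illustration, not Google's actual logic: the robots.txt content and the example.com URLs are hypothetical, and it only shows how a group for `Googlebot` fails to apply to `Mediapartners-Google`, which falls back to the `*` group.

```python
import urllib.robotparser

# Hypothetical robots.txt: /private/ is blocked for Googlebot only.
# Mediapartners-Google has no group of its own, so it matches "*",
# whose empty Disallow line permits everything.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot is barred from the excluded section...
print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))
# → False

# ...while Mediapartners-Google may still fetch it to target ads.
print(rp.can_fetch("Mediapartners-Google", "http://example.com/private/page.html"))
# → True
```

Which is exactly why a site can show AdSense on pages that Googlebot never indexes, and why sharing one fetch between the two bots would change what each is allowed to see.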
From a bandwidth point of view it makes sense to let mediabot use the data from the normal googlebot runs.