We're all used to seeing malicious requests in the server logs coming from all over the place. It's much rarer to see Googlebot making such requests.
Yesterday, I was surprised to see a whole load of "Access Denied" entries in the WMT Crawl Errors report.
They were all of the form: /shop/update.php?id=836+AND+1=5+UNION+SELECT+0
These requests now show up in the Crawl Errors report because any URL request containing UNION, SELECT, or a whole list of other banned terms is denied, no matter who requests it.
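For anyone curious, that kind of keyword-based deny rule can be sketched roughly like this. This is only an illustration of the idea, not the site's actual code (which is presumably PHP); the term list and function name here are invented for the example:

```python
from urllib.parse import urlparse, unquote_plus

# Illustrative term list -- the real banned list is said to be much longer.
BANNED_TERMS = ("UNION", "SELECT", "INSERT", "DROP", "--")

def is_blocked(url: str) -> bool:
    """Return True if the decoded query string contains a banned SQL keyword."""
    query = unquote_plus(urlparse(url).query).upper()
    return any(term in query for term in BANNED_TERMS)

# The URL from the Crawl Errors report is denied; a normal request is not.
print(is_blocked("/shop/update.php?id=836+AND+1=5+UNION+SELECT+0"))  # True
print(is_blocked("/shop/update.php?id=836"))                         # False
```

A filter like this would return a 403-style "Access Denied" for the injection attempt, which is exactly what then surfaces in the Crawl Errors report when Googlebot is the one making the request.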
However, what surprises me is that Google would even attempt to request such a URL. I'd have thought they would filter out malicious URLs found in links so as not to do other people's dirty work for them.
As expected, the Crawl Errors report doesn't list where this malicious link was found.
One other thought comes to mind.
Could it be that Google invented those URLs merely for testing the site security in order to rate it, in the same way they request /noexist_1b4c6325b27d2a.html style URLs from time to time?
Since many of the common CMS, blog, forum and cart packages have built-in limitations and design errors, it would make sense for Google to detect which system you're running and then apply a set of known fixes to the data they get back as they crawl the site.