Convergence - 4:47 am on Jul 2, 2013 (gmt 0)
Well, phranque - It's doing it right now. 175 pages in the /review/ directory. What's interesting is that it's one right after another. There is NO sitemap for reviews, so Googlebot has a list of URLs found on our product pages and is following up, trying to scrape (yes, I said scrape) reviews for nothing but its own use.
No, the disallows are NOT bot specific.
Blocked by line 13: Disallow: /review/
Detected as a directory; specific files may have different restrictions
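For anyone who wants to double-check that a `Disallow: /review/` rule really does block that directory, here's a minimal sketch using Python's standard-library `urllib.robotparser`. The `example.com` URLs are placeholders, and the two-line rule set stands in for the real robots.txt; the parsing logic is the same either way.

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# Feed the relevant rules directly instead of fetching robots.txt
# over the network; parse() accepts an iterable of lines.
rp.parse([
    "User-agent: *",
    "Disallow: /review/",
])

# Googlebot matches the wildcard group, so /review/ is off-limits...
print(rp.can_fetch("Googlebot", "http://example.com/review/some-product"))
# ...while paths outside the disallowed directory remain crawlable.
print(rp.can_fetch("Googlebot", "http://example.com/products/widget"))
```

If a bot keeps fetching URLs this check says are blocked, the problem isn't the robots.txt syntax.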
And in the header of every one of the pages from the directories that are blocked in robots.txt:
<meta name='robots' content='noindex, nofollow' />
Don't know what to say, but I can see it happening with my one good eye...