Our site has been submitted to search engines worldwide, which for some companies may be a good thing, but we often receive either bogus product requests or spam from foreign countries. We do not do business outside the United States, so there is no benefit in overseas visitors finding our site. Is it possible to use robots.txt to block the foreign versions of Google?
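As far as I understand, robots.txt rules are keyed to a crawler's user-agent token, along these lines (a generic sketch, not something I have tested):

```
# Block a specific crawler by its user-agent token
User-agent: Googlebot
Disallow: /

# All other crawlers may access everything
User-agent: *
Disallow:
```

But I can't find separate user-agent tokens for the country-specific Google domains (google.de, google.fr, and so on), so I'm not sure whether they can be targeted individually this way.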