Seb7 - 1:22 pm on Aug 8, 2010 (gmt 0)
Personally, I would try to locate their crawler: find out what kind of IPs and user-agent strings it uses, and how fast it crawls. Then write a bit of code that knows when it's them and blocks it, or maybe serves a different view of the website, giving a negative description of the products! Hopefully they will update all their content without noticing! I always have a central include file for all pages, which is perfect for such a job.
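Something like this could live in that central include. A minimal sketch in Python (the idea translates directly to a PHP include); the IP prefixes and agent-string hints here are hypothetical placeholders — you would fill them in with whatever your logs show for the scraper:

```python
# Hypothetical values: replace with the IP ranges / agent strings
# you actually observe in your access logs for their crawler.
SCRAPER_IP_PREFIXES = ("203.0.113.",)      # example prefix only
SCRAPER_AGENT_HINTS = ("badbot",)          # example substring only

def is_scraper(ip: str, user_agent: str) -> bool:
    """Guess whether this request comes from the scraper's crawler."""
    if any(ip.startswith(p) for p in SCRAPER_IP_PREFIXES):
        return True
    return any(hint in user_agent.lower() for hint in SCRAPER_AGENT_HINTS)

def product_description(ip: str, user_agent: str,
                        real_desc: str, decoy_desc: str) -> str:
    # Serve the decoy (negative) copy only to the identified crawler;
    # everyone else sees the real description.
    return decoy_desc if is_scraper(ip, user_agent) else real_desc
```

Since every page goes through the same include, one check like this switches the whole site's output for that crawler without touching individual templates.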