Thinking about the "magic words" mentioned, I went and grabbed these examples from my robots.txt report:
22.214.171.124 Mozilla/5.0 (compatible; SISTRIX Crawler; http://crawler.sistrix.net/) /robots.txt
126.96.36.199 Mozilla/5.0 (compatible; SearchmetricsBot; http://www.searchmetrics.com/en/searchmetrics-bot/) /robots.txt
These bots belong to SEO sites. My question is: what's the consequence, server-work-wise, of blocking them versus not blocking them?
Really, I'm trying to find the logic in blocking a particular bot or IP (just because). Is it just bandwidth?
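For what it's worth, blocking a *compliant* crawler via robots.txt costs essentially nothing server-side: the bot fetches the file once, sees the rule, and stops crawling. A sketch of what that might look like (the user-agent tokens here are guessed from the UA strings in the log lines above; the exact tokens each bot honors would need checking against their docs):

```
# Hypothetical robots.txt rules; tokens assumed from the logged UA strings
User-agent: SISTRIX Crawler
Disallow: /

User-agent: SearchmetricsBot
Disallow: /
```

The trade-off is that robots.txt is purely voluntary, so it only saves bandwidth on bots that actually respect it.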
These particular examples just crawl, as far as I know. I don't need them or like them. One is housed at Hetzger.
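If a bot ignores robots.txt, the alternative is a server-level deny, which still costs your server a request but answers it cheaply with a 403 instead of serving pages. A sketch for Apache 2.4 (assuming mod_setenvif is loaded; the UA substrings are again assumptions based on the logs above):

```
# Deny by user-agent at the server level; works even if the bot
# ignores robots.txt. Matched requests get a 403.
SetEnvIfNoCase User-Agent "SISTRIX" bad_bot
SetEnvIfNoCase User-Agent "SearchmetricsBot" bad_bot
<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```

Blocking by IP instead of user-agent is also possible, but crawler IPs change more often than their UA strings.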
Would you guys block these? Why or why not?