I've noticed a recent increase in AI bot traffic on my site. I wouldn't mind much, except that the site has an immense number of pages, a significant part of which are dynamically generated, and there is a fair amount of computation behind each generated page. The bots are quite greedy, making requests every second, sometimes several per second, so it is starting to affect performance. Not to the point where the site becomes slow, but to the point where it stops being lightning fast (it just went through a server upgrade). So, my question is: is there any use for me as a webmaster in letting AI bots crawl my site, or should I just disallow them in robots.txt?
I'm strongly leaning toward disallowing them, as I don't see any value they could bring to the site: they only consume the information my website provides, without linking or referring back to it, so they can't send any traffic my way. Right? I have zero interest in paying for server resources just to feed various AI models tons of data. Or is there some potential benefit in letting them scrape my site that isn't immediately apparent to me?
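For what it's worth, if I do go the blocking route, this is roughly what I was planning to put in robots.txt. The user-agent names are the ones I've seen mentioned for the major AI crawlers (GPTBot, ClaudeBot, CCBot, Google-Extended, PerplexityBot); I assume the list is incomplete, and of course compliance with robots.txt is voluntary:

```
# Block the AI/LLM crawlers I know of; this list is surely not exhaustive,
# and robots.txt is only advisory, so misbehaving bots may ignore it.
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: PerplexityBot
Disallow: /

# Everything else (regular search engine crawlers included) stays allowed.
User-agent: *
Disallow:
```

I realize that bots which ignore robots.txt would still need to be handled with rate limiting or blocking at the server level, but that's a separate problem.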