wilderness - 9:52 pm on Mar 7, 2013 (gmt 0)
One is housed at Hetzger
Oh my, another bad word.
These bots belong to SEO sites. My question is: what's the consequence, server-wise, of blocking them or not blocking them?
Really, I'm trying to find the logic in blocking a particular bot or IP (just because). Is it just bandwidth?
In years past there have been many explanations of these blocking practices in this forum.
Many times, those questioning the logic were trolls promoting a bot. Are you a troll ;)
What it comes down to is choice, and each webmaster must (?) decide what is beneficial or detrimental to their own websites.
I deny most visitors beyond the borders of Canada and the US, and my denials cover a far broader range than what you're focusing on.
Many of the bots, crawlers, harvesters, or whatever else you choose to call them, simply offer no benefit that justifies leaving the door open.
They're almost like flies or ants: if you have one, you have more.
What are your choices when all these harvesters begin plagiarizing your content (text or images)?
Is it cheaper to hire an attorney and enter into the long process of litigation, or simply slam the door in their face because you're able to foresee their intent?
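For what it's worth, slamming the door usually comes down to a few lines in .htaccess. A minimal sketch of the two common approaches, blocking by user-agent string and blocking by IP range (the bot name "ExampleBot" and the 192.0.2.0/24 range here are placeholders, not the specific bots under discussion):

```apache
# Block any request whose User-Agent contains "ExampleBot" (placeholder name)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ExampleBot [NC]
RewriteRule .* - [F]

# Block an entire netblock (placeholder documentation range)
Order Allow,Deny
Allow from all
Deny from 192.0.2.0/24
</Limit>
```

The user-agent match only stops bots honest enough to identify themselves; the IP denial catches them regardless of what they claim to be, which is why many here block by host/netblock once a pattern emerges.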