I've been looking at the WW robots.txt ... but why are these spiders disallowed?

elgumbo:
I can see in the WW robots.txt file that the following spiders are disallowed:
User-agent: grub-client
User-agent: grub
User-agent: looksmart
User-agent: Copernic
User-agent: ia_archiver
User-agent: ia_archiver/1.6
User-agent: Alexibot
I can understand the reason for disallowing most of the other agents but not the above.
Can anyone clue me in on why WW doesn't allow these agents?
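For anyone wanting to check how a given crawler is affected by rules like these, Python's standard `urllib.robotparser` can evaluate a robots.txt against a user-agent name. This is a minimal sketch using a hypothetical robots.txt fragment modeled on the entries quoted above (the paths and the `/forum/` URL are illustrative, not WW's actual file):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt fragment mirroring the kind of entries quoted above.
robots_txt = """\
User-agent: ia_archiver
Disallow: /

User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# ia_archiver matches its own group and is blocked site-wide;
# any other agent falls through to the permissive "*" group.
print(rp.can_fetch("ia_archiver", "/forum/"))  # False
print(rp.can_fetch("Googlebot", "/forum/"))    # True
```

Note that robots.txt is purely advisory: a parser like this tells you what a *compliant* crawler would do, which is exactly why the question of whether a given bot actually honors it matters.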
They're a waste of bandwidth?
I thought it might be that but wasn't sure if there was another reason?
If bandwidth is the only reason then I will allow them.
The site doesn't struggle with too many visitors at the moment, so I think I can live with giving the excess bandwidth to the spiders... for now.. ;)

Brett_Tabke:
> User-agent: scooter
We allowed AV (AltaVista) in forever. They had 50k pages indexed and were spidering them 20-30 times a year. In three years they sent us a total of 350 visitors. They cost us a thousandfold in bandwidth compared to what they sent us.
> User-agent: grub
> User-agent: looksmart
Homey-don't-play-dat. Worse numbers than AV.
> User-agent: Copernic
Why would we even consider it?
> User-agent: ia_archiver
> User-agent: ia_archiver/1.6
> User-agent: Alexibot
Security and liability risks.
> User-agent: ia_archiver
> User-agent: ia_archiver/1.6
> User-agent: Alexibot

Is it true that Alexa ignores this?
> Security and liability risks.
Can you explain? Are we talking about deleted posts that may be archived, etc?
> Are we talking about deleted posts that may be archived, etc?
I think so. Also, any threads that may need to be deleted at a later date.