


WebmasterWorld's Robots.txt

Why don't they allow msnbot?

5:33 pm on Nov 28, 2004 (gmt 0)

10+ Year Member



WebmasterWorld has a lengthy robots.txt:

[webmasterworld.com]

I can see why they don't allow a lot of it, but I was wondering why they don't allow msnbot?

5:40 pm on Nov 28, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



don't assume that the robots.txt you see is the same one served to spiders... that is not always the case.

As for blocking msn, it has been known to spider excessively for no obvious benefit to the webmaster (while consuming the webmaster's bandwidth allocation), so some sites block it.
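For reference, blocking just msnbot takes a two-line record in robots.txt (standard robots exclusion syntax; "/" disallows the whole site):

  User-agent: msnbot
  Disallow: /

A compliant robot matches its own token in the User-agent line and skips everything under the Disallow path; all other robots ignore the record.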

6:06 pm on Nov 28, 2004 (gmt 0)

10+ Year Member



<<don't assume that the robots.txt you see is the same one served to spiders... that is not always the case.>>

By "the one that you see" did you mean "the one that your browser might get"? Because browsers don't make requests for Robots.txt at all.

What type of server gives up different versions of the file for different requests/user-agents/spiders?

7:28 pm on Nov 28, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What I want to know is how much good, if any, there is in blocking 'bad' spiders, like some of the ones listed in the robots.txt mentioned above, when the scummy people running such bots can just change the user agent.

Should all of them be listed in the robots.txt file, or is it a moot point?

9:25 pm on Nov 28, 2004 (gmt 0)

jdmorgan - WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member



ASW,

> What kind of server serves up different versions of the file to different requests/user-agents/spiders?

Mine do. It's one way to cut the bandwidth consumed by robots that don't understand multiple-user-agent records: detect those UAs and serve them a simplified robots.txt with their own UA string inserted. A combination of mod_rewrite and some simple CGI scripting on Apache can do this easily.
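A rough sketch of one way to wire that up (the bot names, script path, and Disallow prefix below are placeholders, not anyone's actual setup). In .htaccess, route robots.txt requests from the single-record crawlers to a CGI script:

  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} (SimpleBot|LiteCrawler) [NC]
  RewriteRule ^robots\.txt$ /cgi-bin/robots.py [L]

The script then prints a one-record robots.txt with the caller's UA token inserted:

  #!/usr/bin/env python3
  # robots.py - CGI sketch: emit a simplified, single-record robots.txt
  # tailored to the requesting user-agent. The token derivation here is
  # naive; a real version would map known UA strings to robots.txt tokens.
  import os

  ua = os.environ.get("HTTP_USER_AGENT", "")
  token = (ua.split("/")[0].split() or ["*"])[0]   # "SimpleBot/1.0" -> "SimpleBot"

  print("Content-Type: text/plain")
  print()                                          # blank line ends the CGI headers
  print(f"User-agent: {token}")
  print("Disallow: /private/")                     # placeholder path

Browsers and well-behaved multi-record robots never match the RewriteCond, so they keep getting the full static file.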

jim_w,

Some "bad" robots are in fact spoofs of legitimate user-agents. In cases where the legitimate robot visits but is considered to be of no practical use to the site owner, it may be Disallowed in robots.txt. It is in fact necessary to take stronger measures for the spoofers, but having the robots.txt disallow helps identify the spoofers (because they don't fetch robots.txt, or they ignore the contents of robots.txt even though they do fetch it. So no, it's not entirely a waste of time.

Jim

6:11 pm on Nov 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Check out this:

[66.102.7.104...]

No cache. I wonder how Google is getting to all the pages of WebmasterWorld.

[216.239.57.104...]

No page of WebmasterWorld has a cache, but it's getting indexed; maybe from links outside the site.

AjiNIMC
