Forum Moderators: goodroi


Robots.txt stops automated link software


jdhuk

2:36 pm on Jul 18, 2005 (gmt 0)

10+ Year Member



I want all bots to crawl my pages; everyone is welcome :-)

So I put in place the basic ....

User-agent: *
Disallow:

to return a status 200.

Now various link-checking tools from other sites are being stopped from crawling my site to verify their links back to their own sites. The robots.txt is what's stopping them: when I remove the file, everything works OK.

The only reason I have/had .......

User-agent: *
Disallow:

file on my site was that I thought it was good practice to return a status 200, as part of building a professional site. Or do I not need to worry about it and just leave the file off, since I'm not disallowing anything anyway?

Dijkgraaf

11:01 pm on Jul 18, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well then, the link software is badly written.

You could just have a completely blank robots.txt file.
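For anyone who wants to verify this, a quick sketch with Python's standard `urllib.robotparser` shows that both forms behave the same: the two-line allow-all file and a completely empty file each permit any crawler to fetch any page. (The bot name and URLs here are just placeholders.)

```python
from urllib.robotparser import RobotFileParser

# The allow-all robots.txt from the thread: an empty Disallow
# under "User-agent: *" blocks nothing.
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow:",
])

# Any user agent may fetch any path under these rules.
print(parser.can_fetch("SomeLinkChecker/1.0", "http://example.com/any/page"))

# A completely blank robots.txt is equivalent: no rules, no restrictions.
empty = RobotFileParser()
empty.parse([])
print(empty.can_fetch("SomeLinkChecker/1.0", "http://example.com/any/page"))
```

Both calls print `True`, so any link checker that refuses to crawl when it sees either file is ignoring the robots exclusion rules rather than following them.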