So I put in place a basic robots.txt:

User-agent: *
Disallow:

so that the file returns a status 200 instead of a 404.
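A quick sanity check on that, sketched with Python's standard library (www.example.com stands in for my real domain):

import urllib.request

# Fetch robots.txt and print the HTTP status code
with urllib.request.urlopen("https://www.example.com/robots.txt") as resp:
    print(resp.status)  # prints 200 while the file is in place

With the file removed, the same request raises an HTTPError for the 404 instead.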
Now various link-checking tools from other sites are being stopped from crawling my site to verify their backlinks. The robots.txt file is what's stopping them: when I remove it, everything works OK.
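For what it's worth, the rules themselves are allow-all; a standards-compliant parser permits every URL under them. A minimal sketch with Python's standard library (the user-agent string and URL are made-up examples):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Feed the parser the exact two lines from the robots.txt above
rp.parse(["User-agent: *", "Disallow:"])

# An empty Disallow matches nothing, so every URL is allowed
print(rp.can_fetch("SomeLinkChecker/1.0", "https://example.com/any/page"))  # True

So by the spec those two lines shouldn't block anything, yet the checkers only work once the file is gone.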
The only reason I have/had the

User-agent: *
Disallow:

file on my site was that I thought returning a status 200 was good practice and part of building a professional site. Or do I not need to worry about it and just leave the file off, since I'm not disallowing anything anyway?