Forum Moderators: goodroi
I need to allow a search spider (Ultraseek) to index content on my server, but so far no luck! I think I've tried everything. Any help is appreciated.
The spider returns this error:
503 socket.gaierror: (7, 'getaddrinfo failed') (config.py:3919)
Copied below is the content of the robots.txt file (I also tried a blank file, which didn't work):
User-agent: Ultraseek
Disallow:
User-agent: *
Disallow: /
Any ideas?
If this is your robots.txt file:
User-agent: *
Disallow: /
That means disallow all robots from all directories on the site. Ultraseek and any other bot that follows robots.txt will see that and leave your site.
Keep in mind that robots.txt is used to restrict access to areas of your site, not to grant it. We have an entire forum dedicated to this topic: robots.txt [webmasterworld.com].
To allow a single robot:
User-agent: WebCrawler
Disallow:

User-agent: *
Disallow: /
[robotstxt.org...]
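If you want to double-check how a given robots.txt will be read, Python's standard urllib.robotparser can parse the rules and report which agents may fetch. A minimal sketch (the URL is a hypothetical placeholder; point it at your own server):

import urllib.robotparser

# Hypothetical URL -- substitute your own server's robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# With the "allow a single robot" rules above, WebCrawler should be
# permitted and every other user-agent denied.
for agent in ("WebCrawler", "Ultraseek", "Googlebot"):
    print(agent, "may fetch /:", rp.can_fetch(agent, "/"))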
The issue looks like a server configuration problem rather than a problem in the robots.txt file itself. I can't find anything detailed enough on the web, but I'd suggest looking into that specific error message. Python is outside my comfort zone, so I can't offer much more than that.
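For what it's worth, socket.gaierror is raised by getaddrinfo(), the DNS lookup call, which typically means the spider could not resolve a hostname before making any HTTP request, so it never even reached robots.txt. A minimal sketch (the hostname is a placeholder; use whatever host Ultraseek is configured to crawl) reproduces the same lookup on the machine running the spider:

import socket

# Placeholder hostname -- substitute the host Ultraseek is set to crawl.
host = "intranet.example.com"

try:
    # getaddrinfo() is the same call that is failing inside the spider.
    for family, _, _, _, sockaddr in socket.getaddrinfo(host, 80):
        print(family, sockaddr)
except socket.gaierror as err:
    # If this fails too, the problem is DNS (or the hosts file) on the
    # spider's machine, not the robots.txt file.
    print("Resolution failed:", err)

If the lookup fails here as well, name resolution on the spider's machine (DNS server, hosts file, or using an IP address in the spider's configuration) is where I would start.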