I have been trying to understand why my site is not being indexed. I use positiontech to submit to Ink, and they have been saying there is a robots.txt error; today they said they believe my host is blocking the crawlers. I contacted my host, who says it's fine, and now the request returns a page that just says test.
When I checked your site it is showing test, so you must have a file called robots.txt. If you didn't have a robots.txt file, the request would return a 404 error (The page cannot be found).
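If you want to see for yourself exactly what the spiders are getting back, a little script along these lines will print the status code and body of your robots.txt. This is just a sketch in Python; swap example.com for your own domain, which I'm guessing at here:

    # Fetch robots.txt the way a crawler would and show what comes back.
    import urllib.request
    import urllib.error

    url = "http://example.com/robots.txt"  # put your real domain here
    try:
        with urllib.request.urlopen(url) as resp:
            print(resp.status)  # 200 means the file exists
            print(resp.read().decode("utf-8", "replace"))  # what it contains
    except urllib.error.HTTPError as e:
        print(e.code)  # 404 here means there is no robots.txt at all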
See if you can FTP to your web space and view what is in your robots.txt file. Hopefully it just says test, in which case replace it with your own file, or even upload a blank robots.txt file in ASCII mode.
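For reference, a robots.txt that lets every spider in is just these two lines (an empty Disallow means nothing is blocked):

    User-agent: *
    Disallow: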
You don't have to have a robots.txt file; it is only used to block spiders, not to invite them. The only downside of not having one is that it fills your log file with 404 errors whenever a search engine spider requests it.
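To give an idea of the blocking side, this is about the strongest thing a robots.txt can say; it tells every spider to stay out of the whole site:

    User-agent: *
    Disallow: /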
If you're on a virtual host and have FTP access, you should be able to see whether there's a robots.txt in your web space. A robots.txt can't do much in the way of blocking your index page from being spidered. If you believe your host is blocking SE crawlers, then I suspect they have done it in the main server settings (the conf file if on Apache); don't know what Ink is talking about though. If it's a robots.txt issue you can surely see what's in there. I've just checked your site's /robots.txt and I get a 404, which means it's missing. So if the two people/Ink are seeing test when they request your robots.txt, my guess is your host provider has set a rewrite rule that shows test to anybody requesting robots.txt.
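I can't see your host's config, but the kind of rewrite rule I mean would look something like this in an Apache .htaccess (the test.html target is made up for the example; the pattern needs a leading slash if it's in httpd.conf instead):

    RewriteEngine On
    RewriteRule ^robots\.txt$ /test.html [L]

If something like that is sitting in your host's config, every request for robots.txt gets the test page instead of a 404 or your real file, which would explain exactly what Ink is reporting.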