
Forum Moderators: goodroi


Site Pages are Blocked by Robots.txt



4:46 pm on May 1, 2008 (gmt 0)

5+ Year Member


I converted my site from ASP to ASPX (.NET), and since the conversion I have been facing problems continuously. One of them is that 15 of my site's official pages are restricted by the robots.txt file. This happened when I resubmitted the XML sitemap to Google: a message was displayed saying that 15 URL(s) are restricted by robots.txt. Even though I used the following user agent:

user agent: *


I intended the line above to give all bots open access.

Please help me out: what should I do?
Is .NET bad for Googlebot or any other bot?




8:11 pm on May 4, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I would just remove the robots.txt file, as you don't seem to need it. After that, everything should be fine again.


8:26 pm on May 4, 2008 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

The code is invalid, with incorrect keywords, incorrect case, and incorrect use of line spacing. It must conform exactly to the format given in the robots.txt specification -- if you want it to be accepted and correctly interpreted by all robots, there is no wiggle room whatsoever.

User-agent: *

Note the blank line at the end. A blank line must appear after each 'record' in robots.txt.
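For example, a minimal robots.txt that allows every crawler full access would read as follows. The Disallow line is required in each record by the original robots exclusion standard, and an empty value means nothing is disallowed:

```
User-agent: *
Disallow:

```

Any record that omits the Disallow field, miscapitalizes the keywords, or runs records together without a blank line risks being ignored or misread by stricter crawlers.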



10:03 am on May 6, 2008 (gmt 0)

5+ Year Member

Hey, thanks for the help. Can you please explain the right code or syntax for the robots.txt file? I actually want to give open access to all crawlers, but it seems that since implementing the new robots.txt file no bots are visiting. Even Googlebot last visited on 23rd April, 2008. Please help me out; my site is losing its position in different search engines.
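One quick way to sanity-check a robots.txt file before uploading it is Python's standard-library parser. This is a sketch: `robots_body` below is a stand-in for your own file's contents, and `example.com` is a placeholder domain.

```python
# Verify that a robots.txt body parses and allows all crawlers,
# using the standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

# Stand-in for the contents of your robots.txt file.
# An empty Disallow value means nothing is blocked.
robots_body = "User-agent: *\nDisallow:\n"

parser = RobotFileParser()
parser.parse(robots_body.splitlines())

# Under this policy, any bot may fetch any URL on the site.
print(parser.can_fetch("Googlebot", "http://example.com/page.aspx"))  # True
```

If `can_fetch` returns False for a URL you expect to be crawlable, the file's syntax (or an unintended Disallow rule) is the likely culprit.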



