I converted my site from ASP to ASPX (.NET), and since the conversion I have continuously faced problems. One of them is that 15 of my site's official pages are restricted by the robots.txt file. This happened when I resubmitted the XML sitemap to Google: after resubmitting, a message was displayed saying that 15 URL(s) are restricted by the robots.txt file, even though I used the following user agent:
user agent: *
With the line above, I intended to give open access to all bots.
Please help me out: what should I do? Is .NET bad for Googlebot, or for any other bot?
That code is invalid: it uses the wrong keyword, the wrong case, and incorrect line spacing. It must conform exactly to the format given in the robots.txt specification; if you want it to be accepted and correctly interpreted by all robots, there is no wiggle room whatsoever.
User-agent: *
Disallow:
Note the blank line at the end: a blank line must appear after each record in robots.txt.
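If you want to sanity-check how a robots.txt body will be interpreted before deploying it, Python's standard-library parser can evaluate it locally. This is a minimal sketch (the example.com URL is just a placeholder) showing that the corrected record above, with an empty Disallow, leaves every page open to every bot:

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt body directly, no network fetch needed.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
    "",  # blank line terminating the record
])

# An empty Disallow blocks nothing, so any bot may fetch any URL.
print(rp.can_fetch("*", "http://example.com/any/page.aspx"))  # True
```

Running the same check against your current, malformed file is a quick way to see whether a crawler following the spec strictly would treat your pages as blocked.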
Hey, thanks for the help. Can you please explain what the right code or syntax for the robots.txt file is? I actually want to give open access to all crawlers, but it seems that since implementing the new robots.txt file no bots are visiting. Even Googlebot made its last visit on 23rd April, 2008. Please help me out; my site is losing its position in different search engines.