I converted my site from ASP to ASPX (.NET), and since the conversion I have continuously been facing problems. One of them is that 15 of my site's official pages are restricted by the robots.txt file. This happened when I resubmitted my XML sitemap to Google: after the resubmission, a message was displayed saying that 15 URL(s) are restricted by robots.txt. I am even using the following user-agent rule:
User-agent: *
With that directive I intended to give all bots open access to the site.
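For comparison, as far as I understand the robots.txt standard, a minimal file that allows every crawler full access should look like this (the empty Disallow line means nothing is blocked):

# allow all crawlers; an empty Disallow value blocks nothing
User-agent: *
Disallow: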
Please help me figure out what to do.
Is .NET bad for Googlebot or any other bot?