Forum Moderators: goodroi
Now the situation...
I can just type robots.txt into the browser and see the names of all the private/protected directories and files that the site administrator doesn't want public access to (since it is a plain text file).
So the site actually becomes more vulnerable to attacks, because the names of the internal protected directories and files are now known and can be used by any hacker to attack it...
So what is the solution that lets bots index the site while skipping the protected files/directories, without making the site vulnerable to attacks?
if you have sensitive information it should be blocked with .htaccess or an ISAPI filter. put it behind password protection. if possible, don't even put sensitive information online.
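to expand on that: a common approach is to group everything sensitive under a single parent directory, so robots.txt only reveals one generic path instead of listing each private file, and then enforce real access control at the server level. a minimal sketch for Apache (the /private/ path and the .htpasswd location are just examples, not anything from your setup):

```
# robots.txt -- assumes all sensitive content lives under one parent,
# so crawlers see a single generic path, not individual file names
User-agent: *
Disallow: /private/
```

```
# .htaccess placed inside /private/ -- actual protection, not just a crawler hint
# (assumes a password file created with: htpasswd -c /etc/apache2/.htpasswd someuser)
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

the key point: robots.txt is only a polite request to well-behaved bots, while the .htaccess rules actually deny access, so even a hacker who reads your robots.txt gets a 401 instead of the files.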