Forum Moderators: goodroi
Robots.txt, also known as the Robots Exclusion Protocol, functions as a request that specified robots ignore specified files or directories when crawling a site.
For example:
User-agent: *
Disallow:
This allows all robots to visit all files, because the wildcard "*" applies to all robots and the empty Disallow directive excludes nothing.
By contrast:
User-agent: *
Disallow: /
This keeps all robots out of the entire site.
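The two policies above can be checked programmatically. A minimal sketch using Python's standard-library `urllib.robotparser`, parsing the rules from strings rather than fetching a live robots.txt (the bot name and URL are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Empty "Disallow:" blocks nothing, so every robot may fetch anything.
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])
print(allow_all.can_fetch("AnyBot", "/private/page.html"))  # True

# "Disallow: /" blocks the whole site for every robot.
block_all = RobotFileParser()
block_all.parse(["User-agent: *", "Disallow: /"])
print(block_all.can_fetch("AnyBot", "/private/page.html"))  # False
```

Remember that this is purely advisory: well-behaved crawlers honor the file, but nothing technically prevents a robot from ignoring it.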