/myfile.php?id=1
/myfile.php?id=2
/myfile.php?id=3
as well as /myfile.php.
If you want to block only /myfile.php itself, but allow crawling of URLs where /myfile.php is followed by a query string, you could use:
User-Agent: *
Disallow: /myfile.php$
Basically, if you don't put a '$' at the end of the Disallow pattern, there is an implied wildcard ('*') at the end of the disallowed URL, so the pattern matches as a prefix. Keep in mind that '$' and '*' are pattern extensions honored by the major engines but are not part of the original robots.txt standard, so smaller crawlers may ignore them.
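To make the matching rule concrete, here is a minimal sketch in Python of how a Google-style crawler interprets a Disallow pattern. The function name and the translation to a regex are my own illustration, not part of any robots.txt library:

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Sketch of Google-style Disallow matching: '*' matches any run of
    characters, a trailing '$' anchors the pattern at the end of the URL,
    and without '$' a trailing wildcard is implied (prefix match)."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"   # explicit end-of-URL anchor
    else:
        regex += ".*"              # implied trailing wildcard
    return re.match(regex, path) is not None

# Without '$': blocks the bare file and every query-string variant.
print(robots_match("/myfile.php", "/myfile.php?id=1"))   # True
print(robots_match("/myfile.php", "/myfile.php"))        # True

# With '$': blocks only the exact URL.
print(robots_match("/myfile.php$", "/myfile.php?id=1"))  # False
print(robots_match("/myfile.php$", "/myfile.php"))       # True
```

The key point is the else branch: leaving off the '$' silently appends the '.*', which is why a plain Disallow line blocks every URL that merely starts with the pattern.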
If you want to block all URLs that contain the string 'myfile.php', regardless of where they live in your web directory structure, you could use:
User-Agent: *
Disallow: /*myfile.php
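In regex terms, '/*myfile.php' (with the implied trailing wildcard) behaves like the expression below; the sample paths are hypothetical, just to show what the pattern does and does not catch:

```python
import re

# '/' then anything, then 'myfile.php', then anything (implied wildcard)
pattern = re.compile(r"/.*myfile\.php.*")

print(bool(pattern.match("/myfile.php")))                 # True
print(bool(pattern.match("/shop/cart/myfile.php?id=7")))  # True
print(bool(pattern.match("/shop/other.php")))             # False
```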
There is a lot of info on robots.txt if you simply search for it on G... Wikipedia, G's webmaster tools, etc. all have articles on its use.
[edited by: ZydoSEO at 3:45 pm (utc) on Feb. 7, 2008]