If I have 50 sites operating on a server, do I need to place a robots.txt file in every domain's root folder, or can I place it in one central location that all of the domains can use? Sorry for the newbie question.
I think you have to put the file in the root of each site. Presumably each site is different and will probably want to disallow files/directories based on its own structure - unless every site is identical?
If you just want to use robots.txt to block particular user-agents (and therefore the same file would be fine for every site), then you can, if you're on a *nix box, maintain a single central copy and create a soft link to it in each site's root directory.
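For example, a shared file that blocks a single crawler across every site can be as short as this (the user-agent name is just an illustration - substitute whichever bot you want to keep out):

User-agent: BadBot
Disallow: /

All other crawlers are unaffected, since any user-agent not matched by a record is allowed by default.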
You still have to create all the soft links, but after that one change to the central file takes effect on every site at once (see the loop sketched after the example below).
The Linux command to create a link is "ln" - use the -s flag for a soft (symbolic) link, and note that the target comes first, then the link name. So if you were in the wwwroot directory of SiteA, you would type:

ln -s /path/to/robots.txt robots.txt

...where /path/to/robots.txt is your central copy.
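If all the document roots follow a common pattern, a small loop can create every link in one go. This is just a sketch assuming a /var/www/&lt;site&gt;/wwwroot layout - adjust the glob to match your server:

for site in /var/www/*/wwwroot; do
    ln -s /path/to/robots.txt "$site/robots.txt"
done

You can verify a link with ls -l, which shows where it points. One caveat: the web server must be configured to follow symlinks (in Apache that's the FollowSymLinks option), so it's worth fetching /robots.txt from one of the sites in a browser afterwards to confirm it's actually being served.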