Forum Moderators: goodroi
I've read over some of the posts about this topic. I would like to set up a robots.txt to stop indexing of my archived pages and also indexing of my images.
I have contacted my web host and know where to put the file once I create it. I also understand there is a validator and that the file needs to be saved with Unix (LF) line endings.
How is all this accomplished through Frontpage?
You can make your robots.txt file in Notepad. When you're done, name the file 'robots.txt' and save it in the root directory of your web.
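For the goals described above (blocking archived pages and images), the file might look something like this. The directory names "archive" and "images" are assumptions; substitute the actual folder names used on your site:

```
User-agent: *
Disallow: /archive/
Disallow: /images/
```

Note that Disallow matches by path prefix, so these rules only help if your archives and images actually live in those directories. Also keep in mind that robots.txt is a request, not an enforcement mechanism: well-behaved crawlers honor it, but it does not password-protect anything.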
This depends on your hosting service. If your site is small, you can usually just use your browser to view the raw log files in a secure subdirectory of your site, or download them using FTP. No special software is required; the Internet Explorer browser supports FTP access as well as normal browsing.
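If you'd rather script the download than use a browser, a minimal sketch with Python's standard ftplib might look like the following. The hostname, credentials, and the "logs/access_log" path are all assumptions; your host's directory layout will differ, so check their help pages for the real location:

```python
# Sketch: download a raw access log over FTP using the standard-library ftplib.
# All connection details below are placeholders, not real values.
from ftplib import FTP


def download_log(host, user, password, remote_path, local_path):
    """Fetch one file from an FTP server and save it locally."""
    with FTP(host) as ftp:
        ftp.login(user=user, password=password)
        with open(local_path, "wb") as f:
            # RETR streams the remote file; f.write receives each chunk.
            ftp.retrbinary("RETR " + remote_path, f.write)


# Usage (hypothetical values):
# download_log("ftp.example.com", "username", "secret",
#              "logs/access_log", "access_log")
```

This is just one option; if your host provides a control panel, its built-in log viewer or file manager will usually be simpler.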
I'd suggest looking at your host's help pages, or looking in your site's "control panel" if it is set up with one.