> 3. I don't want Google to crawl part or all of my site.
There is a standard method for excluding robot crawlers: a "robots.txt" file at the root of your site. Compliant crawlers, including Googlebot, will honor it and skip the paths you disallow. Googlebot identifies itself with the user-agent "Googlebot". In addition, Googlebot understands some extensions to the robots.txt standard: Disallow patterns may include * to match any sequence of characters, and a pattern may end in $ to anchor the match to the end of the URL. For example, to prevent Googlebot from crawling files that end in .gif, you may use the following robots.txt entry:
    User-Agent: Googlebot
    Disallow: /*.gif$
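To make the matching semantics concrete, here is a small Python sketch (my own, not Google's code) that translates a Disallow pattern with those two extensions into a regular expression: `*` becomes "match anything" and a trailing `$` anchors the pattern to the end of the URL path. The function name is just for illustration.

```python
import re

def googlebot_pattern_to_regex(pattern):
    """Translate a robots.txt Disallow pattern using Googlebot's
    extensions: '*' matches any sequence of characters, and a
    trailing '$' anchors the match to the end of the URL."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything literally except '*', which becomes '.*'
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.compile(regex + ("$" if anchored else ""))

rule = googlebot_pattern_to_regex("/*.gif$")
print(bool(rule.match("/images/logo.gif")))      # True: blocked
print(bool(rule.match("/images/logo.gif?v=2")))  # False: ".gif" not at end
print(bool(rule.match("/images/logo.png")))      # False: different extension
```

Without the trailing `$`, the pattern `/*.gif` would also block URLs that merely *contain* ".gif", such as `/images/logo.gif?v=2`, which is exactly why the anchor exists.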
P.S. Hmmm, just realized, Google's own page shows "User-Agent" with a capital "A", while the conventional spelling is "User-agent". Field names in robots.txt are case-insensitive, so it works either way, but still, shame on them! ;)