You can't do it with robots.txt. SSI includes are processed on the server *before* the page is served, so the include's output simply appears as part of the page.
You could use SSI itself to test the requester's IP address or user-agent and conditionally include content... but that is cloaking, and search engines frown on it.
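For what it's worth, a conditional include of that sort would look roughly like this in classic Apache mod_include syntax (Apache 1.3/2.0-era expressions; the file names are made up for illustration) -- again, serving bots different content this way is cloaking, so this is a sketch of what *not* to do:

```apache
<!--#if expr="$HTTP_USER_AGENT = /Googlebot/" -->
<!--#include virtual="/includes/bot-version.inc" -->
<!--#else -->
<!--#include virtual="/includes/human-version.inc" -->
<!--#endif -->
```

Apache 2.4 changed the expression parser, so the exact `expr` syntax depends on the server version.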
Jim
[google.com...]
so I was thinking I could just do the same for .inc files...
User-agent: Googlebot
Disallow: /*.gif$
((I think I could substitute inc for gif, but I hate blundering into the unknown where Googlebot is concerned.))
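If the .inc files really were directly requestable, the substitution would presumably look like this (hedged: the `*` wildcard and `$` end-anchor are Googlebot extensions, not part of the original robots.txt standard, so other crawlers may ignore them):

```
User-agent: Googlebot
Disallow: /*.inc$
```

But as discussed below, this may not be needed at all if the includes are never served as separate URLs.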
... But (having just thought over JD's post): if search engines only spider the *.htm* pages, then they never "see" the includes, since, as JD reminded me, the includes are merged into the *.htm* pages before they're shown to the public. In which case my worry/idea is groundless?