Forum Moderators: goodroi
User-agent: Googlebot
Disallow: /?tag
Disallow: /?tag1
Disallow: /?tag2
Disallow: /?
It validates with the robots.txt checker, but will it do me any good (or harm, for that matter)? I want Google to ignore www.domain.com?blah since it sees it as a different page, even though the file is the same, so the URLs could be seen as dupes.
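For what it's worth, a minimal sketch of one way to do this, assuming the goal is to keep Googlebot out of every URL that carries a query string: Googlebot honors the `*` wildcard in Disallow patterns (this is a Google extension, not part of the original robots.txt standard), so a single pattern can cover `?tag`, `?tag1`, `?tag2` and any other parameter:

```
User-agent: Googlebot
# Block any URL containing a "?" anywhere in its path/query.
# "*" is a wildcard extension supported by Googlebot, not all crawlers.
Disallow: /*?
```

Note that `Disallow` lines are prefix matches, so `Disallow: /?tag` only blocks URLs that start with `/?tag`; the wildcard form above is what catches a query string on any path. Crawlers that don't support wildcards will simply ignore the `*`, so test the pattern in Google's robots.txt tester before relying on it.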