OK, here is the question.
We have thousands of user-generated pages, and some sections require login. The URLs in these sections contain a specific word, e.g. "youhavetologin". I would like to stop robots from crawling these pages because they just eat my bandwidth. URLs containing the "youhavetologin" phrase are simply login forms (the same login form for every page).
How do I "ask" robots not to waste their time and my bandwidth by crawling URLs like this?
www.mysite.com/dir1/dir2/dir3/youhavetologin/
I tried it this way:
User-agent: *
Disallow: youhavetologin
but Googlebot does not seem to follow this. How do I do it properly?
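From what I have read so far, Disallow values are matched as URL path prefixes and must start with a slash, which may be why my rule is ignored; Googlebot is also documented to support * wildcards as an extension. So perhaps something like this is what I need:

User-agent: *
Disallow: /*youhavetologin

Is that the correct approach, or is there a better way?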