I have two questions for you. 1. How do I remove pages that Google has wrongly indexed? 2. If I don't want Google to index any page in a folder, is password-protecting that folder a good technique? And what about sessions? If a session is applied, can Google crawl the page after the session expires?
Do the pages still exist? Add a noindex meta tag to each and they will be de-listed. If you block them with robots.txt instead, they will continue to appear as URL-only entries.
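The noindex tag goes in each page's head section; a minimal example:

```html
<!-- In the <head> of each page you want removed from Google's index -->
<meta name="robots" content="noindex">
```

Note that Googlebot must be able to crawl the page to see this tag, which is why combining it with a robots.txt block defeats the purpose.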
Are the pages gone from the website? Let those URLs return a 404 or 410 and they will be delisted very quickly.
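If the site runs on Apache, for example, a 410 Gone response can be configured in .htaccess; a sketch, with an illustrative path:

```apache
# .htaccess — return 410 Gone for a removed page (path is an example)
Redirect gone /old-page.html
```

A 410 signals the page is permanently gone, which tends to get it dropped faster than a plain 404.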
If you put already-listed pages behind a password, Google simply drops them into the Supplemental index and continues to show them almost indefinitely.
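Password protection does keep crawlers out of content that was never indexed in the first place. On Apache this might look like the following sketch (the file paths and realm name are hypothetical):

```apache
# .htaccess inside the folder to protect (paths are hypothetical)
AuthType Basic
AuthName "Private area"
AuthUserFile /home/user/.htpasswd
Require valid-user
```

Crawlers receive a 401 Unauthorized and cannot fetch the pages, but as noted above this does not quickly remove pages Google has already listed.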
Session IDs cause a site to be indexed as infinite duplicate content, because the same page appears under endless distinct URLs. Do not let bots see URLs that contain session IDs. If possible, issue session IDs only after users have logged in; search engines cannot log in.
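The advice above, handing session IDs only to logged-in users, can be sketched in Python (the function and parameter names are hypothetical, not from any particular framework):

```python
from typing import Optional

def build_url(base_url: str, session_id: Optional[str], logged_in: bool) -> str:
    """Append a session ID only for logged-in users.

    Anonymous visitors -- which includes search engine bots --
    always get the clean, canonical URL, so the crawler never
    sees duplicate URLs that differ only by session ID.
    """
    if logged_in and session_id:
        return f"{base_url}?sid={session_id}"
    return base_url

# A bot crawling anonymously sees the canonical URL:
print(build_url("https://example.com/page", None, False))
# A logged-in user gets the session-tracked URL:
print(build_url("https://example.com/page", "abc123", True))
```

Because bots never log in, every URL they crawl is the canonical one, and the duplicate-content problem never arises.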