My client is facing an odd problem. He blocked a few pages of his site from being crawled by search engine bots using robots.txt, but those pages are still appearing in Google's index. The client isn't sure whether the pages were crawled before they were blocked, but he doesn't want them in the index.
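For reference, the robots.txt rules look something like the following (the paths here are placeholders, not the real URLs):

```
User-agent: *
Disallow: /private-page/
Disallow: /old-offers/
```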
What would be a possible solution to this problem? I would also like to know how a page can end up in the index despite being blocked through robots.txt.