We have a portal with more than 100,000 HTML pages. Robots crawl almost all sections, but not every page gets crawled, so many pages are left out. I want to give different revisit instructions to different sections so that, over time, all pages get crawled. I am not very knowledgeable about robots.txt. Can anybody help me with a <b>revisit instruction</b> in robots.txt, or anything else that ensures <b>all pages are indexed</b>?
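
From what I have read, robots.txt itself has no standard revisit directive (a "Revisit-After" rule is sometimes mentioned on forums, but major crawlers do not honor it), so I am guessing an XML sitemap is the closer fit for per-section revisit hints. Below is a minimal sketch of what I have in mind, assuming the sitemaps.org protocol; the URLs, date, and frequencies are placeholders for my own sections:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Frequently updated section: hint that crawlers should revisit often -->
  <url>
    <loc>http://www.example.com/news/index.html</loc>
    <lastmod>2009-06-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- Mostly static archive section: a monthly revisit hint is enough -->
  <url>
    <loc>http://www.example.com/archive/page1.html</loc>
    <changefreq>monthly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```

I understand that `<changefreq>` and `<priority>` are only hints that search engines may ignore, that a single sitemap file is limited to 50,000 URLs (so for 100,000+ pages I would need a sitemap index file pointing to several sitemaps), and that a `Sitemap:` line in robots.txt can tell crawlers where to find it. Is this the right direction, or is there something in robots.txt itself that I am missing?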