We have a portal with more than 100,000 HTML pages. Almost all sections are crawled by robots, but not every page gets crawled, so many pages are left out. I would therefore like to give different revisit instructions for different sections so that, over a period of time, all pages get crawled. I am not very knowledgeable about robots.txt. Can anybody help me with a <b>revisit instruction</b> in robots.txt, or anything else that ensures <b>all pages are indexed</b>?
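From what I have read so far, there is no standard "revisit" directive in robots.txt itself; the closest things seem to be the non-standard Crawl-delay directive and a Sitemap line pointing crawlers at a full list of URLs. Is something like the following on the right track? (example.com is just a placeholder for our domain):

```
# robots.txt — sketch only; example.com is a placeholder domain
User-agent: *
# Non-standard directive: honored by some crawlers (e.g. Bing),
# ignored by Google
Crawl-delay: 10

# Point crawlers at a sitemap listing every page
Sitemap: https://www.example.com/sitemap_index.xml
```

My understanding is that per-section revisit hints would then go in the sitemap itself (via <changefreq> and <priority>), though I gather most search engines treat those only as hints, not instructions.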
Thanks. However, I have observed that some pages appear in the index and then, after some time, disappear. Because of this, at any given time hardly 25% to 40% of the pages are fetched. Can you advise me how to overcome this?