Forum Moderators: goodroi
Our robots.txt file currently contains the following tags:

<meta name="robots" content="all,follow">
<meta name="distribution" content="global">
<meta name="revisit-after" content="10 days">
We are now creating a dynamic section of their site (i.e. it changes a lot, though it is not database driven) and we want the spiders to revisit it every day or even more frequently. The site is PR8 so that should not be a problem, but they want to keep the rule that the spider visits the rest of the site every 10 days. The section we want the spider to revisit more frequently will be in its own directory, linked from the home page, so the URL on the home page will point to www.domainname.com/directory. I assume the spiders need to visit both the home page and all the pages within this directory in order for the directory to gain the benefit of PageRank / rating from the home page link. 3 questions:
1. Is it possible?
2. If so, what is the command?
3. Will the search engines pick up this command immediately, i.e. do the spiders look for robots.txt every time they revisit or is it just an occasional thing?
Really? Those tags should not be in there. What you have there belongs in the head of your document. Think of robots.txt as an exclusion protocol; it won't make the spidering process any quicker.
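For reference, a rough sketch of where those meta tags belong (the page title and content here are placeholders, not from your site):

```html
<!-- Robots meta tags go in the <head> of each HTML page, not in robots.txt -->
<html>
<head>
  <title>Example page</title>
  <meta name="robots" content="all,follow">
  <meta name="revisit-after" content="10 days">
</head>
<body>
  ...
</body>
</html>
```

Bear in mind that even placed correctly, the revisit-after tag is ignored by the major engines.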
>>and we want the spiders to revisit every day or even more frequently.
You mention every ten days, but I am afraid you can't really dictate how often they come. A simple way to get crawled more regularly is to add fresh content regularly and get a few links into that section.
>> benefit of PageRank / rating from the home page link.
Forget PageRank; you are after being crawled. If you have enough beef on your page, you will rank regardless of it. :)
If you want Google and others to crawl all your pages, you don't really need a robots.txt file at all. It is best to have one though, even if it is just blank.
I would be tempted to lose the top and bottom ones of the three you have, but put whatever you keep where it should be :). I don't know what the global one does. Never used it.
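If you do keep a robots.txt, a minimal one that allows full crawling looks like this (a sketch; the file lives at the root of the domain, e.g. www.domainname.com/robots.txt):

```
# robots.txt — applies to all crawlers
# An empty Disallow line means nothing is off-limits
User-agent: *
Disallow:
```

An empty file achieves the same thing; the point of having one at all is simply that crawlers request it, and serving a real file avoids filling your logs with 404s.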
You can't (unless you pay some engines for frequent spidering). The revisit tag is completely spurious IMO.
There is no way to dictate to spiders how often you want them to crawl. They work out frequency themselves.