Forum Moderators: open
I have a site that is indexed by the googlebot every day, but I want the site to be indexed only once a week (every 7 days).
I tried the meta tag <META NAME="REVISIT-AFTER" CONTENT="7 Days">, but that doesn't work; Googlebot still visits every day. With a robots.txt you can block the googlebot, but I don't think you can tell Googlebot to come back after seven days.
Are there scripts for this?
GeniusGeri
P.S. Sorry for my bad English. I hope you understand the question.
Googlebot visits some pages more often than others. PageRank is known to be involved, along with inclusion in the Open Directory or Yahoo!.
I don't know of any other way to alter the crawl frequency, and I have yet to find a public crawler that honors META revisit-after.
I'm not sure why your daily indexing poses a problem, but I suppose you could cloak the page and give the spider just the weekly version you want to stick with. Welcome to WebmasterWorld!
Thx for the answers. I'll explain the reason for my question: at the moment I have 80 domains, all of which are indexed by the googlebot every day. Of course I am happy with this, but my server is now very, very busy. That's why I want Googlebot to index only once a week.
Can I conclude that there are no scripts for this?
I also tried the following: I created a robots.txt that blocked the googlebot. After a week I deleted the robots.txt, and the googlebot came back immediately. But I don't think this is a good approach.
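That manual add/delete cycle could be automated by serving robots.txt dynamically, so that Googlebot is disallowed every day except one chosen weekday. Below is a minimal sketch of the idea in Python; the choice of Monday as the crawl day, and the `robots_txt_body` helper itself, are hypothetical. Note that Googlebot may cache robots.txt for up to a day, so the open window is approximate, and blocking a crawler only stops fetching, it does not by itself spread the load evenly.

```python
from datetime import date

# Hypothetical choice: allow crawling only on Mondays (weekday() == 0).
CRAWL_DAY = 0

def robots_txt_body(today: date) -> str:
    """Return the robots.txt content to serve on a given date.

    On CRAWL_DAY an empty Disallow is served, which permits crawling;
    on every other day Googlebot is disallowed from the whole site.
    """
    if today.weekday() == CRAWL_DAY:
        return "User-agent: Googlebot\nDisallow:\n"
    return "User-agent: Googlebot\nDisallow: /\n"

# Example: 2004-01-05 was a Monday, 2004-01-06 a Tuesday.
print(robots_txt_body(date(2004, 1, 5)))  # crawling allowed
print(robots_txt_body(date(2004, 1, 6)))  # crawling blocked
```

A script like this would be wired up so that requests for /robots.txt are answered by the generated text instead of a static file (e.g. via a rewrite rule to a CGI script), which removes the need to create and delete the file by hand each week.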
GeniusGeri