Forum Moderators: open
1) I could make all the changes at once, regardless of when Google updates the site.
2) I could make the changes gradually, over the course of a week or so, to fool Google into thinking that I am continually updating the site, so that it comes back to update the site more often.
I know that updating one page continually means that Google updates it more often, but could the same principle work for different pages of the same site?
However, a similar approach might help keep the pages fresh if you finish the redesign just before the deep crawl and want the fresh page listings to keep showing in Google until the next update, three or four weeks later.
With Google, everything is page-related, not site-related. Each page has its own PageRank, and each page's freshbot revisit schedule is set according to how often that page changes, along with other factors such as PR. The only exception to page-by-page treatment that I know of is a domain-wide ban for spamming.
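No one outside Google knows the actual scheduling algorithm, but a per-page revisit interval that adapts to observed changes is one plausible model. This is a hypothetical sketch only; the class name, bounds, and multipliers are all invented for illustration:

```python
from dataclasses import dataclass

# Hypothetical model only: Google's real scheduler is not public.
# The idea is that a page's revisit interval shrinks when a crawl
# finds changes and grows when the page is unchanged, within bounds.

@dataclass
class PageSchedule:
    interval_days: float = 7.0   # current revisit interval for this page
    MIN_DAYS = 1.0               # never revisit more than daily
    MAX_DAYS = 30.0              # never wait longer than a month

    def record_crawl(self, changed: bool) -> float:
        """Update and return the revisit interval after a crawl."""
        if changed:
            self.interval_days = max(self.MIN_DAYS, self.interval_days / 2)
        else:
            self.interval_days = min(self.MAX_DAYS, self.interval_days * 1.5)
        return self.interval_days

page = PageSchedule()
page.record_crawl(changed=True)   # interval drops toward daily visits
page.record_crawl(changed=False)  # interval creeps back up
```

Under a scheme like this, a page you update continually converges on near-daily visits, while a static page drifts out toward the monthly deep crawl, which matches the page-by-page behavior described above.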
I don't know what algorithm Google uses to detect fresh pages, but I can vouch for its accuracy: it won't pay attention to minor changes or rearrangements, but it has been amazingly keen at picking up truly new content on my sites.
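Pure guesswork as to how that distinction might be drawn, but one standard way to ignore minor edits while catching truly new content is to compare pages as sets of word shingles, so small tweaks still overlap heavily with the old version. Every name and threshold here is an assumption, not Google's method:

```python
import re

# Illustrative only: compare two versions of a page as *sets* of
# k-word shingles. Minor edits leave most shingles intact, while
# genuinely new text produces a mostly disjoint shingle set.

def shingles(text, k=4):
    """Return the set of k-word shingles in a text."""
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def is_fresh(old, new, threshold=0.5):
    """Treat the page as 'fresh' only if shingle overlap is low."""
    a, b = shingles(old), shingles(new)
    if not a or not b:
        return bool(b)
    jaccard = len(a & b) / len(a | b)   # 1.0 = identical, 0.0 = disjoint
    return jaccard < threshold
```

An unchanged or lightly edited page scores a high Jaccard overlap and is skipped; a page whose text is mostly new scores low and counts as fresh.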
Just my observations...
Jim
You could probably make all the changes at once and it wouldn't make any difference. To keep the freshbot interested you'll need to keep at least a couple of pages regularly updated; that might keep the bot looking at the rest of them.
It was like a one-two-three chain progression with homepage links in that particular case.
My thoughts, Marcia:
1. Most often they are the pages that have received the most recent new links pointing to them.
With me, those are the index page, the sitemap and the feedback page ;)
What the freshbot is trying to say is: hey, this page must be very topical at the moment, because it's got many new inbound links pointing to it recently.
2. Also, Google's freshbot could identify some important pages within sites (those with the highest PR, and maybe links from DMOZ/Yahoo) that potentially have the highest chance of containing authoritative new links to new (fresh) pages.
Freshbot can only find new content pages if some existing page links to them, and the highest-PR pages within the site are the most likely to carry those links. Therefore freshbot must spider these important pages frequently to check them for new links.
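The discovery process described above can be sketched as a crawl frontier that always visits the highest-PR known page next and queues any newly discovered links behind it. This is an illustrative model, not Google's actual logic; the function and data names are made up:

```python
import heapq

# Illustrative sketch, not Google's actual algorithm: visit the
# highest-PR known pages first, and queue newly discovered links
# behind them, so new content hanging off important pages is found fast.

def crawl_order(pagerank, links, seeds):
    """Yield pages in the order a PR-greedy freshbot might visit them."""
    # heapq is a min-heap, so negate PR to pop the highest-PR page first
    frontier = [(-pagerank.get(p, 0.0), p) for p in seeds]
    heapq.heapify(frontier)
    seen = set(seeds)
    while frontier:
        _, page = heapq.heappop(frontier)
        yield page
        for linked in links.get(page, []):
            if linked not in seen:        # a freshly discovered page
                seen.add(linked)
                heapq.heappush(frontier, (-pagerank.get(linked, 0.0), linked))

# Toy site: homepage links to the sitemap, which links to a new article.
pagerank = {"/": 6.0, "/sitemap": 5.0, "/new-article": 0.0}
links = {"/": ["/sitemap"], "/sitemap": ["/new-article"]}
print(list(crawl_order(pagerank, links, ["/"])))
# ['/', '/sitemap', '/new-article']
```

This matches the "one-two-three chain progression" mentioned earlier: the new page is only reachable because a frequently crawled, high-PR page links toward it.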