Let's just pretend for a moment that Googlebot updates the index on the first of a month, fetches your page on the 2nd, and that you update the page on the 3rd. The next index update occurs about four weeks later, using the content fetched just before your edit. It takes a further four weeks for another update, which uses a fetch made after the page was updated. Hence two months instead of one.
Of course it doesn't tend to happen quite like that, but it can. I find that newly linked sites tend to need two updates, while the link itself is often in Google after one update.
Some of the fellows here, myself included, reported heavy crawling about two days ago. It stopped (for me after about an hour); since then, silence.
One strange thing, though: I had some hits from google.de and google.com on a site that is brand new and was spidered in early July... pre-dance syndrome, I guess. That was about two hours ago, for a very short period... it shouldn't be long before another log-watching day with a nice little bottle of wine and some friends in the house ;-) My guess is the warm-up starts within the next 24 hours...
Haven't had any more Googlebot visits since yesterday, but he/she/it did take all my current pages, so I don't expect more than the odd refresh visit every couple of days until next month.
Unless this crawl is similar to the partial crawl I had around 26th May, just before the main update. That was followed by a deep crawl after the update.
Soooooooooooooo, nobody really knows for sure do they...the individualistic, anecdotal, off-topic responses fill the void but DO get somewhat tiresome.
In my example the page is updated just after it's fetched, near the start of the cycle. The change would be expected to show up in about two months.
In your example it's updated just before it's fetched, near the start of the cycle. The change would be expected to show up in about one month.
My other comment, about the change to a page appearing in the next update and the benefit to the page it links to appearing in the second, has to do with the case where the updated page is fetched several times during the crawl. For the linked-to page to benefit, the link seems to need to be in place at the first fetch of the crawl. This is certainly anecdotal, but I've been seeing it for a while so it seemed worth a mention.
> off-topic
I think we're the ones on a side topic in this thread.
CIML, I have found the same to be true - a very important point that clarifies the topic. The only gray area is: when is the first fetch? Last cycle there was a crawl shortly before the update started and another shortly after.
Are you one of those people who drives while talking on a cell phone, smoking a cigarette, drinking a soft drink, and eating a burger?
Haven't noticed any deep crawl recently.
Does PR affect how often Googlebot checks sites for updates?
I assume if a site has a high PR, and frequent updating (news feeds for example) Google would do minor updates in the index for those sites in between deep crawls?
I've found that the more time I spend with those logs the more creative ideas I've come up with for improving my site and improving my focused traffic.
Mark_Candiotti, maybe this will help you to understand the update/crawl cycle, at least the way it used to work:
==================
Example 1 - my favourite version of events. :)
* Start of Month 1. Google index is updated based on the crawl that happened about a month ago, near the beginning of Month 0.
* I see changes to my SERPs and PR, and find out what I did right and what I did wrong with last month's changes. That lets me make some educated guesses on what to change for the next index update. I use this time to fine-tune my site, and upload a number of new, improved, and shining pages.
* 5 days later. The new crawl starts, and Googlebot comes and fetches my newly altered pages.
* End of Month 1. Google index is updated with the crawl that happened just under a month ago, at the beginning of Month 1.
Result - I get to see the results of my updates in a month.
==================
Example 2 - Bad Google! (j/k, that's from an SEO perspective. :)
* Start of Month 1. The new crawl starts before the index is updated, and Googlebot comes and fetches my pages. I haven't seen the results of the latest update yet, so I haven't prepared any changes.
* A few days later. Google index is updated based on the crawl that happened about a month ago, near the beginning of Month 0.
* Now I get to see what changes in my SERPs have occurred, and I make changes. I have to wait until the crawl in Month 2 has completed (probably about a month away) before Google will pick up the newly updated pages, and the index update from that crawl will only happen about another month after that.
* End of Month 1. Google index is updated with the crawl that happened just under a month ago, at the beginning of Month 1.
Result - I get to see the results of my updates in about two months.
==================
I've not mentioned the effect of any mid-cycle updates to keep things simple.
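The timing arithmetic in the two examples above can be sketched as a toy simulation. The 30-day cycle and the ~28-day lag between a fetch and the index update that publishes it are illustrative assumptions for the model, not figures Google has ever published:

```python
# Toy model of the crawl/update cycle described in the two examples above.
# CYCLE and LAG are assumed round numbers, not anything Google has confirmed.

CYCLE = 30   # assumed days between successive monthly crawls
LAG = 28     # assumed days between a fetch and the update that publishes it

def days_until_visible(edit_day, first_crawl_day):
    """Roughly how long after editing a page the change shows in the index.
    Googlebot's first fetch of the edited page is the first crawl day on or
    after edit_day; the index update that publishes that fetch follows
    about a month later."""
    fetch = first_crawl_day
    while fetch < edit_day:
        fetch += CYCLE           # missed this cycle's fetch; wait for the next
    return (fetch + LAG) - edit_day

# Example 1: index updates day 0, crawl starts day 5, I upload changes day 3.
print(days_until_visible(edit_day=3, first_crawl_day=5))   # 30 -> about a month

# Example 2: crawl starts day 0, index updates day 3, I edit on day 4.
# I just missed this cycle's fetch, so the change waits for the day-30
# crawl and the update roughly a month after that.
print(days_until_visible(edit_day=4, first_crawl_day=0))   # 54 -> about two months
```

The two calls reproduce the "about a month" and "about two months" results of Examples 1 and 2: everything hinges on whether the edit lands before or after that cycle's fetch.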
Huppy99, Smokin, I get visits from Googlebot generally twice a week with requests for robots.txt and my index page, perhaps because this is the only page that changes regularly.
As for crawl order, I think I'm a high PR5, and I tend to get crawled later than many of the higher-PR websites (relative to me) belonging to members on this board. I've also noticed that lower-PR sites tend to get crawled after me. I'm not sure there's a hard and fast rule here, and this month Googlebot's main deep crawl seems to be different, if that is in fact what I had this week. Anyone remember the May 26 (AFAIK that was the date) deep crawl that affected only some sites here? Maybe the big crawl is still coming after the update, and this week's crawl is like the May 26 one.