Forum Moderators: Robert Charlton & goodroi
Google refreshes the cache on its own schedule, based on its re-spidering of your page. The cache doesn't necessarily refresh every time a URL gets spidered, but a refresh can only happen after a re-spidering.
There's nothing a site owner can do directly to force a cache refresh. But you can develop a site that Google wants to spider more frequently: one that attracts many good backlinks and good traffic, for example.
The crawl team has its own algorithm to set the "crawl budget," and it's pretty complex. Clearly, having updated pages when Googlebot stops by helps. To that end, be sure your server responds accurately to If-Modified-Since requests.
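To illustrate the If-Modified-Since point, here's a minimal sketch of the conditional-GET logic a server should apply. The function name and setup are just for illustration; the behavior follows the HTTP convention: return 304 Not Modified when the page hasn't changed since the date the crawler sends, otherwise return 200 with the full page.

```python
from email.utils import parsedate_to_datetime
from datetime import datetime, timezone

def conditional_get_status(last_modified, if_modified_since):
    """Decide the status code for a conditional GET.

    last_modified: when the page content last changed (timezone-aware UTC).
    if_modified_since: raw If-Modified-Since header value, or None.
    """
    if if_modified_since:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            since = None
        # 304 Not Modified: page unchanged since the crawler's last visit,
        # so Googlebot can skip re-downloading the body
        if since is not None and last_modified <= since:
            return 304
    # 200 OK: serve the full page with a fresh Last-Modified header
    return 200

page_changed = datetime(2009, 3, 1, 12, 0, tzinfo=timezone.utc)
print(conditional_get_status(page_changed, "Sun, 15 Mar 2009 00:00:00 GMT"))  # 304
print(conditional_get_status(page_changed, "Sun, 01 Feb 2009 00:00:00 GMT"))  # 200
```

An accurate 304 tells Googlebot its copy is still current, which keeps your crawl budget available for pages that actually changed.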
An XML sitemap can help, and if your site is a news/blog-style CMS, then pinging Google whenever the RSS feed is updated can also help. There's a host of factors involved, and they vary with the type of site.
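For reference, a minimal sitemap entry looks like this (the URL and date are made up); the `<lastmod>` element is what signals to Google that a page has changed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-page.html</loc>
    <!-- update this whenever the page content actually changes -->
    <lastmod>2009-03-01</lastmod>
  </url>
</urlset>
```

Keeping `<lastmod>` honest matters: if it says a page changed when it didn't, Google learns to ignore it.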