|Google Fresh Listings|
According to Google [google.com]:
|In addition, Google refreshes millions of web pages every day to ensure that Google users have access to the most current information. |
Google fetches pages listed in the Open Directory and Yahoo!, along with some high-PageRank pages, on roughly a daily basis. If the content differs from the previous fetch, then a 'fresh' listing may be given temporarily instead of the standard listing from the last full Google update [webmasterworld.com].
These 'fresh' listings can be identified by a date, in green, alongside the page's URL in the search results.
Webmasters are often confused when the 'fresh' listing expires. If a new 'fresh' listing is not given (because the page wasn't fetched that day or because it hasn't been updated in the last day or so) then the listing reverts to the one from the last full update. If the page was not included in the last full update then it disappears, only to reappear if it is 'fresh' listed. In times of Google paranoia, this can look like a penalty.
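The lifecycle described above could be sketched roughly like this. This is purely an illustration of the described behaviour, not Google's actual implementation; all names and the use of an MD5 digest are assumptions:

```python
import hashlib

# Illustrative sketch of the fresh-listing behaviour: a page fetched
# daily gets a temporary date-stamped listing only when its content
# differs from the previous fetch; otherwise results fall back to the
# last full update (or vanish if the page wasn't included in it).
last_full_update = {}   # url -> snapshot from the last full Google update
previous_hash = {}      # url -> content hash at the previous fetch
fresh = {}              # url -> (snapshot, date) shown temporarily

def daily_fetch(url, html, today):
    digest = hashlib.md5(html.encode()).hexdigest()
    if previous_hash.get(url) != digest:
        fresh[url] = (html, today)   # content changed -> fresh listing
    previous_hash[url] = digest

def listing(url, today):
    if url in fresh and fresh[url][1] == today:
        return fresh[url][0]         # temporary 'fresh' listing with date
    # fresh listing expired: revert to the last full update, or
    # disappear entirely if the page wasn't included in it
    return last_full_update.get(url)
```

This also shows why a page can seem to "disappear" between fresh listings: with nothing in the last full update, there is simply nothing to fall back to.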
The coming and going of 'fresh' listings is distinct from the Everflux [webmasterworld.com], which is a description of general search result changes during the Google cycle.
Google certainly doesn't look at my small PR 4 site every day - and it's in DMOZ twice! I get a visit from Google about once a month, when it comes and spiders all 24 pages of my site.
Anyone know why it wouldn't visit more frequently?
I was a bit confused about how the fresh tag and the everflux were related so thank you for describing it so nicely.
Good timing too because I just got my first fresh tag listing today :) Since these pages aren't listed in dmoz or yahoo yet and are only pr5 (and one not even indexed yet), I was pleasantly surprised :)
One thing I've noticed with Fresh Dates this update is the fallback cache. In the previous two updates, there was a period after the dance where the Fresh Dates appeared for a couple of weeks. When the Fresh Date was not there, the cached version was the page's state from the last crawl.
This update, though, has changed that. Now when the Fresh Date is not there, the cached page appears to be the one that was showing as of the last Fresh Date. This may not be happening for everyone, but I see it happening with a couple of sites that I manage. Both of those sites have high PR and Google visits them regularly. Even so, the last two updates did not exhibit this behavior: when Fresh Dates were not there, the cache was based on the last crawl date, not the last Fresh Date. Did that make sense?
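The old-versus-new fallback behaviour being described might be summed up in a small sketch (the function and key names here are illustrative guesses, not anything Google documents):

```python
# Pick which cached copy to show when a page's Fresh Date has expired.
# caches maps labels to snapshots:
#   "latest_fresh"    -> copy from the most recent Fresh Date
#   "last_full_crawl" -> copy from the last full (deep) crawl
def cached_page(caches, fresh_date_showing, new_behaviour=True):
    if fresh_date_showing:
        return caches["latest_fresh"]   # fresh listing is live
    if new_behaviour and "latest_fresh" in caches:
        # this update: fall back to the last Fresh Date's copy
        return caches["latest_fresh"]
    # previous two updates: fall back to the last full-crawl copy
    return caches["last_full_crawl"]
```

The difference observers are reporting is just which snapshot survives once the green date disappears.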
I second that motion, pageoneresults!
I would add that
(1) Some of my pages used to fluctuate between a 2-3 month old cached copy and a totally fresh one. Now the cache always shows the latest copy.
(2) I'm noticing that Google is now giving fresh tags to pages that are linked directly from high-PageRank sites, even if they are on a different domain. For example, a PR 7 site with a direct link to a forum post on a PR 4 site. Previously I wasn't seeing Google refresh external links, but now they seem to be doing that.
(3) There seems to be a pattern: two days with lots of fresh tags throughout the Google index, then two days with no fresh tags to be found anywhere.
|[quote]If the content is different from the previous fetch, then a 'fresh' listing may be given temporarily instead of the standard listing from the last full Google Update.[/quote]|
Ciml, I am not sure I understand this point. Are you saying that the content has to be [b]different[/b] on the page to be granted a date-stamp plus short-term new cache/index?
My own site experience says this would not be the case, but you probably mean something else.
Or take [b]His Royal Freshness[/b] - Bill's own Microsoft: 3,030 fresh pages stamped 4 September (24,100 fresh pages had the 3 September stamp this morning), according to http://www.researchbuzz.com/toolbox/goofresh.shtml (using the 'last 7 days' option). I doubt these have all changed recently?
BTW, I think that after Microsoft.com, Usatoday and News.bbc.co.uk (16,000 pages this morning stamped 3 September) would be the runners-up in quantity of fresh pages... but maybe there are even fresher sites around.
> One thing I've noticed with Fresh Dates this update is the fall back cache. In the previous two updates, we see a period after the dance where the Fresh Dates appear for a couple of weeks. When the Fresh Date is not there, the cached version was the page state during the last crawl.
> This update though has changed. Now when the Fresh Date is not there, the cached page appears to be the one that was showing based on the last Fresh Date.
This sounds like a major (and welcome) change. What you and dvduval are experiencing is a much better approach, but presumably takes some extra resources.
Maybe this thread should be "Google Fresh Listings Now Smarter"?
> Are you saying that the content has to be different on the page to be granted a date-stamp plus short-term new cache/index?
Yes, that's what I meant. I have some daily-crawled pages. They get the dates only when they change.
mortalfrog, I don't know why you don't get the daily crawl. Every page I've checked that's in DMoz gets fetched often.
|The coming and going of 'fresh' listings is distinct from the Everflux, which is a description of general search result changes during the Google cycle. |
But isn't the coming and going of "fresh" listings the strongest effect on search result changes?
At least if NFFC's description [webmasterworld.com] is applied to the millions of pages, mentioned earlier, that Google refreshes every day (if their content changes).
If I add a distinct new word to my regularly fresh-crawled pages, it affects (the number of) search results for a query on that word.
|Yes, that's what I meant. I have some daily-crawled pages. They get the dates only when they change |
What are other peoples experiences here?
Surely a content change is not a strict requirement? I now have a date stamp on a page that I am 99% sure has not been changed for weeks.
I also still doubt that Microsoft is changing the content of 24,000 pages in a few days... recurring every few days?
I am just getting freshly confused here..;)
PG1R - see the same thing after this months update?
(I was just trying to come to terms with its significance)
My pages seem to get freshed regardless of whether they've been changed, at least for each new date. They have changed since the last major crawl, however, so maybe when they're fetched they are compared against that version of the page. That wouldn't seem to make sense given pageoneresults' experiences, though.
What would this mean for the Internet infrastructure if everyone suddenly realizes that they will get better positions in the SERPs by uploading new content regularly? How will Gbody handle it if she has to crawl and slither over the entire web every day? It also seems the two could compound each other.
Google's toolbar also seems to be related to the freshness checks. At least that's what I think after logging some of its PageRank requests: they contain something like &freshness_check=3Bzlz2-sa4nsRQLrNlU2n.
It's probably a hash Google uses to determine whether the page looks to have changed; if it does, they may use a different, more sophisticated algorithm to check for sure.
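That guessed two-stage check could look something like this. To be clear, this is pure speculation built on the speculation above: the function names and the idea of stripping markup in the second pass are my own assumptions, not anything known about Google's algorithm:

```python
import hashlib
import re

def quick_fingerprint(html):
    # cheap first pass: a short hash of the raw bytes, the sort of
    # token a toolbar might send along with a PageRank request
    return hashlib.md5(html.encode()).hexdigest()[:16]

def meaningful_change(old_html, new_html):
    # second, more careful pass: ignore markup and whitespace so that
    # trivial differences (rotating ads, timestamps in comments)
    # don't count as a real content change
    strip = lambda h: re.sub(r"\s+", " ", re.sub(r"<[^>]+>", " ", h)).strip()
    return strip(old_html) != strip(new_html)

def page_changed(old_html, new_html):
    if quick_fingerprint(old_html) == quick_fingerprint(new_html):
        return False    # fingerprints match: assume unchanged
    return meaningful_change(old_html, new_html)
```

A scheme like this would also explain dynamic pages tripping the first pass (raw bytes differ on every fetch) while the deeper check decides whether the change actually matters.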
I think that because of what I read: after the August update, Google started showing the latest version of pages that have changed, or the update-time version if they haven't. My pages haven't changed much (but they are dynamic, so the first pass of the freshness check must come out positive), yet Google shows fresh copies only for a couple of days.