
Sitemaps - does Google favor higher changefreq settings?

     
1:10 am on Mar 12, 2008 (gmt 0)

10+ Year Member



I have several sites; the log files for a few of them seem to show more Googlebot visits to the sites whose content updates more frequently.

I use Google Sitemaps, and it also appears that Google visits the sites with <changefreq> set to "daily" or "weekly" more often.
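
For context, this is roughly the kind of sitemap <url> entry I mean. A minimal sketch in Python that emits one entry (the URL, date, helper name and values are made-up examples, not my actual sites):

from xml.sax.saxutils import escape

def sitemap_url_entry(loc, lastmod, changefreq="weekly", priority="0.5"):
    # Valid <changefreq> values in the Sitemaps protocol:
    # always, hourly, daily, weekly, monthly, yearly, never
    return (
        "  <url>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        f"    <changefreq>{changefreq}</changefreq>\n"
        f"    <priority>{priority}</priority>\n"
        "  </url>"
    )

print(sitemap_url_entry("http://www.example.com/news/", "2008-03-12", "daily"))

The question is whether setting that <changefreq> to "daily" actually pulls Googlebot in more often.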

Moreover, my sites that update more often definitely appear to be rising in the SERPs faster and more consistently than those with mostly static pages.

Is there a correlation between Googlebot visits and change frequency?
And is anyone else seeing sites with more fresh content faring better in the rankings?

4:39 am on Mar 12, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Not in my experience. The crawl team has its own algorithm for crawl frequency; it takes the XML settings as a suggestion, but it still makes its own decision.

In terms of ranking, there are times (like now) when it seems that brand new URLs do amazingly well right out of the box. It's possible that updated pages are also getting such a boost, but I haven't noticed that so far. Mostly, Google expects some kinds of pages to update frequently (the home page of a newspaper), yet other types could actually get penalized for "playing around" too much (the actual news story after it's been published). Google has a lot of data from which to make such decisions.

I wouldn't want to start a rush for the "golden ring of ranking" by giving the impression that high <changefreq> settings or meaningless and frequent page changes are the secret key to better ranking.

If anyone wants to do a study on this, I'd be interested in the results. But I would also advise using disposable domains ;)

 
