Not in my experience. The crawl team has its own algorithm for crawl frequency; it takes the XML sitemap settings as a suggestion, but it still makes its own decision.
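For reference, this is roughly what those XML settings look like in a sitemap entry (the URL and date here are just placeholders, and `changefreq` is a hint, not a command):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- placeholder URL for illustration -->
    <loc>https://example.com/news/</loc>
    <!-- valid values: always, hourly, daily, weekly, monthly, yearly, never -->
    <changefreq>hourly</changefreq>
    <!-- placeholder date; W3C Datetime format -->
    <lastmod>2011-05-10</lastmod>
  </url>
</urlset>
```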
In terms of ranking, there are times (like now) when brand new URLs seem to do amazingly well out of the box. It's possible that updated pages are also getting such a boost, but I haven't noticed that so far. Mostly, Google expects some kinds of pages to update frequently (the home page of a newspaper), yet other types could actually get penalized for "playing around" too much (the actual news story after it's been published). Google has a lot of data from which to make such decisions.
I wouldn't want to start a rush for the "golden ring of ranking" by giving the impression that high `<changefreq>` settings or frequent, meaningless page changes are the secret key to better ranking.
If anyone wants to do a study on this, I'd be interested in the results. But I would also advise using disposable domains ;)