In Webmaster Tools there's a whole section about crawl rate -- how often your site is being crawled and accessed by the Google bots.
It even has some gizmo where you can turn the crawl rate "down", and it seems to measure things in "seconds".
OK -- that's all well and good -- but if my site is CONSTANTLY being "crawled" by these "bots", why does Google utterly fail to update it in the listings?
My listing still shows some old, putrid, garbage title and description that I changed on the site days ago. If this stupid bot is crawling constantly, why doesn't the listing update faster?
Something changes on the site, and it's not reflected in the listings for days (sometimes weeks or months), yet the stupid thing is supposedly being "crawled" hundreds of times a day?
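For what it's worth, here's roughly the kind of log check I mean when I say the bot hits the site hundreds of times a day -- a minimal sketch, assuming an Apache-style "combined" access log (the filename and format are assumptions, adjust for your server):

```python
# Sketch: count Googlebot hits per day from an Apache-style access log.
# Assumes the common "combined" log format, where the timestamp appears
# in brackets like [10/Oct/2023:13:55:36 -0700] and the User-Agent string
# is logged at the end of each line.
import re
from collections import Counter

def googlebot_hits_per_day(log_lines):
    """Count lines whose User-Agent mentions Googlebot, keyed by date."""
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # skip ordinary visitors and other bots
        # Pull the dd/Mon/yyyy date out of the bracketed timestamp.
        m = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if m:
            hits[m.group(1)] += 1
    return hits

# Example: feed it the log file line by line.
# with open("/var/log/apache2/access.log") as f:   # path is an assumption
#     print(googlebot_hits_per_day(f))
```

Note this only tells you how often Googlebot *fetches* pages -- as I'm learning the hard way, fetching and actually updating the search listing are apparently two very different things.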
Again, I'm new to all this, but man, it's frustrating how illogical it all is.