About 90 pages of our 500-page site have been in Google's index for a while now. Are newer pages (which are receiving Googlebot visits) being added to the index in Google's present state, or will we have to wait for a traditional "update"?
Msg#: 15776 posted 10:22 am on Aug 4, 2003 (gmt 0)
I'm with Marcia on the description. Google does add pages each and every day to its index, and has done so for ages.
The daily adds were usually a product of what was known as freshbot. Pages got loaded for a day or so and then appeared to drop out, and didn't really stick in the index until the deepbot had appeared and the dance happened.
GG said a while back that we should expect freshbot to start behaving like deepbot, and I think that's happening now. People are referring to the new bot as freshdeepbot, but I'm sure that moniker will disappear once we're used to the new activity.
There are numerous posts here saying that such and such a site appeared within 2 or 3 days of publishing, and those pages have stuck around. In other words, freshbot is acting like the old deepbot and updating the index constantly.
I still disagree: it did NOT add pages each and every day, i.e. 31 days out of 31 or 30 out of 30. That wasn't the case during updates, and it wasn't the case before them either; freshbot data went in and came out again before the update.
And even now, in this supposed new era, data is NOT added (i.e. visible) every day. I've gone through shedloads of results!
Our site was recently down for a day while we moved servers, with all pages redirected to a note saying "Under maintenance". Google spidered us that day, and after about a week our number of indexed pages dropped from 13,000 to 112, with the main result being "Under maintenance". For some reason not all our pages were removed; maybe we came back up before Google had finished spidering that one page. Whatever the cause, since then I've checked the number of pages listed from our site every day, and it has most definitely been going up by roughly 50 a day. Googlebot's been pretty busy, so I'm hoping for a big jump at some point - fingers crossed.
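For what it's worth, the usual way to avoid this is to answer every request with HTTP 503 (Service Unavailable) plus a Retry-After header during downtime, so a crawler treats the outage as temporary rather than indexing the maintenance note as the page's content. Here's a minimal sketch using Python's standard library - the port and retry interval are placeholders, not anything from the post above:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every GET with 503 so crawlers treat the outage as temporary."""

    def do_GET(self):
        body = b"Under maintenance"
        self.send_response(503)                  # temporary failure, not real content
        self.send_header("Retry-After", "3600")  # hint: come back in an hour
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def run_maintenance_server(port=8000):
    # Serve the 503 response on all interfaces until interrupted.
    HTTPServer(("", port), MaintenanceHandler).serve_forever()
```

Humans still see the "Under maintenance" note, but a well-behaved bot should see the 503 status and skip updating its index for those URLs.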
Have to say though, if I was a web robot with 4 billion pages to check, seeing a site go from 13k+ pages to 1 would probably make me suspicious. My guess is that we were downgraded as a site somehow and are just going to have to sit and wait until Google's AI starts to trust us again.