4. How do you tell the difference between the deepbot and the freshbot?
The deepbot and the freshbot use different IPs.
The deepbot uses IPs which start with 216.*,
and the freshbot uses IPs which start with 64.*
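The prefix rule above (plus the 64.68.80.*/64.68.81.* ambiguity raised later in the thread) can be turned into a rough log-side check. A minimal sketch, assuming the 2003-era ranges quoted in this thread; the function name and labels are illustrative, not any official Google designation:

```python
# Rough classifier for Googlebot hits, based on the IP ranges quoted in this
# thread: 216.* = deepbot, 64.* = freshbot, with 64.68.80.*/64.68.81.*
# suspected of being deep crawls coming in from the 64 range.
# These ranges are historical and illustrative only.
def classify_googlebot(ip: str) -> str:
    octets = ip.split(".")
    if octets[0] == "216":
        return "deepbot"
    if octets[0] == "64":
        # 64.68.80.* and 64.68.81.* seem to be the "deeps" in the 64 range
        if octets[1] == "68" and octets[2] in ("80", "81"):
            return "possible deepbot (64.68.80/81)"
        return "freshbot"
    return "unknown"

print(classify_googlebot("216.239.46.5"))   # deepbot
print(classify_googlebot("64.68.82.7"))     # freshbot
print(classify_googlebot("64.68.80.12"))    # possible deepbot (64.68.80/81)
```

Obviously brittle - it trusts the thread's ranges as given - but enough to tag lines in an access log while watching a new site get picked up.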
Here's something interesting...
A site I put up about 10 days ago seems to have been included in the recent update, but has only been crawled by freshbot - the 64's.
Anyone got any experience of this happening? Or is this a fresh listing without a date? Or something else?
[edited by: Adam_C at 1:07 pm (utc) on Mar. 11, 2003]
From memory, it's the 64.68.80.* and 64.68.81.* ranges that seem to be the "deeps" coming in from the 64 range. Those could be wrong, though.
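If you want to check your own logs for hits from those suspected "deep" 64.68.80.*/64.68.81.* ranges, a one-liner will do. A sketch, assuming an Apache-style log where the client IP is the first field; the file name access.log and the sample lines are my own:

```shell
# Sample access-log lines (common log format, IP first); the IPs echo the
# ranges discussed in this thread.
cat > access.log <<'EOF'
64.68.80.12 - - [11/Mar/2003:10:00:00 +0000] "GET / HTTP/1.0" 200 1234
64.68.82.7 - - [11/Mar/2003:10:01:00 +0000] "GET /a HTTP/1.0" 200 999
216.239.46.5 - - [11/Mar/2003:10:02:00 +0000] "GET /b HTTP/1.0" 200 500
EOF

# Count requests whose source IP falls in 64.68.80.* or 64.68.81.*
grep -cE '^64\.68\.8[01]\.' access.log
```

Here only the 64.68.80.12 line matches, so the count is 1; the 64.68.82.* hit is left out, matching the distinction drawn in the thread.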
A few days later only 2 pages were left. Now, after the dance, we are back to 31 pages (which is still only part of the spidered pages), but the domain still shows a gray bar, so I guess it was not really included in the "deep" index until now...
A few hours ago 216.239.46.* was here for the first time, so I guess all will be fine in the end :)
The pages went up on Jan 19th, and 64.68.82.* picked them up a couple of days later.
I think Grumpus is on the right lines with his suggestion of deep 64.68's, or a possible gap being bridged between the deep and fresh bots.
<edit reason> Jan update, not Feb - i.e. end of Jan
[edited by: Adam_C at 4:01 pm (utc) on Mar. 11, 2003]
But in my case, with an already-listed site that has some totally new pages:
fresh listings without a date and no backlinks.
Hasn't this been in place for a while though - that is, that fresh pages can still appear in the SERPs without a date tag?
P.S. The 64.68.80 and 64.68.81 ranges are still a bit mysterious. I THINK they are deep crawlers, but I can't tell for certain, as there were several hits in those ranges very late last month and early this month...
I wonder how long it will take us to understand that fresh listings are better than the same listings for almost four weeks - not so much for us, but for Google's users!
If we all work cleanly with everything we have learned here at WebmasterWorld and from other resources, does it matter which spider hits our server? The main point is that our pages are available, and if we do things correctly, we appear in the top listings - whether with help from the freshbot or the deepbot.
The only difference might be that some pages are always in the index (deep) and some are not (fresh).
I assume this will change soon, e.g. pages spidered by the freshbot will become more stable in the SERPs.