"DeepBot 2" aka DeepFreshBot small FAQ

our new agent in town.

         

AthlonInside

8:39 am on Jun 20, 2003 (gmt 0)




Got this question from zafile in the update thread, so I would like to discuss my observations a bit in this small FAQ. Since this is more about the ongoing GoogleBot than the dedicated Esmeralda update, I have put it in a new thread.


Which backlink data is being applied to the current Esmeralda search results:
a. pre-Dominic
b. post-Dominic
c. mid-Esmeralda
Share your guess with us!

The answer is:

d. PRE and POST Dominic (mid-Esmeralda is impossible)

---------------------

Q: Traditional deep crawl vs fresh crawl?
A: The deep crawl gathers the data used to build a new index (what we call the update). The new index is where new backlinks, PRs and so on are calculated. Freshbot doesn't do much beyond bringing some freshness to the index; it can affect a site's ranking slightly by picking up changes in some on-page SEO factors.

Q: So what happens to the Esmeralda 'deep' crawl?
A: The deep crawl is now done by the freshbot. Our familiar DeepBot is retired.

Q: How did the *traditional* DeepBot build the new index?
A: After an update finishes (usually a few days later), Google sends out its agent (the DeepBot, reloaded) to crawl the entire web. That takes something like 1-2 weeks to complete, although most sites will only see it for 1 or 2 days. About one month later this data is used as the 'updated index', so the data in the index is almost a month old.

Q: What's the difference between the retired 'deep' bot and the newly hired 'DeepBot 2'?
A: 'Deep' data (used for the new index - calculating backlinks, PR and so on) is no longer crawled in a single pass by the retired deep bot. Instead it is crawled over a period of time (the freshbot characteristic). The pages that make it into the new index could have been crawled only a few days before the update, weeks before it, or even not crawled at all (in which case the version from the old update is used)! A toy sketch of this idea follows below.
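
To make that concrete, here is a minimal Python sketch of how I imagine the new index picks a version of each page: use the freshest crawl snapshot taken before the update, and fall back to the old update's copy for anything that wasn't re-crawled. All the names (build_new_index, crawl_log, old_index) and the data are hypothetical - this is my guess at the behaviour, not anything Google has published.

    from datetime import date

    def build_new_index(crawl_log, old_index, update_day):
        # For every URL, prefer the freshest snapshot crawled before the update;
        # if a page was never re-crawled this cycle, keep the old update's copy.
        new_index = {}
        for url in set(old_index) | set(crawl_log):
            snapshots = [s for s in crawl_log.get(url, []) if s["crawled"] <= update_day]
            if snapshots:
                # Could be days or weeks old, depending on when the rolling crawl last visited.
                new_index[url] = max(snapshots, key=lambda s: s["crawled"])
            elif url in old_index:
                # Not crawled this cycle: the stale copy survives into the new index.
                new_index[url] = old_index[url]
        return new_index

    # Made-up example: the main page was re-crawled in June, the links page was not.
    old_index = {
        "example.com/":           {"crawled": date(2003, 4, 10)},
        "example.com/links.html": {"crawled": date(2003, 4, 10)},
    }
    crawl_log = {"example.com/": [{"crawled": date(2003, 6, 14)}]}
    print(build_new_index(crawl_log, old_index, date(2003, 6, 20)))

In this toy run the main page enters the new index with its June snapshot while links.html keeps the April copy, which is exactly the mixed pre- and post-Dominic picture described above.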

Q: What are the advantages of the new 'DeepBot 2'?
A: Updates no longer bring in data that is always one month old! You can expect new sites added one week before the update to enter the new index. You can also see backlinks added last week appear in the index without waiting 1-2 months (the traditional pattern). Thus, Google can serve better SERPs to its visitors.

Q: What are the disadvantages of the new 'DeepBot 2'?
A: If you can expect backlinks added 2 weeks ago to show up, you can also expect backlinks added 2 months ago not to be counted yet (especially those hiding deep within a site). Some pages might not be updated at all; the old version from the last update will be used instead. This disadvantage applies more to webmasters. Google's visitors have nothing to worry about.

Q: 'DeepBot 2' and freshness
A: Can you guess which pages will be fresher and which pages might not be updated? It follows the freshbot characteristics. The main page and subpages one level down will usually be fresher than pages hidden a few levels deep. I also believe PR determines which pages need to be fresher and how deep the crawl should go; it is always wiser to keep a PR8 main page fresher than a PR4 links page. PR determines the importance of a page - see the small sketch after this answer.
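
Purely as an illustration of that guess, here is a tiny Python sketch that scores re-crawl priority from PR and click depth. The recrawl_priority function and its weights are my own invention for the example, not Google's formula.

    def recrawl_priority(pagerank, click_depth):
        # Higher score = page should be re-crawled more often.
        # A PR8 main page outranks a PR4 links page buried a few levels down.
        return pagerank - 2 * click_depth

    pages = [
        ("example.com/",           8, 0),  # PR8 main page
        ("example.com/products/",  6, 1),  # one level down
        ("example.com/links.html", 4, 3),  # links page hidden deep
    ]

    for url, pr, depth in sorted(pages, key=lambda p: recrawl_priority(p[1], p[2]), reverse=True):
        print(f"{url:28s} priority={recrawl_priority(pr, depth)}")

Sorted this way, the main page comes first and the deep links page last, matching the freshness pattern I have been seeing.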

Q: Should we like the new 'DeepBot 2'?
A: It is a game run by Google, and you need to play it according to their rules. Instead of 'liking' or 'hating' it, it is always better to adapt yourself to the new Google and keep yourself up to date. That is what I do with WebmasterWorld.

P.S. The above are my own observations of my sites, my competitors' sites, my non-competitors' sites and ...