youfoundjake - 3:17 am on Sep 12, 2006 (gmt 0)
Welcome to webmasterworld.
From what I understand from reading here at WebmasterWorld, it only takes one inbound link for the bots to find your site and start indexing it.
Having a sitemap definitely helps, but the most important factor is that your site, built primarily for human visitors, can be spidered by the bots as well, with an easily navigable layout pointing deeper and deeper into your content.
Having AdSense and AdWords does not help get your site indexed any faster, and if you look over in the Google news forum, you'll see a lot of posts about sites getting indexed, then dropping 95% of their pages, only to come back another month later.
So the takeaway from all this is to have at least one inbound link, and to make the navigation logical and easy to travel for bots and humans alike.
If you google "spider simulator", you'll find a list of sites that will look at your site the way a bot does, so you can see exactly what they see.
You can also try an all-text browser like Lynx; that's the closest you'll get to finding out how well the bots can crawl your site. If you can navigate it in Lynx, so can they.
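To make the idea concrete, here's a rough sketch (not any particular simulator's actual code) of what those spider-simulator tools do: strip a page down to just the text a bot would index and the links it would follow, discarding scripts, styles, and markup. The sample HTML is made up for illustration.

```python
# Minimal sketch of a "spider's-eye view" of a page: keep only
# visible text and hrefs, skip <script>/<style> content entirely.
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text = []    # text nodes a bot could index
        self.links = []   # hrefs a bot could follow
        self._skip = 0    # nesting depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

# Hypothetical page: a heading, one link, and a script the bot ignores.
html = """<html><body>
<script>var x = 1;</script>
<h1>Widgets</h1>
<a href="/catalog">Browse the catalog</a>
</body></html>"""

view = SpiderView()
view.feed(html)
print(view.text)   # → ['Widgets', 'Browse the catalog']
print(view.links)  # → ['/catalog']
```

If a page's real content only shows up via JavaScript or images, it simply won't appear in `view.text`, which is the same blind spot a text-only crawler (or Lynx) has.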
Again, welcome to WebmasterWorld.