

Is there still time?

Getting new sites into Google before update


allanp73

12:02 am on Feb 21, 2003 (gmt 0)

10+ Year Member



Hi,

I built 2 new sites two days ago and was wondering if I have a chance to get them into Google before the next update. I added links to them from sites which are already in Google. Actually, today I noticed that the freshbot spidered them, but no deep crawl.
Will Google send a deep crawler if it finds new sites with a fresh bot?
How can I get these sites deep crawled in time for the next update?

jady

12:03 am on Feb 21, 2003 (gmt 0)

10+ Year Member



Doubt it, seems to have stopped deep crawl about 6 days ago here. Only freshbot hits on our high PR sites - nothing much in the logs of new sites...

Amras

12:14 am on Feb 21, 2003 (gmt 0)

10+ Year Member



I was under the impression everybody was still waiting for that February deep crawl.

allanp73

12:23 am on Feb 21, 2003 (gmt 0)

10+ Year Member



I was worried about this. I had a deep crawl for my other sites at the beginning of the month, and I was hoping that I could get these new sites deep crawled now.
Do the fresh bot and the deep crawler work in conjunction? I wonder: when the fresh bot sees new sites, does that encourage Google to send a deep crawler to index them, or does it wait until the next deep crawl period?

Stefan

12:45 am on Feb 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I last saw the deepbot on the 16th, although freshie has been dropping by every day.

"I wonder: when the fresh bot sees new sites, does that encourage Google to send a deep crawler to index them, or does it wait until the next deep crawl period?"

I believe it waits until the next scheduled deepcrawl. The new sites might not be in the index until the late March update (i.e. after the next deepcrawl).

allanp73

12:54 am on Feb 21, 2003 (gmt 0)

10+ Year Member



Maybe I could bribe the deepcrawler. Offer it jelly beans or something ;)

Stefan

1:06 am on Feb 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Patience. The freshbot found them, and it has been very active these days. You might spend more time in the serps than out until the next update. If anything, bribe the freshbot... maybe a single malt Scotch would work.

webdevsf

1:13 am on Feb 21, 2003 (gmt 0)

10+ Year Member



Hmm, I've been getting deepcrawled by gbot for the past 2 days...

Jesse_Smith

1:56 am on Feb 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



:::Hmm, I've been getting deepcrawled by gbot for the past 2 days

Which IP address are you getting?

freshbot: 64.68.82.* Bah, listed for only a few days, a short dinner date.
deepcrawler: 216.239.46.* Good bot, it likes you, listed until death do you part. So don't make it mad or it will divorce your site. You can divorce the Googlebot by using your robots.txt file. It's much cheaper and faster than going to the courts.
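As an aside, the robots.txt "divorce" mentioned above would look something like this (a minimal sketch; the Disallow rule blocks Googlebot from the entire site, which is exactly the drastic scenario being joked about):

```
User-agent: Googlebot
Disallow: /
```

Removing the rule (or an empty Disallow line) lets the bot back in on its next visit.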

Marcia

2:17 am on Feb 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



allan, figure on an early March crawl and properly turning up at the update at the end of March. Count anything sooner as a bonus.

allanp73

2:26 am on Feb 21, 2003 (gmt 0)

10+ Year Member



I had a visit from Googlebot/2.1 (2 hits). I imagine this is the fresh bot.
My log reports don't show the IP addresses (I use Summary). What should I look for other than a large number of hits?
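If you can get at the raw access log rather than a summary report, the IP ranges Jesse_Smith listed are the thing to look for. A quick sketch (assuming a combined-format log where the client IP is the first field on each line, and using the 64.68.* freshbot and 216.239.46.* deepbot ranges from this thread):

```python
# Count hits per Googlebot IP range in a raw access log.
# Assumes combined log format: the client IP is the first
# whitespace-separated field on each line.
from collections import Counter

def count_googlebot_hits(log_lines):
    counts = Counter()
    for line in log_lines:
        ip = line.split(" ", 1)[0]
        if ip.startswith("64.68."):        # freshbot range (per this thread)
            counts["freshbot"] += 1
        elif ip.startswith("216.239.46."):  # deepbot range (per this thread)
            counts["deepbot"] += 1
    return counts

sample = [
    '64.68.82.10 - - [21/Feb/2003:01:13:00 +0000] "GET / HTTP/1.0" 200 512',
    '216.239.46.5 - - [21/Feb/2003:01:14:00 +0000] "GET /page.html HTTP/1.0" 200 1024',
    '64.68.82.11 - - [21/Feb/2003:01:15:00 +0000] "GET /about.html HTTP/1.0" 200 256',
]
print(count_googlebot_hits(sample))
# prints Counter({'freshbot': 2, 'deepbot': 1})
```

A freshbot visit typically shows as a handful of hits from 64.68.*, while a deepcrawl shows as a large burst from 216.239.46.* covering most of the site.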

skibum

2:55 am on Feb 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, you'll probably see one or more pages go in and out for a few days, and maybe even rank well now and then via freshbot. It'll be the next update before it really starts to stick in the index.

webdevsf

3:44 am on Feb 21, 2003 (gmt 0)

10+ Year Member



Weird. I've been getting crawled by "freshbot" (64.68.x.x), but it's crawled over 15,000 pages today.

If that's "fresh", I wonder what "deep" is going to look like!

duckhunter

3:54 am on Feb 21, 2003 (gmt 0)

10+ Year Member



Freshbot has really dug into my site too. Today I too got a major crawl by the 64.68 addresses. It travelled the majority of my pages.