Forum Moderators: open


Question about links and timing

when is the best time of the month


mfishy

12:51 am on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If a site links to your page during the crawl (after the update) and later that month removes the links, will they still count for the next update?

In other words, after the crawl when Google caches pages for the next update, are those the links we will see regardless of changes throughout the month?

fathom

12:55 am on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's iffy to say one way or the other. The best time to post is immediately.

The best chance of ensuring inclusion in the next update is posting during the update, since googlebot and minty cease to crawl during this time period.

"In other words, after the crawl when Google caches pages for the next update, are those the links we will see regardless of changes throughout the month?"

Yes, except when minty does a refresh crawl through your site, but that is only temporary.

2_much

2:48 am on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been a bit confused about this issue.

Apart from the freshbot, I am seeing a 2-month lag for results to show. So something that is crawled today won't show until March.

I guess they use the freshbot as an offset for this. Is this consistent with other people's observations?

I am referring to links showing up in the "links to" results - they seem to take 2 months to appear.

fathom

3:13 am on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hmmm... 2_much - immediately following each update I change at least one title per web site. Most often they appear in the following update.

I have noticed on occasion that Googlebot skips (or possibly misses) a change, but not too often.

I suspect the difference in observation might be the frequency at which Googlebot returns.

mfishy

12:30 pm on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just to clarify, as I'm a bit slow: theoretically, one could have hundreds of links pointing to their page during the deep crawl after the update, and none for the next thirty days, and they would all show the following month?

fathom

1:45 pm on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



possibly.

ciml

1:54 pm on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I find that (Deep) Google visits most of our pages about twice per month. I haven't worked out whether it's the first or second visit that counts.

Yes 2_much, I'd say that the Fresh listings offset the delay rather well.

Brian

3:13 pm on Jan 8, 2003 (gmt 0)

10+ Year Member



It seems to me that pages which appear and then disappear from the index will reappear in the fullness of time. Sadly, that time can be almost two months if you put them up during the crawl. This happens to me frequently. Once you pass through a full dance, everything stabilizes. It's a bit of a tease, though, when something you really like zaps into the public consciousness, you get a bunch of hits, and then it goes into hiding for weeks.

Jabzebedwa

4:40 pm on Jan 8, 2003 (gmt 0)

10+ Year Member



I was going to do a new post, but this one touches directly on my question.

Over the last month (December) I got a bunch of new incoming links. They do not show up in the backlinks on the Google Toolbar now, after the most recent Google update, and my position actually slid down a couple of places. Is it actually the case that links we gain this month don't register until 2 months later?
Jabzebedwa

ciml

4:57 pm on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Welcome to WebmasterWorld, Jabzebedwa.

You should expect to wait one or two updates [webmasterworld.com] for backlinks to appear. New pages in Google Fresh listings seem to get a ranking that's heavily influenced by their backlinks, though.

mikeputnam

5:32 pm on Jan 8, 2003 (gmt 0)

10+ Year Member



Here is my log, sorted by requested page. Notice how each page is requested twice. I am wondering if this is done to determine which sites are added to the freshbot list. E.g. Google thinks: "If a percentage of the requested pages are updated in the time it takes for the second bot to come, I'll add the pages/site to the freshbot list, because this site seems to keep minty-fresh content."

What do you think?

crawl4.googlebot.com [07/Jan/2003:08:41:47 -0500] GET / HTTP/1.0 "-" 200 6316
crawl1.googlebot.com [07/Jan/2003:17:36:45 -0500] GET / HTTP/1.0 "-" 200 6316
crawl5.googlebot.com [08/Jan/2003:01:41:31 -0500] GET /amy/index.php HTTP/1.0 "-" 200 5214
crawl8.googlebot.com [08/Jan/2003:06:28:40 -0500] GET /amy/index.php HTTP/1.0 "-" 200 5214
crawl9.googlebot.com [08/Jan/2003:04:43:57 -0500] GET /family/index.php HTTP/1.0 "-" 200 4942
crawl9.googlebot.com [08/Jan/2003:11:10:07 -0500] GET /family/index.php HTTP/1.0 "-" 200 4942
crawl1.googlebot.com [08/Jan/2003:03:54:52 -0500] GET /geocaching/index.php HTTP/1.0 "-" 200 5475
crawl7.googlebot.com [08/Jan/2003:09:19:11 -0500] GET /geocaching/index.php HTTP/1.0 "-" 200 5475
crawl4.googlebot.com [08/Jan/2003:01:25:03 -0500] GET /house/index.php HTTP/1.0 "-" 200 4521
crawl2.googlebot.com [08/Jan/2003:09:19:10 -0500] GET /house/index.php HTTP/1.0 "-" 200 4521
crawl5.googlebot.com [08/Jan/2003:09:43:24 -0500] GET /house/pictures.php HTTP/1.0 "-" 200 5873
crawl7.googlebot.com [08/Jan/2003:05:03:34 -0500] GET /ian/index.php HTTP/1.0 "-" 200 6107
crawl4.googlebot.com [08/Jan/2003:11:07:06 -0500] GET /ian/index.php HTTP/1.0 "-" 200 6107
crawl3.googlebot.com [07/Jan/2003:22:04:27 -0500] GET /lamaze/index.php HTTP/1.0 "-" 200 5130
crawl7.googlebot.com [08/Jan/2003:11:06:02 -0500] GET /lamaze/index.php HTTP/1.0 "-" 200 5130
crawl4.googlebot.com [08/Jan/2003:08:15:02 -0500] GET /mike/aboutsite.php HTTP/1.0 "-" 200 8265
crawl4.googlebot.com [08/Jan/2003:11:09:38 -0500] GET /mike/aboutsite.php HTTP/1.0 "-" 200 8265
crawl1.googlebot.com [08/Jan/2003:04:39:42 -0500] GET /mike/index.php HTTP/1.0 "-" 200 6926
crawl4.googlebot.com [08/Jan/2003:06:50:56 -0500] GET /mike/index.php HTTP/1.0 "-" 200 6926
crawl7.googlebot.com [08/Jan/2003:04:39:56 -0500] GET /putnamfest/index.php HTTP/1.0 "-" 200 5240
crawl4.googlebot.com [08/Jan/2003:07:22:49 -0500] GET /putnamfest/index.php HTTP/1.0 "-" 200 5240
crawl7.googlebot.com [08/Jan/2003:01:51:51 -0500] GET /rendezvous/index.php HTTP/1.0 "-" 200 4528
crawl1.googlebot.com [08/Jan/2003:07:57:19 -0500] GET /rendezvous/index.php HTTP/1.0 "-" 200 4528
crawl4.googlebot.com [07/Jan/2003:08:41:42 -0500] GET /robots.txt HTTP/1.0 "-" 200 30
crawl1.googlebot.com [07/Jan/2003:17:36:42 -0500] GET /robots.txt HTTP/1.0 "-" 200 30
crawl7.googlebot.com [08/Jan/2003:11:06:01 -0500] GET /robots.txt HTTP/1.0 "-" 200 30
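For anyone who wants to reproduce this kind of grouped listing from their own access log, here is a minimal sketch in Python. The line format and regex are assumptions based on the excerpt above (hostname, bracketed timestamp, then the request line); the sample lines embedded in the script are just a few of the entries from this log.

```python
import re
from collections import Counter

# Sample access-log lines (same shape as the log excerpt above).
LOG = """\
crawl4.googlebot.com [07/Jan/2003:08:41:47 -0500] GET / HTTP/1.0 "-" 200 6316
crawl1.googlebot.com [07/Jan/2003:17:36:45 -0500] GET / HTTP/1.0 "-" 200 6316
crawl5.googlebot.com [08/Jan/2003:01:41:31 -0500] GET /amy/index.php HTTP/1.0 "-" 200 5214
crawl8.googlebot.com [08/Jan/2003:06:28:40 -0500] GET /amy/index.php HTTP/1.0 "-" 200 5214
crawl5.googlebot.com [08/Jan/2003:09:43:24 -0500] GET /house/pictures.php HTTP/1.0 "-" 200 5873
"""

# Pull the requested path out of each "GET <path> HTTP/1.0" request line.
paths = re.findall(r'GET (\S+) HTTP', LOG)

# Count fetches per path; a page fetched twice in the window would
# support the "second verification pass" theory discussed above.
counts = Counter(paths)
for path, n in sorted(counts.items()):
    print(f"{n}x {path}")
```

Running this over a full log file (read its contents into `LOG`) would show at a glance which pages Googlebot requested once versus twice during the crawl window.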

Jabzebedwa

5:33 pm on Jan 8, 2003 (gmt 0)

10+ Year Member



Thanks Ciml,
Nice to see the hospitality here in this forum!
I am brand spanking new here and would like to know about the abbreviations used such as "serp" and others. There is quite a bit of insider terminology here. How can one learn the lingo?
Thanks again

VictorE

5:38 pm on Jan 8, 2003 (gmt 0)

10+ Year Member



Jabzebedwa,

Welcome to WebmasterWorld. Brett has a good glossary at the link below:

[searchengineworld.com]

snowfox121

5:42 pm on Jan 8, 2003 (gmt 0)

10+ Year Member



Are links currently being "fresh dated"? I have several pages in the SERPs that usually show a fresh date, but for the last few days they do not show any date.

If fresh dating is still being used (others are seeing it), it means that Google has changed its method of fresh dating - perhaps now determining how much of the page has changed, or how recently.

My changes tend to be "frequent but irregular" and often involve changes affecting only 10% of the info on a page.

VictorE

5:48 pm on Jan 8, 2003 (gmt 0)

10+ Year Member



snowfox121,

I haven't seen any fresh tags in 24 hours. However, I see that Freshbot was on my site this morning. I assume Freshtags will be back w/in the next 24 hours or so.

Vic

jomaxx

6:52 pm on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



PR is only recalculated during the main update, so links found at that time will benefit your site throughout the month.

The problem is that it's hard to know when Googlebot has crawled the other site (unless you're manipulating links between your own sites). Plus, like ciml, I see a couple of deep crawls per month, so it's hard to know when exactly your pages will be picked up for the next dance.

Finally, it's possible that Freshbot crawls throughout the month may feed into the main database - so that if the crawl is done on the 12th and Freshbot shows up again on the 20th, the fresher version of the page could be used.

That's just a conjecture, but it's what I would do if I were Google - the crawl is usually done weeks in advance of the dance, and I'd be surprised if it takes THAT much time to build the next database. I think they build some extra time into the schedule, for example so that they can revisit sites that were down on the first visit.

mfishy

7:25 pm on Jan 8, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I miss the freshbot. I know it has been around a bit so far this month, but I am eagerly awaiting the near daily fresh listings of next month.