Welcome to WebmasterWorld
I'm being hit very hard by Google's freshbot at the moment, and it's going deep too. At first glance at what is currently going on with the little guys, I had to check and double check that the IPs were 64.... (they are).
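For anyone else wanting to double-check which IPs are behind a crawl like this, here's a rough Python sketch that pulls crawler IPs out of a combined-format access log. The log lines and the "Googlebot" user-agent substring are just assumptions for illustration; adjust for whatever your server actually writes:

```python
import re
from collections import Counter

# Combined Log Format: client IP first, user-agent in the last quoted field.
LINE_RE = re.compile(r'^(\S+) .* "([^"]*)"$')

def crawler_ips(log_lines, ua_substring="Googlebot"):
    """Count hits per IP for lines whose user-agent mentions the crawler."""
    hits = Counter()
    for line in log_lines:
        m = LINE_RE.match(line)
        if m and ua_substring in m.group(2):
            hits[m.group(1)] += 1
    return hits

# Hypothetical sample lines, not real log data:
sample = [
    '64.68.82.10 - - [01/May/2003:10:00:00 +0000] "GET / HTTP/1.0" 200 512 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [01/May/2003:10:00:01 +0000] "GET /a HTTP/1.0" 200 512 "-" "Mozilla/4.0"',
]
print(crawler_ips(sample))
```

From there you can eyeball whether the heavy hitters really sit in Google's ranges, since the user-agent string alone is trivial to fake.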
Its behaviour, in terms of hard hitting and depth of crawl (it's going through the entire site), is more like the character of the old deepbot.
In fact, it's identical behaviour to deepbot the last time it crawled this site back in April.
I'm interested in hearing from others who are seeing the same.
I think that the quote above is highly revealing. Note the reference to being "in the transition period for this system." This would explain a lot about why Google is so broken at the moment: they are moving between systems, and not too gracefully.
As to trillianjedi, that depends on what the new baby looks like. ;)
"I wouldn't be surprised to see a traditional update for a little while longer. "
Could someone please translate this statement?
I think you need to put it back in its original context:-
...we're still in the transition period for this system, so I wouldn't be surprised to see a traditional update for a little while longer.
Then fix the poor English (your school teacher used to pull his hair out in English class, didn't he GG?!):-
"...we're still in the transition period for this system, so don't be surprised if you do not see a traditional update for a little while longer."
Which I believe means the new PR and backlink calculations will not be ready for between 4 and 7 weeks.
That may or may not mean that a new deepcrawl will happen in between, depending on whether they have advanced their freshbot technology or not.
...they are moving between systems, and not too gracefully.
Indeed. Seeing this sloppiness in action really surprises me. Google has bandwidth up the wazoo, 10,000+ dual cpu boxes, and plenty of smart people.
What surprises me is that they couldn't (or didn't bother to) figure out a way to do this "transitional update" transparently. It would seem they have the resources to have it working in the background while the usual update took place, even if that meant each process took extra time. Or at the very least, they could have had some decent freshbot activity/updates going on, as they appear to be doing now. As it stands, we have neither the usual update nor this incarnation of an update.
Maybe Joe User doesn't notice the difference. I find myself too far removed from that position to make a real judgement, but I have seen my Ink/MSN referrals increase recently. The only connection I can make is that Ink knows what's on my site as of this morning and their index is not far behind. Google did a fairly deep but very incomplete freshbot crawl on Wednesday/Thursday, but is still using an index from mid-April.
Sure, reneewood. I would expect at least another update of the form where the crawl/index cycle finishes and then data centers are updated in the traditional dance.
Good, GG. I had a few ongoing experiments that were dependent upon the normal cycle.
I sure hope there can be smoother "transitions" in the future though. This current update is nowhere near over, but in my normal day-to-day searching Google has been less and less helpful. I sure hope that will be remedied shortly though.
And, where does this fit in terms of missing backlinks, anchor text, etc. being added back in? After these are added in? And, will this result in significant change in rankings when these missing backlinks, anchor text, etc. are brought in?
And, for those who haven't noticed, unless freshbot is also doing deepbot duties now, the deep crawl hasn't even begun yet. Which would mean that, based on the historical crawl/index cycle pattern, we should expect at least a month before this happens, and perhaps longer.
This update sure looks over to me. Looking at SERPs where there has been little change from freshbot, they have been solid for days. The only oddity is a number of strange toolbar PR displays. However, this is just a frill. What Google cares about is what shows up on the SERPs.
"Which would mean that, based on the historical crawl/index cycle pattern, we should expect at least a month before this happens, and perhaps longer."
That's exactly what GoogleGuy is saying: "weeks not months", in other words something longer than a usual cycle.
Assume 7 weeks.
"Assume 7 weeks."
Assume nothing would be my advice. Sort of like "don't promise anything".
It looks like Google is implementing a lot of new stuff in the coming months. We have no idea where the continuous update is in that list of priorities, and even if we did, priorities change and $#!+ happens.
And, will this or will it not affect rankings whenever it happens? If all this means is different results using the link: and allinanchor: commands, it's pretty meaningless.
>I think someone else remarked that freshbot is looking more like deepbot these days.
Hmm...VERY interesting that you picked up on this. I'll take the above as a significant comment. ;)
I'd suggest people take very close note of the above by GG. If things go as he expects, this suggests a whole new way of Google updating. Up until now, other than freshbot there have been no major changes except at dance time. We may be moving into an era where between dances, there will be mini updates that can significantly change rankings as new data is brought in at that time.
"rfgdxm1, more anchor text, backlinks, spam filters, etc. should be brought in by the next update."
This is inconsistent with "After the NEXT UPDATE". I presumed from what GG wrote that he meant this would be added in at some unspecified time before the update. If not, feel free to correct me GG.
Hold on, everyone. GG is here as a courtesy and on his own time. He offers information as he is able and cannot be expected to spell everything out in detail. His comments over this past update have been of phenomenal value! Yes, he has the patience of a saint -or- a grandparent watching over the youngins. :)
If so (or even if an official crawl is still yet to come shortly), I'd assume that April's crawl will be scrapped. Why bother with that data when new data is being collected?