Forum Moderators: open
I have always been a big Google fan and my admiration remains intact ... but I am concerned that the recent displays of inconsistent update scheduling may have detrimental effects on the perception of Google's stability for webmasters worldwide. I realize that it has only been a few months of erratic behaviour, but radical changes send up a red flag for me and signal that perhaps, like DMOZ, they may be experiencing technical problems which could drag on indefinitely. This isn't an unreasonable perception, as we have all seen it happen to major search engines such as AltaVista and Excite (amongst others) in the past couple of years.
We've also all noted that the Google team have been busy "tweaking" the algorithm substantially from month to month of late, producing glaring changes in the PR of various sites. They go up and down, links are lost and gained. But through it all, search results don't seem to change too drastically ... and queries don't seem to be affected to a large extent. This has provided a certain sense of security for us all. But is this a false security?
I have this nagging feeling in the pit of my stomach that something is wrong at Google. I can't explain it ... it's just there.
The only other time this happened to me was when I "felt" there might be a problem with Inktomi. Certain things began to happen (at canada.com) and although I was assured by many in the know that there was nothing at all wrong ... it turned out in the end that I was absolutely right. It was, in fact, the beginning of a very major blip which took Inktomi nearly two years to recover from. In my opinion, they have never really recovered from that "blip", and it left a gaping hole in internet search which allowed Google to ride into the fray and ink the deal with Yahoo that, nobody can argue, kick-started their domination as "the" internet search provider.
I've done little to my site since the last update and none of what I did was to try to advance my ranking in the SERPs ... so I can honestly say that I have no reason to have the jitters. My feelings are inexplicable, I admit, and I apologize to those of you whom I may have managed to upset more than you already might be. I just have that same nagging feeling I had when what I refer to as the great Inktomi crash of 2001 took place. It's very worrisome.
My question is ... do any of the senior members and mods of WebmasterWorld have any of the same concerns? I have good reason to trust my gut instincts ... but (obviously) none of you do. I'd like to know if any of you have raised an eyebrow more than once in the past few months in regards to Google's update behaviour? If so, what are your concerns and why?
My apologies to GoogleGuy and those who feel this post may provoke a panic situation. My intentions are pure and I simply want to know if the senior members and mods feel that there may be something up at Google? If I were a member of the "subscribers forum", I would save this question for that arena. Alas, not yet ... but soon I hope! :)
Many webmasters are in the power-user position of being able to influence many people's decisions about how they are going to search. I don't think they are the most influential group, but they are up there.
If you want a consistent update cycle, what would you be willing to give up to get it?
Should Google skip QA testing and push out whatever they have on the due date?
Should they stop trying to increase the number of pages crawled? Stop trying to crawl dynamic pages better?
Should they stop trying to deal with spam?
I would personally rather wait than have Google update before they are ready. Even PhDs and super hackers can produce buggy code sometimes. I would rather that they catch as many bugs as possible before they ship.
I sure do wish my crystal ball could show me what the internet will be like in five years.
It will be much more advanced, that is for sure. Today's engines will seem kinda like riding your bike to work, hehe... Imagine the SEs in five more years!
I predict lots of changes, that's for sure. Little Google update delays are small details.
Don't mind some overly supportive people around here; I also have a strange ticklish feeling there's more than one Google guy aboard ;)
15 different robots                        Hits    Bandwidth   Last visit
Big Brother                                2225    0           08 Apr 2003 - 18:00
MSIECrawler                                320     337.19 KB   08 Apr 2003 - 16:11
Road Runner: The ImageScape Robot          222     882.80 KB   07 Apr 2003 - 04:18
Googlebot (Google)                         55      1.27 MB     08 Apr 2003 - 18:03
LinkWalker                                 38      842.34 KB   06 Apr 2003 - 12:30
Alexa (IA Archiver)                        36      1.07 MB     07 Apr 2003 - 15:23
Inktomi Slurp                              26      528.13 KB   08 Apr 2003 - 10:52
Unknown robot (identified by 'crawl')      16      413.29 KB   08 Apr 2003 - 12:14
InfoSeek Robot 1.0                         5       128.98 KB   03 Apr 2003 - 07:12
ZealBot                                    4       0           08 Apr 2003 - 09:52
Others                                     7       154.99 KB
Still, I do wonder when Google's next big change will be. It's that feeling of impending doom that reminds us to keep on Googlebot's good side.
Thanks Birdman, Marcia, Googleguy, Napolean, CIML and those of you who provided sensible answers without assuming that I am "panicking, freaking out or that I have a bizarre take on things". It was simply a question. I wanted to know if anyone else was feeling the same way. Now I know! :)
The gist is that the average time between updates is 32 days, with a standard deviation of 6 days.
All the update times are also given there.
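Those interval statistics are simple to reproduce. Here's a minimal Python sketch; the update dates below are hypothetical placeholders (the actual dates aren't listed in this thread), so the numbers it prints won't match the 32-day figure exactly.

```python
# Sketch: estimating the update cadence from a list of observed update dates.
# NOTE: these dates are made-up placeholders, not Google's real update log.
from datetime import date
from statistics import mean, pstdev

update_dates = [
    date(2002, 11, 5), date(2002, 12, 8), date(2003, 1, 10),
    date(2003, 2, 12), date(2003, 3, 15),
]

# Days elapsed between each pair of consecutive updates
intervals = [(b - a).days for a, b in zip(update_dates, update_dates[1:])]

print("intervals:", intervals)
print("average gap:", mean(intervals), "days")
print("std deviation:", round(pstdev(intervals), 2), "days")
```

With real update dates substituted in, the same two calls give the average gap and its spread directly.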
Some also call it a negative spiral (I read that term for the first time in Bill Gates' first book), but here we'll just call it LABDO, what do you say?
(Lack of Anything Better to DO)
I think that explains why what we call the 'delay' is occurring. Yeah, if I had a search engine with billions of listings, you can bet I would want the results to be quality results. Adding billions of new listings probably makes it take longer to test. I think we can probably expect future updates to be every 35-40 days.
Large amounts of non-strict data (i.e. data that isn't clearly defined the way numbers are, such as a textual database) are inherently messy and maintenance-hungry. And maintenance cost grows almost exponentially with the size of the database.
I can understand the horror of the guy in charge of Google's datasets. In fact, what we call great and squeaky clean he probably considers with horror, begging GG to give him a few decades to clean it up before resuming normal timeflow.
Good job guys.
SN
Update would have happened by now... except GG has to keep coming to this thread to answer the posts. The little "red button" is located in the room opposite his computer, and it is the "is Google updated yet" threads that are slowing down the whole process. Since the Google complex is streamlined so much, the doorways are "one way" only, meaning GG has to circle the entire building before getting back to the area. This forced exercise has resulted in numerous breaks taken by GG, in an attempt to recuperate. As a result, at this time he does not have the strength to depress the button. <random theory>
Course we all know that it is a deceptive plan by GG to have something to discuss at the Webmasters conference :) <conspiracy theory>
Jim
(I would not mind at all if they pushed the red "get Jim paid on his pay-for-performance job now" button, though.) :)
Of course we do, GG. But you might not like it and you probably won't do what we suggest. Two pages (of links) back, you put anchor text in a link with corresponding words in the surrounding text and in the page title. Link to an intermediary page, and from that page link to your target page from an image with an "alt" attribute. Bingo! Only catch is, you'll have to change your page title.
This will get you a #2 out of 826K for a blank page, but may not work for a page that actually has anything on it. I haven't tested for that yet.
Notice this comment ... GoogleGuy said
"The fresh crawl and its millions of pages per day help to find new results between crawls too.."
I think all the talk of this Grub Client may have some substance to it; something tells me there is more to GG's short comments than meets the eye. The best thing anyone in the search engine world could ever do is come up with a Grub Client that works on Google.com's terms.
If this new client updated the web daily, and kept doing so using Google's algo, we would have something, am I right?
So why not stop monthly updates and start doing major freshbots all the time?
Am I missing something? I may be a few beers too deep, so do not mind me. Hmmm, am I close, guys/gals of WebmasterWorld? Show me some love!