
Google News Archive Forum

    
What is it about Dominic to make it a 2 month update? (not a rant)
Floating and soliciting a couple of theories
Clark
msg:155022
11:27 am on Jun 6, 2003 (gmt 0)

First, a note to the mods. I (obviously) think this is a topic meriting its own thread. However, I know the Dominic craziness has you folks on alert to keep Forum 3 sane. This is not a "down with Google", "Google is stupid", or "Dominic craziness" thread. I'm trying to get at some specific reasons for Dominic being a "slow update". If you don't agree that this merits a thread of its own and choose to take action, I'd appreciate it if you could merge this thread with another rather than kill it. Thanks.

While GG was nice enough to give us a heads-up on what to expect with Dominic, in terms of time frame and what would happen when, I don't think we ever really got an idea of why this update is taking 2 months. And we can't expect GG to just tell us G's secrets. But that doesn't mean we can't speculate.

Q1. PR Calculation?
A1. I don't think PR takes this long to calculate, even if they have a completely different way of doing it now. So that's out.

Q2. New algos?
A2. Well, why wouldn't they just test them on a test box and keep going like normal in the production environment?

Q3. New Spam filters?
A3. Same answer as 2.

Q4. New system of rolling updates?
A4. This was my assumption until today. But then I realized: the freshbot is acting a little more like a deepbot, true, but it hasn't really gone after new content, and again, why couldn't they use a test environment for this?

Q5. Transitioning to new rolling updates, new PR calculation, new algos and new spam filters?
A5. Perhaps, but again, why not use a test environment?

Do you see a pattern in these answers? Exactly. It all points to one question: why isn't this whole update in a test environment?

I don't think G screwed up; this is intentional. If it weren't, GG wouldn't have warned us. My theory, though it's purely speculation, is that they did run this through a test environment, and they felt it worked well enough to go live. But they knew the testing and tweaking could not be done completely and properly with the test environment alone. They needed the resources of all those PCs, with full data, to get it right; they simply couldn't go any further on their test resources. So we are essentially witnessing a live alpha/beta test for that reason.
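For context on A1: the published PageRank method is just a power iteration over the link graph, and on a fixed graph it converges in a few dozen passes, which is why a two-month stall is hard to pin on the PR calculation itself. Here is a toy sketch in Python; everything in it (the function, the damping value, the tiny `web` graph) is invented for illustration and says nothing about how Google actually computes or schedules PR at its scale:

```python
# Toy PageRank via power iteration. Purely illustrative: the graph,
# damping value, and iteration count are invented for this sketch.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Three-page toy web: a -> b, c; b -> c; c -> a.
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
```

Even on billions of pages this kind of iteration was reportedly a matter of days, not months, so whatever is taking two months is presumably something else.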

 

SEO practioner
msg:155023
11:56 am on Jun 6, 2003 (gmt 0)

Clark, it does make a bit of sense, in light of what we have been seeing since the start of Dominic.

I guess when it's all over the truth will finally come out of hiding, and we will all have a clearer picture of the new environment.

mfishy
msg:155024
12:04 pm on Jun 6, 2003 (gmt 0)

Most of it makes little sense.

I figure they have had more difficulties than anticipated. We have only had 2 "real" updates this year and it's June. I don't think we will have a clear view of what has occurred until the next update, which GG claims will be of the traditional sort. That may not happen for some time, so patience will probably wear really thin around here. :)

As for your question about why they decided to put out an old index with a new algo, it may have something to do with agreements with partners like Yahoo, who might expect a fresh index once in a while.

jrobbio
msg:155025
12:14 pm on Jun 6, 2003 (gmt 0)

Did you read Chris D's summary here: [webmasterworld.com ], especially in light of questions 2 and 3?

mil2k
msg:155026
12:19 pm on Jun 6, 2003 (gmt 0)

I compare this update to a water filtration process to understand it better. This realization came to me when GG replied to one of my questions at the very beginning of the first Dominic thread.

In water filtration, the raw material is muddy water {mud can be compared to our good old spam}. The most basic step is the filtration process: the water is passed through a cloth-like filter to remove most of the mud. Then the other processes, like percolation, UV treatment, etc., are applied. But then you realize the muddy water can be filtered better if you add an extra filter at the filtration stage. Remember, filtration is the first level.

To implement that, you add an extra filter coating, start again with the raw material, and then proceed to the other levels. That's what they are doing now: applying filters at the first level, and then, after weeding out the unwanted stuff, building the index by bringing in other factors like backlinks.

What was happening before was that they were applying filters to the already-processed data for QC. I think the filter they are applying now is in the initial process of building the index, and that's why they are rebuilding the index from scratch. They will observe the effect of this filter on search results. There will be many poor-quality results in the initial stages because not all backlinks are accounted for yet. Remember, GG asked for spam feedback after the data centres stabilized. That feedback would go directly to QC, and they would devise new filters to combat those sites.

This is what I have assumed from reading all the threads just before Dominic started. Hope I am right :)
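The filtration analogy above amounts to a pipeline choice: weed out spam while the index is being built ("at the first level") rather than filtering the processed results afterwards. A minimal sketch of that idea, where the keyword-stuffing rule, the page dicts, and the URLs are all made up for illustration and are in no way Google's actual filter:

```python
# Toy sketch of "filtering at the first level": spam is dropped
# during index construction, not after. All rules and data here
# are invented for illustration.
def looks_like_spam(page):
    """Hypothetical ingestion-time filter: crude keyword-stuffing check."""
    words = page["text"].lower().split()
    if not words:
        return True
    top_freq = max(words.count(w) for w in set(words)) / len(words)
    return top_freq > 0.5  # a single word dominates the page

def build_index(crawled_pages):
    """Apply the filter before indexing, per the filtration analogy."""
    index = {}
    for page in crawled_pages:
        if looks_like_spam(page):
            continue  # weeded out at the "filtration" stage
        for word in set(page["text"].lower().split()):
            index.setdefault(word, []).append(page["url"])
    return index

pages = [
    {"url": "good.example", "text": "fresh news about the update"},
    {"url": "spam.example", "text": "cheap cheap cheap cheap deal"},
]
index = build_index(pages)
```

The design point is that a page rejected at ingestion never enters the index at all, which would explain rebuilding from scratch: changing a first-level filter means re-running the whole pipeline on the raw crawl.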

h_b_k
msg:155027
1:21 pm on Jun 6, 2003 (gmt 0)

Q2, Q3 and Q5:
They can do what they like, as long as the SERPs are okay for their users.

Q4:
The only real problem I can see is the high number of error 404 pages, which should be deleted from the index.
Googlebot is very diligent in spidering new pages and updating the index.

Q1:
It is the same question for me.
Maybe they are actually testing a rolling update of the PR and link matrix.

