|Thoughts on the Google Update Schedule|
Gleaned from datacenter updates
This is a recap and a prediction topic.
Recall that GG said that the -sj index would move to the other datacenters, then we would see backlinks/spam filters applied across the board.
If that is the case, and if the D.C. datacenter (-dc) just got the -sj index (I saw it bouncing around last night), while the Cable & Wireless (-cw) datacenter got the index on the 15th, *then* it seems the index takes about 2 days to propagate to each datacenter. (This is a worst-case scenario, as the datacenters may be updated in parallel and could all pop up with the -sj index very shortly.)
Since we have 5 datacenters left to go, that puts us 10 days in the future before all datacenters are given the -sj index... which brings us to the 27th. At that point we should see the backlinks/spam filters being applied at every datacenter as deltas/patches (if you will), and the real "dance" will be underway. That part will, of course, take considerably less time.
Notice that the prediction of the 27th is, in my opinion, a worst-case scenario; we will most likely see things happen sooner.
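For what it's worth, the back-of-the-envelope arithmetic behind that date can be sketched in a few lines. Note that the two-day interval, the May 17 start date, and the count of five remaining datacenters are all this thread's guesses, not anything Google has confirmed:

```python
from datetime import date, timedelta

# All figures below are the thread's guesses, not confirmed numbers:
# -cw picked up the -sj index on May 15 and -dc around May 17, suggesting
# roughly 2 days per datacenter, with 5 datacenters still to go.
DAYS_PER_DATACENTER = 2
DATACENTERS_REMAINING = 5
last_seen = date(2003, 5, 17)  # -dc, the most recent datacenter to switch

# Worst case: strictly serial propagation, one datacenter every two days.
worst_case = last_seen + timedelta(days=DAYS_PER_DATACENTER * DATACENTERS_REMAINING)
print(worst_case.isoformat())  # prints 2003-05-27, the "27th" predicted above
```

If the datacenters update in parallel instead of serially, the whole rollout collapses to a couple of days, which is why the 27th is called a worst case.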
I saw that up until 2003-05-16, -sj and -fi had my site showing in the SERPs based on a 2003-04-08 snapshot of the site (the site was new on 2003-03-20). That result had been showing for the last 5 weeks, but then the Google cache, title, description, file size, etc. (on those 2 datacentres) suddenly changed to be based on the page content of just a few days ago, so some sort of update has occurred on those two servers. They still have results that are slightly different from the other datacentres, so a different algo (or whatever) is being used. Newer data seems to have been applied to those datacentres for at least some sites.
Wooooww... 7 pages of annoying posts... (not all of them of course).
Sorry about my bad English first of all; it's not my native language and I'll try to do the best I can.
It's my first post. I'm not so new to the forum, but I just didn't feel I had something to post until now.
I think it doesn't matter that Google has 80% of the searches and that it's "paying". If you want to rank well in Google, well, work hard; that's our job. If it were easy you wouldn't have work either, because everybody would try to do it. I just can't imagine that situation... I wouldn't be working in this field...
I also think that Google has the power to do whatever they think is best. They don't have to worry about bothering the webmasters who are trying to reach the top positions; what they have to do is study the behaviour of the searchers and try to give them the best results they can, because if not, their business is the one that's gone.
From what I see, the mistake might be that Google shows too much information with all these servers and the other data you can get from them... I don't know why; there might be an explanation for this...
This is my first post and I don't want to offend anybody. This is just the way I see things, and I post it because I think it would be better for everybody to see it this way; you are always free to do what you want.
I do believe that if you have a respectable site that gives a good service to the visitors and gives them what they are looking for, you will have backlinks from people recommending your services, you will have bookmark visits that are more important than any search rank, and you will earn trust from each client who enters your site.
If you want Google just to get the visitor to "another step before getting to the store", for example, you are just making it harder. I don't know if I'm making myself clear... arggg! Spanish, please!
Don't try to rank well in Google; just set things up for the crawler to understand what you do, and work on your own site... that's my way of seeing things.
WW is an excellent tool for webmasters... it's THE PLACE to learn... thanks, WW...
Hi to everybody... hope to be posting soon... :)
|Hey, every little page counts. |
I agree with what annej said a few pages back.
In Brett's 12 month guide he mentioned you must have a newsletter. I have found a newsletter to be excellent in getting lots of unexpected key phrases ranking well in Google.
My site is optimised pretty well and I do ok with a range of key phrases. But I am always surprised as to the traffic that my simple newsletter pages bring in.
Maybe the reason is that I write the text in Notepad and then paste it into a web page after I send it out to the subscribers, i.e. I write it in plain language and don't stress too much about whether Google will like it or not :).
The best part is that it forces you to write new content, and it has to be fresh so you don't repeat yourself.
Hi g1smd (oops, I mean Botum)
Nice post, and welcome. I agree with a lot of what you say; I hope you continue to post, and your English is very good.
[edited by: MHes at 10:48 am (utc) on May 18, 2003]
What in Heaven's name was wrong with the results in update Cassandra, that this month's radical change was needed?
It wasn't broke, and didn't need fixing.
I assume the post is talking about my post, as it is not g1smd's first post! :)
Thanks for the welcome, and I will post, but I'm more interested in learning about the algorithm, so I think I'll read more of these kinds of topics.
Which topics at WW talk about this?
Thanks again, and I'll try to get some sleep now; it's 8 am here...! :)
Earlier in this thread (back on Page 2), GoogleGuy said:
|If you have feedback about this index, do a spam report with your nickname, mention webmasterworld, but also include "dominic" in the comments. If people have general feedback or specific examples about this index, that's the best way to get the examples to us. I'm not worried about spammers that were banned and are back for a brief time--that will be short-lived. But if you have comments about searches that seem better/worse (preferably not just searches for your own site), send a spam report with "dominic" somewhere in the comments. We'll read what people have to say when they can give specific searches. |
I just sent you some feedback GoogleGuy - now I've got less than 30 minutes until the F1 starts - go Mark Webber!
My site is content-rich text-heavy on most pages. People arrive via a great variety of kw's entering on many different pages. I don't need little lectures whenever I mention that Google is dangerously large.
I swear some of you have Stockholm Syndrome. Loyal members of the Googolese Liberation Army or something.....
Hope this doesn't offend, but what's the point of a forum if everyone is in total agreement?
Whoops, I mean I disagree?
Just disagree with everything to be on the safe side, Critter.
Re-reading my post, I really hope I haven't p*ssed off too many people, but there's a tone of condescension here to the poor blokes who've vanished without a trace that gets annoying. I know Google is a business, owes us nothing, and genuinely is a very good SE, but it's got people dancing around like monkeys wearing little hats and begging for spare change.
Nice day out... think I'll go for a walk and drink some beer later.
|In Brett's 12 month guide he mentioned you must have a newsletter. I have found a newsletter to be excellent in getting lots of unexpected key phrases ranking well in Google. |
That is a great idea. We answer a LOT of tech questions via email and we could just cut and paste those into a blog type of page.
Nice to hear a POSITIVE and helpful suggestion. Isn't that what this place is for?
A lot of what you are seeing as condescension might just be good advice, with a little attitude getting thrown back at those who are showing a lot of attitude in their complaints.
Instead of being bothered by it, you might be able to learn from it.
BTW, I haven't seen anyone disagree with you that Google is too large. I think we would all love it if the other companies would get their act together and provide reasonable competition.
No offense taken. The 'lectures' are just the practical position to take, since we can't do anything about this. That doesn't mean it still isn't frustrating, even depressing, while we figure out what to do.
IMO, the appropriate rant would be "Google just get it over with already!"
"Google just get it over with already!"
|BTW, I haven't seen anyone disagree with you that Google is too large. I think we would all love it if the other companies would get their act together and provide reasonable competition. |
Competition is fine, but I don't get the "too large" argument. Long before Google came along, there were many indexes and directories that had dominant market shares. For example:
- The Reader's Guide to Periodical Literature (a magazine index used by libraries)
- Books in Print (used by nearly every bookstore in the U.S.)
- Literary Market Place (companies and people in book publishing)
- Hollywood Creative Directory (film and TV production companies)
- The Red Book (advertising)
Google has a smaller market share than any of the indexes and directories that I've just listed.
>Google has a smaller market share than any of the indexes and directories that I've just listed.
Ooops, Europe... really?
I send out a monthly newsletter, and every time it brings a big jump in visitors over the next few days. Newsletter visitors are targeted visitors, as anyone who bothers to sign up has a special interest in your information/product. Be sure to put in a little something of interest beyond what's new on your site. If it just looks like an ad, it will more likely be deleted unread.
My suggestion about concentrating on lesser pages is advice to myself as well as to others. Every time I check -fi, -sj, etc. and see how the home page of my main site has sunk, my heart sinks too. Since it is even showing up this way on www some of the time, I can't help but think this is it. I have a feeling that however this update ends up, the rules have changed forever. We've got to be less dependent on traffic from a single word or phrase.
Google's Right to Operate as They See Fit
Too many webmasters IMHO seem to act as though Google were an association operating on the behalf of Webmasters. Associations exist to promote the best interests of their members. Companies are in business to make money. Google is a private company.
Companies can have pure or impure motives. They can treat their employees well or poorly. They can deliver high or low quality goods or services. These choices are all at the discretion of management.
Management teams understand that in any given category, different companies, using different combinations of goals, strategies and tactics, can succeed.
Google's strategy is essentially: SUPERIOR RELEVANCE. Google wants their search results to be more relevant than other resources on the Web. They want the ads served in AdWords to be more relevant than ads served at other Web destinations. Google, in other words, is on the side of the user...a strategy designed to lead to the success of their company.
People should remember: Google is not on the side of the webmaster. Google is not against webmasters. Google is a company, in business to make money.
Once you understand this, and the strategy of superior relevancy, you understand Google.
Webmasters' Right to Choose and Speak Out
Webmasters who choose to live and die on the fortunes of Google are making a decision to embrace a level of risk closely tied to Google's own behavior (which is out of the webmaster's control). That's a call every webmaster/business owner can make for him/herself. And those who make that call have the right to advocate their positions in this forum, as long as this forum is open and public. GG can ignore them. GG can listen. Ultimately, GG will take what he chooses from all of this and pass it along in a way designed to help Google improve RELEVANCE.
Webmasters who choose to employ business strategies in which 'free' traffic from Google represents only a part of their income are probably also choosing higher cost structures, in order to diversify and reduce risk. That, again, is their call. Those webmasters are also free to advocate their points of view in this forum.
Everyone - speak your minds! Advocate: In your own self interest if you wish, or just comment on the overall state of affairs. In the end, it's all to the benefit of everyone else in here, and that's what makes this a great forum. ;-)
Stefan, there was a lot of work at Google behind these changes. We're trying to make the transition as gentle as possible, but there's still a lot of work left to do.
Gentle? ...you mean kicking guys in the teeth who have played by the rules you set out?... the very guys who always told those looking to go down the path of evil to keep the site clean... stick content on it... read the webmaster guidelines... in the long term you will win through... NICE PR, BUT B*LLSH!T... the guys who have plummeted from the index, as opposed to being gently lowered, are being told you've made gentle changes... YOU'VE REWARDED MEDIOCRITY AND SPAM... the rest of us are cheering on FAST!
I agree with you, soapystar. I think GG says things will get better for the legit sites once backlinks etc. get factored in. BUT in the meantime we get screwed.
Oh my, forceful handwriting...
This is not necessarily the update per se, but a synchronization of the datacenters to prepare for new backlinks and deep data.
In other words, chill.
>> gentle?...you mean kicking guys in the teeth who have played by the rules you set out?. <<
Umm, if you are referring to results on -sj and the other test centres, then you haven't yet been kicked over. The vast majority of the surfing general public are totally unaware of those servers. Almost all searches are done via www, and unless those results are being shown there, what goes on on the test servers remains unseen by the general public. Google may very well want to do a reverse listing, making crap rise to the surface where it can be skimmed off, or whatever. They can do whatever they like on their test servers, but the results are meaningless for the general population, because the general public will hardly ever see those results.
When you do a search on www it goes to one of the nine data centers, if I'm not mistaken. -sj, -fi, and -cw are three of those data centers. So currently some searchers are getting the "sj-like" results.
My understanding is that www2 and www3 are what you would call "test" servers. -sj is not a test server but one of the nine data centers that is seen by the public.
Uh, well, I guess you could say that -sj is temporarily a sort of test server right now - but a public one. :)
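As a rough illustration of that setup, here is a minimal sketch of www-style rotation across nine datacenters. The suffix list and the three "updated" centres are taken from this thread; the plain random selection is purely an assumption about how the rotation might behave, not Google's actual load balancing:

```python
import random

# Datacenter suffixes mentioned around this update (assumed list of nine).
DATACENTERS = ["-sj", "-fi", "-cw", "-va", "-dc", "-ab", "-in", "-zu", "-ex"]
# The three centres this thread says already carry the new "sj-like" index.
UPDATED = {"-sj", "-fi", "-cw"}

def serve_search(rng=random):
    """Model a www search landing on one of the nine datacenters at random."""
    dc = rng.choice(DATACENTERS)
    return dc, dc in UPDATED  # (datacenter hit, sees the new index?)

# Under this model, roughly a third of www searches would already be
# returning "sj-like" results while the rollout is in progress.
hits = sum(new_index for _, new_index in (serve_search() for _ in range(9000)))
```

That one-in-three figure is why some searchers intermittently see the new results on www while others never do, even on the same query.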
These "TEST" results have been on .co.uk all day, and on www most of the day.....
That's why I'm now in need of dentistry! :-)
>> When you do a search on www it goes to one of the nine data centers, if I'm not mistaken. -sj, -fi, and -cw are three of those data centers. So currently some searchers are getting the "sj-like" results. <<
I know. That's why my post contained "unless those results are being shown there" and "because the general public will hardly ever see those results". They do see them occasionally, but until it is permanent, don't fret over it.
|That's why I'm now in need of dentistry! :-) |
How about some Xanax or Prozac!?
You should also consider adding vitamin B-6 to your diet. It helps increase serotonin levels... which will help lower your anxiety.
Hey, my site is gone too, but I know it will be back soon. Didn't your Grandpa ever teach you to save for a rainy day?
Well, it's raining today, and it will be for the next few weeks.
I trust that the backlinks and anchor text will be added in over time, as you have said. I guess I just don't understand why Google is even sending this index to the data centers BEFORE all the backlinks and anchor text are added/considered (i.e., before the real index is ready). I am not criticizing; I just can't understand what the rush is to get this particular index out. I am sure you have explained this somewhere back among the 3000+ posts on Dominic, so no need to rehash it here. :)
I'm guessing the reason they use one of the existing datacentres is simply that they haven't got the time or space to build another Google that stays offline. Did I read that a datacentre consists of 10,000 plain Linux boxes in an interconnected network?
Additionally, an offline test version of Google would have to have people sitting in their offices pressing buttons to do searches, or they would have to write more software that simulates people doing searches.
This way (the way they have done it with -sj), they have a server that most of the public never sees, but which unpaid (unpaid by Google, that is) SEO guys love to click on and run searches against. So you save Google buying 10,000 computers and hiring hundreds of testers.
"there was a lot of work at Google behind these changes."
A lot more should have been done behind the scenes instead of in public. Listing .co.uk domains that no longer exist in anything "new" was a bad idea.
"We're trying the make the transition as gentle as possible, but there's still a lot of work left to do."
The last part is certainly true. Google loses millions of dollars of value each day that their results are this poor and, more importantly, this poorly thought out.
Freshbot has always been a nice little extra; it's simply crazy to think "fresh" results are good on their own! New does not equal good. A deliberate choice to move from established, "voted-for" quality to "fresh", unproven insta-pages is like moving from grown crops to manure.
If as evidence suggests, Google is simply unable to deepcrawl the Internet anymore, then we are all in big trouble, especially Google.
I've had faith that this ill-advised fresh fetish would get corrected in a couple of months, but the longer this mess stays online, the longer Google fails users. That should be scaring the crap out of the Googleplex and sending them back to the drawing board.
Put your resources into the deepcrawl, discern value and good content. Stop trying to be Altavista.
"is simply that they haven't got the time or space to build another Google that stays offline."
They went back two months because the March deepcrawl failed, but the February deepcrawl was also tarnished by the dmoz/rdf/Google Directory incompleteness. I suspect they have wanted to make this change since the January Superflux, and finally ran out of patience and used the best of this year's deepcrawls.
It seems to me that a better idea would be to focus on the deepcrawl and get the first truly updated index since October. Then do the mad scientist bit after that.