|Google June 2003 : Update Esmeralda Part 3|
Continued from: [webmasterworld.com...]
Has anyone here ever heard of a Kalman filter? It's a mathematical way of building a model of the world. The math is pretty complex, but basically you try to build a model of the thing you're trying to represent. When you get a new data point, you update your model's estimate about the state of things.
Why am I talking about this? Well, Kalman filters have a knob that blends between how much you believe your model vs. how much you believe each new data point. If you tweak the knob all the way in one direction, you always trust the model and any new input just gets ignored. On the other extreme, you can ignore your current estimates about the state of the world, and only trust each new data point as it comes in. If you set the knob too far in that direction, the object you're trying to model jumps all over the place each time you see even a hint of new info.
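To make the knob concrete, here's a tiny Python sketch. It's purely illustrative -- a real Kalman filter computes this blending gain from the model and measurement noise statistics rather than fixing it by hand, and all the numbers below are made up:

```python
def update(estimate: float, measurement: float, gain: float) -> float:
    """Blend the prior estimate with a new data point.

    gain = 0.0 -> trust the model completely, ignore new data
    gain = 1.0 -> ignore the model, jump to each new data point
    """
    return estimate + gain * (measurement - estimate)

# Noisy readings of a quantity that is really sitting around 10.0
readings = [9.0, 11.5, 10.2, 14.0, 9.8]

calm, jumpy = 10.0, 10.0
for r in readings:
    calm = update(calm, r, gain=0.1)    # knob toward "trust the model"
    jumpy = update(jumpy, r, gain=0.9)  # knob toward "trust each data point"
    print(f"reading={r:5.1f}  calm={calm:6.3f}  jumpy={jumpy:6.3f}")
```

Notice how the high-gain estimate lurches toward the 14.0 outlier while the low-gain one barely moves -- that's the "jumps all over the place on a hint of new info" behavior.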
Lots of people here are getting more stressed than they need to be--their knobs are turned a little too far toward worrying about the very last thing that happened: "Now my subpage is coming up higher than it should! Okay, now my index page is back and the SERPs look good. Gaaack! Now I'm showing well at DC but the subpage still shows up higher at FI! Too much pressure--I'm going to drink now, and start spamming every FFA I see tomorrow!" :)
If you look around, you'll notice not too many senior members posting here. They chime in every so often, but their knobs are twisted further in the other direction. They know that the index switchover takes a little time to settle, and they have the perspective not to get too worried about things right now, and in general.
I haven't posted much of my take lately, but if I could give advice, it would probably be: don't panic. Here's what I would expect. Probably about one data center per day will get switched to the Esmeralda index. You may see some improvements during the course of the switchover as ingredients get blended in as they're ready. I would expect another round of ingredient-adding after the index is switched over.
So: if you're really into Google-watching as a sport, I would check in once a day to see what data centers have been switched, and maybe to run 2-3 searches. Browse a little while, and then come back the next day. Find something fun to do at night besides poring over every last thing that GoogleGuy (or whoever) posts on WebmasterWorld. You'll feel better, I promise.
This is just my take. You're welcome to ignore it. But I mention it because during this index, I heard about a lot of good and bad searches from webmasters, and the more I dig, the more confident I am that things will turn out well.
|LOL...noob award of the year, after 274 posts. ;) |
Thank you! What do I get?
|when you search on WWW, the search is done on 1 of the 9 datacenters randomly. (That's why people keep saying data is in flux or whatever in every update thread). |
I'm not that dumb! I just lost all my bookmarks and I'm too damn lazy to type them in!
[edited by: mahlon at 6:38 pm (utc) on June 18, 2003]
Googleguy, I think you see a lot of people freaking out first because bad results are bad results for a month or two, not just for a few minutes. The existing "monthly update" thing means that people realize that their entire next 30-90 days can be down the tubes because of SE issues.
More importantly, though, is that the flux time of 3-7 days is also agonizing. Even if you get all the SERPs you had before back, it isn't unusual to see traffic swing greatly as you flux in and out of things.
Hopefully the merging of fresh and deep bot will mean that this cycle disappears and is replaced with more constant flux as new things are added and old ones are updated. That would keep the freaking out down to something more tolerable for all :-)
Actually, the search you perform doesn't really "randomly" choose a datacenter to run it on, there is a geolocation algo in place that points you to closer datacenters.
It wouldn't make much sense to send California searches over to Zurich :)
Perhaps you're ready to spill the beans on whether or not Google is now doing "continual" updating of its database, rather than a once-monthly update as we've encountered in the past.
Yep, geolocation + datacenter load. That explains why you won't reach the same datacenter all the time, even if you surf from California all the time! :)
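For the curious, the "geolocation + load" idea can be sketched like this. Everything here is hypothetical -- the datacenter names, coordinates, loads, and scoring weight are invented for illustration, and Google's actual routing logic is not public:

```python
import math
from dataclasses import dataclass

@dataclass
class Datacenter:
    name: str
    lat: float
    lon: float
    load: float  # 0.0 (idle) .. 1.0 (saturated)

def distance(lat1, lon1, lat2, lon2):
    # Crude flat-earth distance in "degrees" -- fine for an illustration.
    return math.hypot(lat1 - lat2, lon1 - lon2)

LOAD_WEIGHT = 50.0  # how much distance one unit of load is "worth"

def pick_datacenter(user_lat, user_lon, datacenters):
    """Prefer the nearest datacenter, but spill over when it's loaded."""
    return min(datacenters,
               key=lambda dc: distance(user_lat, user_lon, dc.lat, dc.lon)
                              + LOAD_WEIGHT * dc.load)

dcs = [
    Datacenter("sj", 37.3, -121.9, load=0.2),   # San Jose
    Datacenter("va", 38.9, -77.0, load=0.1),    # Virginia
    Datacenter("zu", 47.4, 8.5, load=0.05),     # Zurich
]

# A Californian lands on San Jose while it is lightly loaded...
print(pick_datacenter(37.8, -122.4, dcs).name)   # -> sj
# ...but spills over to Virginia if San Jose gets swamped.
dcs[0].load = 1.0
print(pick_datacenter(37.8, -122.4, dcs).name)   # -> va
```

The load term is why two Californians (or the same surfer five minutes apart) can land on different datacenters even though the geography hasn't changed.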
Gotcha. I thought you were saying something else.
Sure would like to know what is up with -sj anyway...
Good Esmeralda luck... mipapage
Well, I'm glad to see I'm not the only one who had a site totally disappear from the serps this morning. It's not a new site and I hadn't changed it in a while. I'd been celebrating its new rankings in the -fi index.
I'd been wondering if I should consolidate all my sites under one domain to keep things simple (they all feature products made by the same company, but I'd created a different site to focus on each product or group of products), but this definitely answers that question.
I'm not panicking about it, but wondering how this happens. Assuming that it isn't penalized, since there's no reason for it to be. Can part of the index become unavailable due to a hardware failure? Surely there's a ton of redundancy in the googleplex.
Fingers crossed that these sites come back, and I'm going to force myself to forget about it for a couple of days.
>> Well, I'm glad to see I'm not the only one who had a site totally disappear from the serps this morning. <<
You're not.... the sites I submitted to GG a couple of days ago have gone again.
>> I'm not panicking about it.. <<
The fact that they are in and out suggests that they will be IN again when the chips fall. I'm guessing that the data is shuffling around a bit and that there is still a way to go before stability.
I'm pretty certain there's no general penalty in play. The cross section for the missing index problem was simply too wide, and some of the sites were patently not SEO'd at all.
It is a panicky time of course, and unnerving when your own sites are not there on www2/3/-fi. It is also somewhat addictive: I know I shouldn't look until the end... and I should know better.... but I just HAVE to take a PEEK every now and again! Of course I instantly regret it when my sites aren't there!
Self therapy: "sit tight and stop looking".
[edited by: Napoleon at 5:37 pm (utc) on June 18, 2003]
|If my site was a year old, and had been in Google all that time, I would be much more likely to focus on getting users to return to my site. But that is not the case for me. They can't return - because they haven't been there yet! |
See my reply at:
Hey folks, just thought I'd mention that I had quite a few pages that were in the ODP which had some odd entries in the Google directory the past few months. I'd changed over a domain name at the beginning of the year, and the pages on the new site were doing really well in Google until the Dominic index was installed. Then some of the links from the Google directory, the ODP and some other sites reverted back to pointing at the old URLs. This caused my rankings to slide because my links were then being divided between the old and new pages, so neither page scored really well in the SERPs.
I just noticed on -fi that, for the couple of pages I just spot checked, the Google directory and ODP entries were now in sync and both pointing to my new URL.
So if you had a page that was ranking low because the ODP and/or Google entries were off, you may wish to recheck your position today.
[edited by: Jane_Doe at 6:51 pm (utc) on June 18, 2003]
"If you were to go for a nice walk around the lake, you would be in the exact same position in Google as if you sat watching every data center. If you spend that time working on your site, you would be in the same position with this update, but you might improve your fresh listings and your position in the next update."
BigDave, I want to print that out, frame it, and put it over my desk. You totally grok it. AthlonInside, very wise words about data centers and SJ. Everybody needs a sandbox to play. :)
RawAlex, I liked your quote too: "Hopefully the merging of fresh and deep bot will mean that this cycle disappears and is replaced with more constant flux as new things are added and old ones are updated. That would keep the freaking out down to something more tolerable for all :-)" Overall, I'm in favor of anything that would reduce the stress level of WebmasterWorld--sometimes it's a little like being in a dark auditorium with 20 other reasonable people and 2-3 fear-maddened wolverines. ;) Maybe webmasters will always be a little anxious to know how they're ranking, but I'd love to get to a place where everyone can be less worried about how they're doing and will be doing. :)
GoogleGuy, it's high time for your GoogleGuyBot to jump in.
But someone turned off its post rights. :-)
Metablue: If each of your domains is unique, and the content is unique, you will probably attract more traffic by keeping them separate (and perhaps ALSO consolidating the information into one overall domain as well). Don't ever look at getting more traffic as an either/or situation - when in doubt, try both!
Googleguy: thanks. Yes, those 2 or 3 wild people can really raise EVERYONE's stress. I noticed some "freshbotted" stuff in the SERPs today, so I assume freshbot is keeping on keeping on, even as the update continues?
Thank you firstly... and... burp... (loved that comment)... I would have to say that for those site owners who are the wolverines in the dark, it is always best to watch your prey and look for weakness. Those weak targets will surely be obvious; survival of the fittest is what created man. It is best to be patient and look for the weakest link rather than jump the gun.
Like the movie "A Beautiful Mind," watching how people do things will pay off huge in the long run. Make a site that is user friendly, I mean it. If it is more informational for the user and easy to navigate in a logical sense, then Google will reward you. I know this works, folks!
Think about what is easiest on the user and the most descriptive, but add the touch of navigation and simplified thought to how a user may react to what he is currently reading: decide, react... and implement that into your page.
I come from a Forbes top ten Quality Assurance Team here in Southern California. I have seen it all, but I know what works best. As the military boys and gals say, KISS: "keep it simple, stupid." Think of the user and make the site function friendly as opposed to selling the product. It works better, folks! :)
In ending - "A man's dreams are an index to his greatness"
I think it's time to say goodbye to the "too" big G.
It's a very unstable search engine, with results going up and down like a yo-yo, causing stress and unnecessary updating. I get banned, spend a couple of months fixing it, get indexed well, work on the programming (keeping it G safe), and this update my site hits rock bottom AGAIN. All the while, another search engine has been very stable.
So while G gets its engine stable, MarkDidj (and probably others) will take off the Google toolbar (more of a hindrance than a help to site designers, in my experience) and move their homepage to a more stable SE.
What's the use in a "stable" search engine for JoeBlow surfer?
An unstable search engine is one that is unable to produce *any* results or is too slow.
Google has never had that problem.
|Its a very unstable search engine, with results going up and down like a yo-yo, causing stress and un-necessary updating. |
I'm not sure I agree with this. Having a site that has ranked in the top 10 (until this update - but that is actually my fault) in a very competitive search field (the primary term returns 1.3m pages) for the last 8-10 months, I feel that Google is actually very stable. My competition has remained fairly constant, with a few changeups here and there, but for the most part my fellow Top 20 has remained the same.
Google is very unstable... sites in, sites out... index pages in, subpages way down... irrelevant pages returned... never, never has Google been this bad. Just once, last September, did anything like this happen, and the next update it was back to relevancy. Is Google broken? I think a lot of us know that it is. Just because your site is stable doesn't mean it's not broken! All sites should be stable if they have content and play by Google's rules, even though its own filters can't police those rules!
<<If you were to go for a nice walk around the lake, you would be in the exact same position in Google as if you sat watching every data center. If you spend that time working on your site, you would be in the same position with this update, but you might improve your fresh listings and your position in the next update. >>
That was "the straw that broke the camel's back" for me. Could you have possibly posted something less useful and more patronizing than this?
Every tiny "2k per month" newbie webmaster knows to bake up a ton of freshbot spider food right now. Wow, if I walk around the lake 3x my contact page will still be #1!
Maybe, the best way to improve your listing for next month is to analyze what the #@#$ is going on before everyone else figures out the spam of the week trick. Sure, some SERPS will flux, but it would take a real dullard not to see any patterns within the results yet. This is a new algo. Doing the same old thing may not be nearly as effective anymore. (here is where you and googleguy go into your build great content spiel):).
So, PLEASE, if you are posting to try to impress someone and have no thoughts at all about anything that's going on other than, "Work on your site, there's nothing you can do about it", STEP AWAY FROM THE COMPUTER.
Worse than the "my site has dropped 400 places" posts are these pompous posts from senior members who feel they are on some pedestal talking to infants.
>Overall, I'm in favor of anything that would reduce the stress level of WebmasterWorld--sometimes it's a little like being in a dark auditorium with 20 other reasonable people and 2-3 fear-maddened wolverines. ;)
Anything? I think it would help matters if you could give us your best guess on when we will stop seeing massive position changes in the SERPS for our websites.
|google is very unstable....sites in..sites out.. |
Unstable to you, maybe, the webmaster whose site comes and goes.
But when my grandma searched for "day beds" this morning, she didn't complain about how unstable the results were.
And then when my wife searched for "orlando vacations" she was real happy with the sites Google showed her, no complaining about instability.
Keep in mind what Google is here for ... a convenient service for web surfers.
Hey, my site has been in and out of the serps too. Keep in mind, folks, these are *FREE* listings. You have to be happy with anything you get and roll with the punches.
"New PR can be seen by setting up your HOSTS file to point to one of the datacenters with the new index."
Definitely not true. This PR is not "done".
I would agree with you if there really was much analysis going on, but there isn't.
Sitting around trying to be the first one to call the index moving to a new data center is not analysis.
Looking at only your own site is not analysis. Posting I'm moving to ATW is not analysis. Complaints about the results not being stable is not analysis.
In fact, few of the posts that are coming up with theories count as analysis, because most of the posters grab hold of an immediate pet theory and never let go.
Now there are a lot of people posting that they dropped big time. Where is the discussion about what those sites might have in common? That would be analysis. That happens in other threads. The reason it is not in the update thread is that most of the members that do that sort of thing avoid this thread.
Sure, MyWifeSays. The fourth paragraph of my first post in this thread was intended to let webmasters know what to expect. I would look for roughly one data center per day to switch to the new index. Expect to see more scoring factors percolating throughout, and probably some more percolation after all the data centers have the index data. I would still be ready to see changes after that too, but probably of a smaller magnitude.
Hope that helps,
Thanks for getting back to me GG, still no clearer though. Approx. how long will it be 'til all the percolating stops and we get only small changes?
I suppose the question after that is how long do we only see small movements before percolating starts again? I think it's the big movements that are causing the high stress levels.
"Hopefully the merging of fresh and deep bot will mean that this cycle disappears and is replaced with more constant flux as new things are added and old ones are updated. That would keep the freaking out down to something more tolerable for all :-)"
Hardly. Until recently, updates occurred almost instantly. What first appeared, in terms of backlinks and results, stuck 99+% of the time.
And, more to the point, it was pretty accurate.
What is going on now is totally different from what might now be called a classic update. Sites rocketing all over the place is certainly not useful to users and is unpleasant for webmasters.
There are two issues here. The first involves us going from a system where Google got things (more or less) right, to a new system where we don't know yet if they will get it right. A constant update where Google *succeeded* to the same degree as they did in classic updates would be fine and probably a better way to go than one major update a month.
However, the second issue is "can Google pull this off?" So far the evidence we have is a resounding "no".
Some of the most straightforward backlinks on the Internet that there is no excuse to miss have once again been missed by the crawlers. If you can't crawl the Google Directory properly, that ain't very comforting. Listing newhoo pages instead of dmoz pages is similarly just terrible.
Of course I'm hopeful they succeed in ranking sites well and accurately by the end of this month, and then after that, but they have to prove they have put their problems behind them, and the current batch of backlinks shows they plainly have not.
|Maybe, the best way to improve your listing for next month is to analyze what the #@#$ is going on before everyone else figures out the spam of the week trick. |
If people would spend less time figuring out the spam-of-the-week trick and optimizing for a moving target, they might have less reason to complain about Google's "instability."
|Doing the same old thing may not be nearly as effective anymore. (here is where you and googleguy go into your build great content spiel):). |
I'm not BigDave (to whom you addressed your diatribe), but I've got to say that "doing the same old thing" by building content and providing easily digestible spider food in the form of descriptive titles, anchor text, headlines, etc. has worked pretty well for me. It's certainly a lot easier on the blood pressure than trying to anticipate and outwit Google.
Here's something else to keep in mind: When you optimize for one or two things, you're putting most of your eggs in one or two baskets. If Google snatches one of those baskets away, you're a dozen eggs short of an omelet. That's why the content approach is more productive (or at least more stable and less risky) over the long haul.
Outside of all the other concerns posted regarding G:
I have to thank and absolutely agree with GoogleGuy's first and subsequent posts (part 3). It really is the way it is; worrying and moaning about problems does not solve them, but thinking of solutions and acting on them does.
>>"New PR can be seen by setting up your HOSTS file to point to one of the datacenters with the new index."
>Definitely not true. This PR is not "done".
Definitely True. The PR is done in datacenters with the new index. Come on, PR is part of the index.
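For anyone who hasn't tried the HOSTS trick being discussed here: the idea is simply to force www.google.com to resolve to the IP of a specific datacenter so your browser queries that index directly, bypassing normal DNS. A sketch of the entry -- the IP below is a placeholder from the reserved documentation range, not a real datacenter address, so substitute the IP of the datacenter you want to watch (and delete the line when you're done, or you'll be stuck on that datacenter):

```
# hosts file: /etc/hosts on Unix, or under system32\drivers\etc on Windows
# Format: one IP, then the hostname(s) it should resolve to.
192.0.2.1    www.google.com
```

After saving the file, a search at www.google.com hits that one datacenter instead of whichever one geolocation and load balancing would normally hand you.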
"I've got to say that "doing the same old thing" by building content and providing easily digestible spider food in the form of descriptive titles, anchor text, headlines, etc. has worked pretty well for me."
And if you stop fixating on your site and look around, you will see that those things have not been universally valued the past six weeks.
<<If people would spend less time figuring out the spam-of-the-week trick and optimizing for a moving target, they might have less reason to complain about Google's "instability." >>
I am not complaining about Google's instability. If anything the increased importance of fresh makes ranking quite easy.
Also, I happen to be doing quite well, as usual, in the new index. I prefer, however, to stay in front of the curve.
As to putting all my eggs in one basket: this is why I, like nearly everyone here, work with a multitude of websites.
Lastly, if you catch the new "spam flavor" at the right moment, you can make a ton of money. Who cares if it's fleeting? A domain costs $10.
Europe, you ARE correct, though, that building content sites in general will be a much more steady and less stressful business. It is not, however, the business that all webmasters are in.
[edited by: mfishy at 7:57 pm (utc) on June 18, 2003]