Why am I talking about this? Well, Kalman filters have a knob that blends between how much you believe your model vs. how much you believe each new data point. If you tweak the knob all the way in one direction, you always trust the model and any new input just gets ignored. On the other extreme, you can ignore your current estimates about the state of the world, and only trust each new data point as it comes in. If you set the knob too far in that direction, the object you're trying to model jumps all over the place each time you see even a hint of new info.
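That blending knob can be sketched in a few lines of Python. This is a toy illustration of the gain in a one-dimensional Kalman-style filter, not anyone's actual ranking code; the function and variable names are made up for the example.

```python
# Toy sketch of the "knob" (the gain) in a 1-D Kalman-style filter:
# gain near 0 -> trust the model; new data points are mostly ignored.
# gain near 1 -> trust each new data point; the estimate jumps around.

def update(estimate, measurement, gain):
    """Blend the current estimate with a new measurement."""
    return estimate + gain * (measurement - estimate)

noisy_readings = [10.0, 12.0, 9.0, 30.0, 11.0]  # one wild outlier at 30

for gain in (0.1, 0.9):
    est = noisy_readings[0]
    for m in noisy_readings[1:]:
        est = update(est, m, gain)
    print(f"gain={gain}: final estimate={est:.2f}")
```

With the low gain the outlier barely moves the estimate; with the high gain the estimate lurches toward every new reading, which is exactly the "jumps all over the place" behavior described above.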
Lots of people here are getting more stressed than they need to be--their knobs are turned a little too far toward worrying about the very last thing that happened: "Now my subpage is coming up higher than it should! Okay, now my index page is back and the SERPs look good. Gaaack! Now I'm showing well at DC but the subpage still shows up higher at FI! Too much pressure--I'm going to drink now, and start spamming every FFA I see tomorrow!" :)
If you look around, you'll notice not too many senior members posting here. They chime in every so often, but their knobs are twisted further in the other direction. They know that the index switchover takes a little time to settle, and they have the perspective not to get too worried about things right now, and in general.
I haven't posted much of my take lately, but if I could give advice, it would probably be: don't panic. Here's what I would expect. Probably about one data center per day will get switched to the Esmeralda index. You may see some improvements during the course of the switchover as ingredients get blended in as they're ready. I would expect another round of ingredient-adding after the index is switched over.
So: if you're really into Google-watching as a sport, I would check in once a day to see what data centers have been switched, and maybe to run 2-3 searches. Browse a little while, and then come back the next day. Find something fun to do at night besides poring over every last thing that GoogleGuy (or whoever) posts on WebmasterWorld. You'll feel better, I promise.
This is just my take. You're welcome to ignore it. But I mention it because during this index, I heard about a lot of good and bad searches from webmasters, and the more I dig, the more confident I am that things will turn out well.
LOL...noob award of the year, after 274 posts. ;)
Thank you! What do I get?
when you search on WWW, the search is done on one of the 9 datacenters, chosen at random. (That's why people keep saying the data is in flux in every update thread).
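Random routing also explains why results seem to flip-flop during an update. Here's a hypothetical sketch (the data-center names and index labels are made up, not real Google infrastructure): each query lands on a random data center, and while a rollout is in progress some centers serve the old index and others the new one.

```python
import random

# Made-up data centers: during a rollout, some serve the old index
# while others have already switched to the new one.
DATA_CENTERS = {
    "www-dc1": "old index",
    "www-dc2": "old index",
    "www-fi": "new (Esmeralda) index",
}

def serve_query(query):
    """Route a query to a randomly chosen data center."""
    dc = random.choice(sorted(DATA_CENTERS))
    return dc, DATA_CENTERS[dc]

# The same query can hit different indexes on consecutive tries:
for _ in range(3):
    dc, index = serve_query("day beds")
    print(f"{dc} -> {index}")
```

So a site "disappearing" and "reappearing" between searches may just mean consecutive queries hit centers on different sides of the switchover.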
I'm not that dumb! I just lost all my bookmarks and I'm too damn lazy to type them in!
[edited by: mahlon at 6:38 pm (utc) on June 18, 2003]
More importantly, though, the flux period of 3-7 days is also agonizing. Even if you end up getting back all the SERPs you had before, it isn't unusual to see traffic swing greatly as you flux in and out of things.
Hopefully the merging of fresh and deep bot will mean that this cycle disappears and is replaced with more constant flux as new things are added and old ones are updated. That would keep the freaking out down to something more tolerable for all :-)
I'd been wondering if I should consolidate all my sites under one domain to keep things simple (they all feature products made by the same company, but I'd created a different site to focus on each product or group of products), but this definitely answers that question.
I'm not panicking about it, but wondering how this happens. Assuming that it isn't penalized, since there's no reason for it to be. Can part of the index become unavailable due to a hardware failure? Surely there's a ton of redundancy in the googleplex.
Fingers crossed that these sites come back, and I'm going to force myself to forget about it for a couple of days.
You're not.... the sites I submitted to GG a couple of days ago have gone again.
>> I'm not panicking about it.. <<
The fact that they are in and out suggests that they will be IN again when the chips fall. I'm guessing that the data is shuffling around a bit and that there is still a way to go before stability.
I'm pretty certain there's no general penalty in play. The cross section for the missing index problem was simply too wide, and some of the sites were patently not SEO'd at all.
It is a panicky time of course, and unnerving when your own sites are not there on www2/3/-fi. It is also somewhat addictive: I know I shouldn't look until the end... and I should know better.... but I just HAVE to take a PEEK every now and again! Of course I instantly regret it when my sites aren't there!
Self therapy: "sit tight and stop looking".
[edited by: Napoleon at 5:37 pm (utc) on June 18, 2003]
If my site was a year old, and had been in Google all that time, I would be much more likely to focus on getting users to return to my site. But that is not the case for me. They can't return - because they haven't been there yet!
See my reply at:
I just noticed on -fi that, for the couple of pages I just spot checked, the Google directory and ODP entries were now in sync and both pointing to my new URL.
So if you had a page that was ranking low because the ODP and/or Google entries were off, you may wish to recheck your position today.
[edited by: Jane_Doe at 6:51 pm (utc) on June 18, 2003]
BigDave, I want to print that out, frame it, and put it over my desk. You totally grok it. AthlonInside, very wise words about data centers and SJ. Everybody needs a sandbox to play. :)
RawAlex, I liked your quote too: "Hopefully the merging of fresh and deep bot will mean that this cycle disappears and is replaced with more constant flux as new things are added and old ones are updated. That would keep the freaking out down to something more tolerable for all :-)" Overall, I'm in favor of anything that would reduce the stress level of WebmasterWorld--sometimes it's a little like being in a dark auditorium with 20 other reasonable people and 2-3 fear-maddened wolverines. ;) Maybe webmasters will always be a little anxious to know how they're ranking, but I'd love to get to a place where everyone can be less worried about how they're doing and will be doing. :)
Googleguy: thanks. Yes, those 2 or 3 wild people can really raise EVERYONE's stress. I noticed some "freshbotted" stuff in the SERPs today, so I assume freshbot is keeping on keeping on, even as the update continues?
Thank you firstly... and ... burp... (loved that comment).... I would have to say that for those site owners who are the wolverines in the dark, it is always best to watch your prey and look for weakness; the weak targets will surely be obvious. Survival of the fittest is what made man... it is best to be patient and find the weakest link rather than jump the gun.
Like the movie "A Beautiful Mind", watching how people do things will pay off huge in the long run. Make a site that is user friendly. I mean it: if it is more informational for the user and easy to navigate in a logical sense, then Google will reward you. I know this works, folks!
Think about what is easy on the user and the most descriptive, but add the touch of navigation and simplified thought to how a user may react to what he is currently reading: decide, react... and implement that into your page.
I come from a Forbes top ten Quality Assurance Team here in Southern California. I have seen it all, but I know what works best. As the military boys and gals say, KISS: "keep it simple, stupid"... think of the user and make the site function-friendly as opposed to selling the product. It works better, folks! :)
In ending - "A man's dreams are an index to his greatness"
so while G gets its engine stable, MarkDidj (and probably others) takes off the Google toolbar (more hindrance than help to site designers, in my experience) and moves his homepage to a more stable SE.
It's a very unstable search engine, with results going up and down like a yo-yo, causing stress and unnecessary updating.
I'm not sure I agree with this. Having a site that has ranked in the top 10 (until this update - but that is actually my fault) in a very competitive search field (the primary term returns 1.3m pages) for the last 8-10 months, I feel that Google is actually very stable. My competition has remained fairly constant, with a few changeups here and there, but for the most part my fellow Top 20 has remained the same.
That was "the straw that broke the camel's back" for me. Could you have possibly posted something less useful and more patronizing than this?
Every tiny "2k per month" newbie webmaster knows to bake up a ton of freshbot spider food right now. Wow, if I walk around the lake 3x my contact page will still be #1!
Maybe, the best way to improve your listing for next month is to analyze what the #@#$ is going on before everyone else figures out the spam of the week trick. Sure, some SERPS will flux, but it would take a real dullard not to see any patterns within the results yet. This is a new algo. Doing the same old thing may not be nearly as effective anymore. (here is where you and googleguy go into your build great content spiel):).
So, PLEASE, if you are posting to try to impress someone and have no thoughts at all about anything that's going on other than, "Work on your site, there's nothing you can do about it", STEP AWAY FROM THE COMPUTER.
Worse than the "my site has dropped 400 places" posts are these pompous posts from senior members who feel they are on some pedestal talking to infants.
>> Overall, I'm in favor of anything that would reduce the stress level of WebmasterWorld--sometimes it's a little like being in a dark auditorium with 20 other reasonable people and 2-3 fear-maddened wolverines. ;) <<
Anything? I think it would help matters if you could give us your best guess on when we will stop seeing massive position changes in the SERPS for our websites.
google is very unstable....sites in..sites out..
Unstable to you maybe, the webmaster whose site comes and goes.
But when my grandma searched for "day beds" this morning she didn't complain about how unstable the results were.
And then when my wife searched for "orlando vacations" she was real happy with the sites Google showed her, no complaining about instability.
Keep in mind what Google is here for ... a convenient service for web surfers.
Hey, my site has been in and out of the SERPs too. Keep in mind, folks, these are *FREE* listings. You have to be happy with anything you get and roll with the punches.
I would agree with you if there really was much analysis going on, but there isn't.
Sitting around trying to be the first one to call the index moving to a new data center is not analysis.
Looking at only your own site is not analysis. Posting "I'm moving to ATW" is not analysis. Complaining about the results not being stable is not analysis.
In fact, few of the posts that are coming up with theories count as analysis, because most of the posters grab hold of an immediate pet theory and never let go.
Now there are a lot of people posting that they dropped big time. Where is the discussion about what those sites might have in common? That would be analysis. That happens in other threads. The reason it is not in the update thread is that most of the members that do that sort of thing avoid this thread.
Hope that helps,
I suppose the question after that is how long do we only see small movements before percolating starts again? I think it's the big movements that are causing the high stress levels.
Hardly. Until recently, updates occurred almost instantly. What first appeared, in terms of backlinks and results, stuck 99+% of the time.
And, more to the point, it was pretty accurate.
What is going on now is totally different from what might now be called a classic update. Sites rocketing all over the place is certainly not useful to users and is unpleasant for webmasters.
There are two issues here. The first involves us going from a system where Google got things (more or less) right, to a new system where we don't know yet if they will get it right. A constant update where Google *succeeded* to the same degree as they did in classic updates would be fine and probably a better way to go than one major update a month.
However, the second issue is "can Google pull this off?" So far the evidence we have is a resounding "no".
Some of the most straightforward backlinks on the Internet that there is no excuse to miss have once again been missed by the crawlers. If you can't crawl the Google Directory properly, that ain't very comforting. Listing newhoo pages instead of dmoz pages is similarly just terrible.
Of course I'm hopeful they succeed in ranking sites well and accurately by the end of this month, and after that, but they have to prove they have put their problems behind them, and the current batch of backlinks shows they plainly have not.
>> Maybe, the best way to improve your listing for next month is to analyze what the #@#$ is going on before everyone else figures out the spam of the week trick. <<
If people would spend less time figuring out the spam-of-the-week trick and optimizing for a moving target, they might have less reason to complain about Google's "instability."
>> Doing the same old thing may not be nearly as effective anymore. (here is where you and googleguy go into your build great content spiel):). <<
I'm not BigDave (to whom you addressed your diatribe), but I've got to say that "doing the same old thing" by building content and providing easily digestible spider food in the form of descriptive titles, anchor text, headlines, etc. has worked pretty well for me. It's certainly a lot easier on the blood pressure than trying to anticipate and outwit Google.
Here's something else to keep in mind: When you optimize for one or two things, you're putting most of your eggs in one or two baskets. If Google snatches one of those baskets away, you're a dozen eggs short of an omelet. That's why the content approach is more productive (or at least more stable and less risky) over the long haul.
And if you stop fixating on your site and look around, you will see that those things have not been universally valued the past six weeks.