I also notice that Google seems to have 2 banks of datacenters. Only one of the 2 does the partial update; for the next partial update, the other bank of datacenters is used. Looks to me like -ex, -in, and -zu are involved this time, and possibly -va. However, it could be that all of these are being rerouted to one physical datacenter. Looking at traceroutes, this may be the case.
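One rough way to test the "rerouted to one physical datacenter" idea, short of full traceroutes, is to see whether several datacenter hostnames resolve to the same address. This is a minimal sketch; the hostnames and IP addresses below are made-up placeholders, not real Google DNS data.

```python
# Group datacenter hostnames by resolved IP. If several hosts share an
# address, they may be fronting the same physical datacenter.
from collections import defaultdict

def group_by_ip(resolved):
    """Group hostnames that share an IP address."""
    groups = defaultdict(list)
    for host, ip in resolved.items():
        groups[ip].append(host)
    return {ip: sorted(hosts) for ip, hosts in groups.items()}

# In a live check you would fill this via socket.gethostbyname(host);
# here a static sample stands in for the DNS lookups.
sample = {
    "www-ex.google.com": "216.239.1.1",
    "www-in.google.com": "216.239.1.1",
    "www-zu.google.com": "216.239.2.1",
    "www-va.google.com": "216.239.1.1",
}

shared = {ip: hosts for ip, hosts in group_by_ip(sample).items() if len(hosts) > 1}
print(shared)
```

With real lookups, a dict with more than one hostname per IP would support the single-datacenter theory; distinct IPs would not rule it out, since different addresses can still route to the same facility.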
I think people have come to expect a certain level of consistency that may not be offered with a "rolling update."

I disagree. In fact, many searchers actually assume that Google is constantly being updated. A "once-a-month" update schedule just doesn't make sense unless you have some deeper knowledge of how Google works. Personally, I learned about the monthly "update" the day I found this forum :)
Many times when I'm searching and find something useful, I don't always remember the site or bookmark it, but I often remember the search terms that got me there. I think things might work the same way for a lot of Joe Average surfers.
So far I don't see the changes I attribute to the rolling update threatening that significantly, though. It just looks to me like an evolutionary step of freshbot behavior.
And at this point it's just exhausting, trying to figure out what to do. I'm not trying to come out on top for a competitive one-word keyword. I'm trying to come up on the first page for a specialized four-word keyphrase. I just want to know what the right things to do are. I liked it better when people could tell me honestly and with real conviction that simply putting keywords in your content and getting inbound links with relevant anchor text was really the best thing to do.
On a side note, i just checked my page. The changes I've been making haven't been updated yet, so this is a page that hasn't been changed since maybe a year or two ago, at which point it was just uploaded over a page that had existed since probably 1998. Nothing's different. Three days ago, we had a PR of 5. Googlebot was all over the site two weeks ago.
Now, the toolbar's gray. But we show up in Google searches at about #5 for the longer and more specific of our keyphrases.
It just doesn't make any sense, and I'm wholeheartedly glad that my living does not depend on the daily position of my site.
I don't see how Google can regard all SEO as bad. Surely ethical SEO is just what they want to see! Simply making sure that keywords that are relevant to you show up in your copy, your titles, your links, and your headings can't be bad. Why would they want to discourage that? It's ensuring a more relevant WWW.
Sigh. Too tiring. I'm going to go lie outside in the rare sunbeam that has just wound its way down through our stormy summer skies. Outdoors. Away from Google.
This is how I used to be, years ago before I became an SEO. I was a standard lazy surfer, who remembered HOW they found a site, as opposed to actually bookmarking it or remembering its URL.
Since most of the time as a university student I was surfing on a computer that wasn't my own, remembering my query was by far the easiest way to find something a second time.
Lack of consistency is something the general public will notice.
This is a concern. I am assuming with a continuous, rolling update that *if Google does it right* that searches a couple days later won't be that much different. The same site might be a bit higher or lower on the SERPs, but not dramatically. If this is not the case, searchers might not be pleased. And, also confused. Why would people think Google was giving the most relevant results if every 2 days they drastically changed?
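The "a bit higher or lower, but not dramatically" idea can be put in concrete terms by comparing two days' result lists and noting each site's rank change. This is just an illustrative sketch; the URLs are invented examples, not real SERP data.

```python
# Compare two snapshots of a results list and report, for each URL,
# its rank on each day (None = absent from that day's list).
def rank_changes(day1, day2):
    """Map each URL to a (old_rank, new_rank) pair, 1-indexed."""
    r1 = {url: i + 1 for i, url in enumerate(day1)}
    r2 = {url: i + 1 for i, url in enumerate(day2)}
    return {url: (r1.get(url), r2.get(url)) for url in set(r1) | set(r2)}

day1 = ["a.com", "b.com", "c.com"]
day2 = ["b.com", "a.com", "d.com"]
changes = rank_changes(day1, day2)
print(changes)  # a.com and b.com swap; c.com drops out; d.com is new
```

Small swaps like a.com and b.com trading places would fit a healthy rolling update; lots of `(rank, None)` entries, i.e. sites vanishing outright, would be the kind of drastic change that confuses searchers.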
However I am very confident that there is a rolling update going on.
However: I updated my page at the end of April / beginning of May. On 4 datacenters my old page is still indexed; on CW and EX my new page is indexed. Another thing is that I renamed a lot of URLs. The old files were deleted (custom 404) and have now been removed from the index, but the new ones have not been added (yet).
Perhaps what we're all missing here is that it won't. The Web changes a lot daily, but not *that* much. Over time, as the algorithms (i cannot spell that word) take effect, as the fresh/deepbot hybrid has time to cover the whole Web, as the spammers get weeded out, as the PR issues get resolved, the results will stabilize.
How the algorithms will stabilize is anyone's guess. One hopes that they'll favor those of us who use ethical SEO (it would take so much less time, if only it worked...), or those unoptimized pages that were decently-designed to begin with. And the PR issues... we'd like them to return to normal... they'd have to, for it to make any sense.
But even if the whole web was updated daily in Google's index, the results would have to stabilize. Not for certain very high traffic keywords, I suppose-- if Blue Widgets has 3,000,000 results, and thousands of people working to optimize their sites, and some spamming to place higher, then the results will keep fluctuating as every little tweak gets uploaded, and as the spammers get caught, etc.
But that doesn't mean the results would fluctuate so wildly. Even the highest-traffic keywords don't have 1,000 totally new sites per day. If there's a solid site with good PR and high relevance, it should always be up somewhere near the top. So you'd still be able to find it the next day, though it might not be in the same place. I wouldn't mind that... it's the dropping it entirely that bugs the hell out of me.
And for most of the Web... I mean, how many people are really truly searching for "revolutionary war socks"? You can search that term as many times as you like and still get revwarsocks.com over and over. (disclaimer: i have no idea whether that site exists or not. I doubt it.)
And that would be the idea. Some instability, but predictability, and lots of fresh content. They can't possibly intend for the final version of the algorithm to be "Random Filter On For Maximum Confusion". They just can't.
Surely they're trying to do what I described above.
I just don't understand what's taking so long that John Q. Public has had time to notice it and **** about it to me. (Gramma, I don't control Google. I can't make it find your quilting sites again. I just try to show up in its results. If I controlled Google, you wouldn't need to put money in my birthday cards.)
For the keywords and keyword phrases I typically look for, everything has been very fluid, and the results coming up aren't exactly the best pieces of information available for those keywords; they're just the pages with the most links containing those keywords pointing at them. It sounds horrible, but that is what I'm seeing.
Today I log in and see sites that in the past were banned and have returned to the SERPs with no changes to the sites themselves.
At this point I honestly have to think that something bad happened at Google and now they are trying to clean up.
I don't know if a continuous update is now going on, but I truly do not see an end to this problem. I can't afford to sit around and wait for things to get better with Google.
The old "throw enough &^%$# at the wall and hope some of it will stick" strategy is looking good, but it is also a double-edged sword. It pushes one to make more content pages, but under the circumstances I predict most of those new content pages will not be strong pages: just pages created to carpet-bomb Google into generating enough traffic to survive.
I disagree. Their ego is too huge to admit that. If they did screw up, they would tell you it was all part of their master plan.
Nothing wrong with that. Use different strategies for different sites. Some will work well and some won't.
The risk of a few sites being banned by Google is far outweighed by the potential for big profits. Besides, is there really a difference between being on page 14 for 3 months or being banned?
If GG wants to mix up SERPS everyday, I'm certainly not gonna let others take my piece of the pie on the "odd" days.
Google created this atmosphere, and until Yahoo dumps GG, the webmasters who adapt will do well and the others will wither away.
Yes twilight47, same for me. The thing is, I have gradually seen my main site disappear one by one from each datacenter for my main keywords that have been in the top 10 for about a year. I have been patiently waiting for this update to finish but the results are getting worse and fluctuating more as time goes on. BTW this is my first post in any update thread but I am becoming increasingly concerned.
From where I am sitting, ex, in, & cw are showing one set of results whilst all the others show a different set. I am hoping to see this settle soon.
The days of concentrating on one solid, well-built and spam-free site are becoming history.
Yah, if there is any logic to the progression it seems that the new/filtered data is moving to in and cw. Of course, I am certain that I am wrong :)
Zapista- More nets more fishes :)
One of the nice aspects of Google is you can test all your theories by implementing them on new, unrelated sites. You would be amazed at what works and doesn't.
It looks like these 3 datacenters above are running one algo, the others are running a second algo, and as soon as one updates Google switches to that on www. This explains why people keep seeing radically different SERPs on different days. Perhaps Google is running 2 algos at once because they don't know which they want to go with yet? This whole thing has the feeling of a live beta test.
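One crude way to check the two-algo theory is to measure how similar each pair of datacenters' top results is, and see whether the DCs fall into two camps. A sketch using Jaccard similarity between result lists follows; the datacenter labels match the post, but the result lists are invented placeholders.

```python
# Pairwise overlap between datacenters' top results. Two tight clusters
# with little cross-cluster overlap would fit the "two algos" theory.
from itertools import combinations

def jaccard(a, b):
    """Similarity of two result lists: 1.0 = identical sets, 0.0 = disjoint."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

serps = {
    "ex": ["a.com", "b.com", "c.com"],
    "in": ["a.com", "b.com", "d.com"],
    "cw": ["a.com", "b.com", "c.com"],
    "zu": ["x.com", "y.com", "z.com"],
}

for dc1, dc2 in combinations(sorted(serps), 2):
    print(f"{dc1} vs {dc2}: {jaccard(serps[dc1], serps[dc2]):.2f}")
```

In this made-up sample, ex/in/cw overlap heavily while zu shares nothing with them, which is the signature the "one set of results on ex, in, and cw" observations above describe.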
[edited by: rfgdxm1 at 10:36 pm (utc) on June 24, 2003]
I think we're almost there with continuous updates: the only way to achieve them is with a large number of data centres (i.e. 7) constantly updating. However, I don't believe that once the continuous update has been successfully implemented, the data among all of them will differ so much. How can it? So I would say the signs are there, but we're not quite there yet.
Google seems to be going after people who built SEO pages / search-engine-friendly pages, at the price of good SERPs.
I gotta disagree with that. My internal pages are just as cleanly SEO'd as my index, and they are riding higher than ever & have seen me through this dizzying spell of tail-chasing my home page around in the SERPs.
Yup, time to bring out the spambots.
The fair game is over.
Someone over there decided it would be a good idea to ignore their own policy documents on what they considered manipulation. One would have thought they had grasped the idea behind PageRank, but obviously not. If they had instead increased the importance of PageRank, they would not have had these problems.
I really doubt the update is continuous. They are turning the new filter on and off, though, which is causing flux ...
Yesterday it was off on all DCs for a while, then it turned on again today. And this is not the &filter=0 stuff; rather, it's something which adjusts keyword weight depending on anchor text.
I suspect the target might be the .biz and .info (and the notorious .us) one-link web spammers. But a small adjustment downward would have been enough to get rid of them. As it is now, it's pushing lots of sites down towards page 20.
The same way I referred everyone to Google 3-4 years ago, I'm now starting to send people to AllTheWeb. It might not be perfect, and a bit slow on the uptake, but at least it's consistent.
Even if this is the beginning of the continuous update, I think what Napolean posted has a lot of merit (incorrect link info is being used). This is why we see such huge shifts.
I don't think Google's continuous update would show such wild and major jumps once it's running smoothly and efficiently. It just doesn't make sense that a site "relevant" enough to be #1 one day should be #500 the next, only to show up again at #1 a week later.
Therefore I think there are significant bugs in it now, data is still being accumulated/assessed, and we should just sit tight and see what the end result is before making any analyses; or at least wait to see a smoother continuous update with fewer spikes before we can say for certain whether or not it is a rolling update.
Within 5 days, I checked "allinurl:www.mydomain.com" on WWW and found different results!
5 days ago: results:675
3 days ago: results:356
today: results:593
I have never seen this kind of fluctuation on WWW before. At the same time, my site's position changed little!
I checked that on 5 Google datacenters, and found the results differ between them.
I am confused: what on earth is Google doing now?!
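Swings like the ones quoted above can be tracked with a trivial script that parses each count and reports the spread. This is a sketch only; the count strings below mimic the `results:N` figures from the post, and the format of a real results page would need its own parsing.

```python
# Parse "results:N" count strings and measure how far they swing.
import re

def parse_count(text):
    """Pull the integer out of a result-count string, or None if absent."""
    m = re.search(r"results:\s*([\d,]+)", text)
    return int(m.group(1).replace(",", "")) if m else None

# The three checks quoted in the post: 5 days ago, 3 days ago, today.
checks = ["results:675", "results:356", "results:593"]
counts = [parse_count(c) for c in checks]
spread = max(counts) - min(counts)
print(counts, f"spread={spread}")
```

A spread of 319 on a base of a few hundred indexed URLs is roughly a halving and recovery within five days, which is why the count fluctuation looks so alarming even while the site's position barely moves.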