Some had me convinced at one time that what shows at these datacenters doesn't matter, and that I should stay focused on www.google.com, where the "final edition" ends up (or used to, anyway). Nice idea, but that really explains nothing, even if you watch it forever. Dig deeper:
Keep an eye on the number of listings returned for your www.google.com key-phrase search (refresh a few times and watch it change), and start matching that number to each datacenter, which you can do quickly if you have them bookmarked. You'll quickly see what is going on. It appears that www.google.com search queries are being load balanced among at least nine datacenters (document servers, at least). You'll quickly see that your rankings at each datacenter do matter because of this, and that each datacenter can be, and is, different in its results. That goes a long way toward explaining the constantly fluctuating results at www.google.com, along with the freshbot's temporary warping of results during the month.
I can now look at the number of listings returned on a key-phrase search and accurately predict where we will fall, because I know where we fall on that term at each datacenter...
It may not help you rank better, but it can help explain www.google.com behavior. And if you are doing well on, let's say, 7 of 9 datacenters for a given search term, I would imagine you should do well on 7 of 9 searches on that term until things change at the datacenters.
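For anyone who wants to automate that kind of checking, here's a rough Python sketch of the idea: run the same phrase against each datacenter hostname and compare the reported result counts. The hostnames in the list are placeholders for whatever datacenter addresses you have bookmarked, and the "of about N" parsing is a guess at the results-page wording, so treat it as a starting point rather than a finished tool.

```python
# Sketch: compare the reported result count for one key phrase across
# several Google datacenters. Hostnames below are HYPOTHETICAL --
# substitute the datacenter addresses you actually track. The regex is
# a guess at the "of about 1,234,567" wording on the results page and
# will need adjusting if the page markup differs.
import re
import urllib.parse
import urllib.request

PHRASE = "your key phrase here"   # the search term you are tracking
DATACENTERS = [                   # placeholder datacenter hostnames
    "www.google.com",
    "www2.google.com",
    "www3.google.com",
]

def result_count(host: str, phrase: str) -> str:
    """Fetch a results page from one datacenter and pull out the count."""
    url = f"http://{host}/search?q={urllib.parse.quote(phrase)}"
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    match = re.search(r"of about <b>([\d,]+)</b>", html)
    return match.group(1) if match else "count not found"

# Print one line per datacenter; a count that differs between hosts
# suggests that host is serving a different copy of the index.
for host in DATACENTERS:
    print(f"{host}: {result_count(host, PHRASE)}")
```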
I actually like this, because I've noticed that paying attention to algos is pretty much a waste of time. Keep adding content, and keep up to date with what's going on, mostly to make sure that you are indexed. The rest will fluctuate, but it works out for the better over time.
I'm sure some people do well optimizing and, while it lasts, get great results. But when the algo changes, they have to throw out their work. It's much more productive to keep building content; then you'll always have something ranking well, no matter which algo G chooses.
Mind you, this keyword used to bring up my site (pre-May 2003) anywhere from #3 to #12. I'm not sure I'd call it an improvement, but it's better than what it was last week. :(
Dick
Besides advanced load-balancing techniques to improve performance, or accommodating an ever-growing database and/or visitor count, what is (or was) the purpose of what they are doing? I never seemed to see much of a clear answer...
It doesn't really matter. The things that have always seemed to work for optimizing for Google, the legal things, all seem to still work fine. Our rankings have stayed, and continue to stay, pretty much the same since all this newness started with Google a couple of months back.
We aren't going to suddenly try to optimize for all possible Google datacenters and their possibly differing algos. We aren't going to do anything different. Our website has content for our visitors, done in a way that the spiders can read it as well as our visitors can:
For instance, we use HTML text instead of .gif files to display text, for the benefit of the spiders and, hopefully, our relevancy rankings. The same information is provided to the visitor and to the search engine bot, just not as pretty as we could make it in Adobe. Luckily, in our case, our industry doesn't demand visually striking websites, 'cause ours ain't...
Is there going to be a monthly update? Looks like it, as many new pages have not been added by the so-called FreshDeepBot.
Something we saw about ten days ago...
I'm seeing new pages in the index as of today - not 'fresh', but updated. Like the 'update' that happened on the 10th... This is the newer Google, I guess.
But I am not seeing an update to the title tags, etc., of deep pages already in the index. The last time these were updated was mid-June.
Also, no change in backlinks or PR.
Pages have been shifting around a bit in the SERPs.