

Understanding the newer Google

Multiple datacenters / different results / load balancing


abcdef

12:14 am on Jul 21, 2003 (gmt 0)

10+ Year Member



If you want to understand what Google is up to, find all the datacenter URLs you can (I have nine accumulated from this forum over time) and bookmark them.

Some people had me convinced at one time that what shows at these datacenters doesn't matter, and that I should stay focused on www.google.com, where the "final edition" ends up (or used to, anyway). Nice in theory, but that really explains nothing, even if you watch it forever. Dig deeper:

Keep an eye on the number of listings returned for your www.google.com key-phrase search (refresh a few times and watch it change), and start matching that number to each datacenter, which you can do quickly if you have them bookmarked. You'll quickly see what is going on. It appears as though www.google.com search queries are being load-balanced among at least nine datacenters (document servers, at least). You'll see that your rankings at each datacenter do matter because of this, that each datacenter can be, and is, different in its results, and that this goes a long way toward explaining the constantly fluctuating results at www.google.com, along with freshbot's temporary space/time warping of results during the month.

I can now look at the number of listings returned on a key-phrase search and accurately predict where we will fall, because I know where we fall on that term at each datacenter.
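The matching trick described above can be sketched in a few lines. Everything here is hypothetical: the datacenter hostnames, the result counts, and the ranks are invented stand-ins for whatever you've recorded from your own bookmarked datacenter URLs.

```python
# Hypothetical sketch of the datacenter-matching trick. The hostnames,
# result counts, and ranks below are invented placeholders: substitute
# whatever you've noted down from your own bookmarked datacenters.

# Notes taken per datacenter: total listings returned for the key phrase,
# and where our page ranked there.
DATACENTER_NOTES = {
    "www2.google.com": {"result_count": 1_240_000, "my_rank": 11},
    "www3.google.com": {"result_count": 1_250_000, "my_rank": 14},
    "216.239.33.100": {"result_count": 1_310_000, "my_rank": 9},
}

def predict_rank(observed_count, notes=DATACENTER_NOTES):
    """Guess which datacenter answered a www.google.com query by matching
    the listings count it showed, and return the rank we recorded there."""
    closest = min(
        notes,
        key=lambda dc: abs(notes[dc]["result_count"] - observed_count),
    )
    return closest, notes[closest]["my_rank"]

# A refresh shows "about 1,312,000 results" -- closest to the count noted
# for 216.239.33.100, so we'd expect to sit around #9:
# predict_rank(1_312_000) -> ("216.239.33.100", 9)
```

In practice the "observed count" is just the "Results 1 - 10 of about N" figure Google prints at the top of the results page.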

It may not help you rank better, but it can help explain www.google.com behavior. And if you are doing well on, say, 7 of 9 datacenters for a given search term, I would imagine you should do well on about 7 of 9 searches on that term until things change at the datacenters.

Clark

3:11 am on Jul 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Good point. So doing algo calculation and "optimization" now requires psyching out a different algo at each of the nine datacenters... each of which probably changes separately from time to time.

I actually like this, because I've noticed that paying close attention to algos is pretty much a waste of time. Make sure to keep adding content, and keep up to date with what's going on mostly to make sure that you are indexed. The rest will fluctuate, but it works out for the better over time.

I'm sure some people do well optimizing, and get great results while it lasts. But when the algo changes, they have to throw out their work. It's much more productive to keep building content; that way you'll always have something ranking well, no matter which algo G chooses.

dickbaker

3:41 am on Jul 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



While I've only been watching (testing) a keyword for one site, I've noticed a difference. A few days ago, that keyword brought up my one site at #393 or worse on six of the nine datacenters. The other centers didn't even have the site indexed. Now one of the datacenters has the site at #11 for that keyword, and all other datacenters have it indexed.

Mind you, this keyword used to bring up my site (pre-May 2003) anywhere from #3 to #12. I'm not sure I'd call the current state an improvement, but it's better than it was last week. :(

Dick

abcdef

4:59 am on Jul 21, 2003 (gmt 0)

10+ Year Member



That just brings up a good question, Clark:

Besides maybe advanced load-balancing techniques to improve performance or accommodate an ever-growing database and/or visitor base, what is/was the purpose of what they are doing? I've never seen much of a clear answer...

It doesn't really matter. The things that have always seemed to work to optimize for Google, the legal things, all seem to still work fine. Our rankings have stayed, and continue to stay, pretty much the same since this newness all started with Google a couple of months back.

We aren't going to suddenly try to optimize for all possible Google datacenters and their possibly differing algos. We aren't going to do anything different. Our web site has content for our visitors, done in a way that the spiders can read it as well as our visitors can.
For instance, we use HTML text instead of .gif files to display text, for the benefit of the spiders and, hopefully, our relevancy rankings. The same information is provided to the visitor and to the search engine bot, just not as pretty as we could make it in Adobe. Luckily, in our case our industry doesn't demand visually striking web sites, 'cause ours ain't...
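As a minimal illustration of that last point (the filenames and heading wording here are invented), the same heading rendered as an image versus as spider-readable HTML text:

```html
<!-- Heading baked into a .gif: the visitor sees it, the spider sees nothing -->
<img src="products-heading.gif" alt="">

<!-- Same heading as plain HTML text: visitor and spider both read it -->
<h1>Industrial Widgets &amp; Fasteners</h1>
```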

Namaste

3:25 pm on Jul 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is there going to be a monthly update? Looks like it, as many new pages have not been added by the so-called FreshDeepBot.

The only change I have seen since results stabilised two weeks back is that on-page factors are counting for more than ever before.

mipapage

4:03 pm on Jul 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is there going to be a monthly update? Looks like it, as many new pages have not been added by the so-called FreshDeepBot.

Something we saw about ten days ago...

I'm seeing new pages in the index as of today - not 'fresh' pages, but updated ones. Like the 'update' that happened on the 10th... This is the newer Google, I guess.

Namaste

6:17 pm on Jul 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hmmmmm... after reading your comment I checked Google. I am seeing new pages added too, but not comprehensively, like in the monthly updates. Some are added and some are not.

But I am not seeing an update to the title tags, etc., of deep pages already in the index. The last time those were updated was mid-June.

Also, no change in backlinks or PR.

Pages have been shifting around a bit in the SERPs, though.