Forum Moderators: Robert Charlton & goodroi
complain about a real problem
Can't see the sandbox being a "real" problem, or a problem at all. It's there to stop the spam. If I start a new company called Mesothelioma Lawyers Ltd, do you reckon I should show up in the top 500 purely because that's my company name?
Sure, the "sandbox", whatever it is, hurts some. It hurts people who are creating sites for free traffic. Many of them are spammers/freeloaders. It also hurts others. They - particularly anyone starting a new business with a business plan that relies on free SE traffic - are probably better off staying unemployed (or employed if they can find a job). Any new site starting off on the premise that free traffic will sustain it deserves to fail.
So, if you remove the sandbox as a reasonable cause for complaint, and remove most of the other whining, we'd reduce this thread to one page, and those who can't even be bothered to read it will get a personal reply from Googleguy because he owes them.
[216.239.39.104...] 13
[216.239.39.106...] 13
(I'm nowhere in the 0s still as well)
These two actually show different results than both the other 13s and the 0s in my sample test SERPS.
Since the first day, I haven't seen anything on Google.com to indicate that it's getting input from anything other than the 0s from my location.
Mark
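A quick way to quantify "show different results than the other DCs" is to compare how much the top-10 listings on two datacenters overlap. This is a hypothetical sketch, not anything Google provides: it assumes you have already collected the ranked result URLs from each datacenter by whatever means, and just measures their Jaccard overlap.

```python
def top_n_overlap(results_a, results_b, n=10):
    """Jaccard overlap of the top-n URLs from two result lists.

    results_a, results_b: ranked lists of result URLs (strings),
    e.g. scraped from two different datacenter IPs.
    Returns 1.0 for identical sets, 0.0 for fully disjoint ones.
    """
    a, b = set(results_a[:n]), set(results_b[:n])
    if not a and not b:
        return 1.0  # two empty result sets count as identical
    return len(a & b) / len(a | b)
```

With this, "the 13s look like a different index" becomes a number: an overlap near 1.0 means the two IPs are serving the same index, while a low overlap suggests they are not.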
The #13 data centers look more like the old index, with some further minor algorithm changes
I highly doubt that.
I think the 13 DCs are fresher. No matter which keyword I search for, I get more results on the 13s than on the others, which is a strong indicator that this is the fresher index. But it is true that algorithm changes seem more evident on the other DCs.
Google has developed an active algorithm and things will never settle down. They will continue to run different algos in parallel on different data centers, and alter the algo in real time in an evolutionary manner, based on real-time performance analytics.
The whole thing will forever be a dynamically changing soup of information, just like PPC is (minus the additional craziness of the auction process).
Regarding datacenter #13. Yes I like that one. It's kind of like looking at your tech stock portfolio in a 1999 issue of The Wall Street Journal. :)
Thanks.
Now I can see precisely where the every-five-minute-flip-flop is coming from...
It's those 13's! (How fitting.) {smile}
NOTE: I'm also seeing a different algorithm for the "description" on the 13s. (Not normal snippet stuff.) EDIT TO ADD: Also see several thousand fewer results on 13s than 0s.
Take that between the eyes, fellow SEOs -- now you have to optimize for two different algorithms! Imagine optimizing for two different Hilltop variants, or two different link relevancy algorithms, or two different link aging algorithms. It's going to be very hard. But, hey, at least you might rank in one index.
I can't help but laugh/cry at this latest turn of events :-)/:-(
http://216.239.39.104/
http://216.239.39.106/
I am #2 in 42.2 million results on www.google.com and most other direct datacentres, and #5 or #6 in 64.4 million results on datacentres like http://64.233.171.99/
I then did the same search on the above two quoted datacentres and was #2 in 41.2 million results. The top 10 is slightly different to the main Google results. I added &num=100&filter=0 to the search URL, and was still #2 in 41.2 million results. I pressed "enter" on the search URL in the browser address bar, two more times and the results suddenly changed: my site was now #6 in 60.6 million results.
I can repeat this over and over. The first three goes I get 41.2 million results. Attempt 4 onwards (sometimes attempt 3) the result changes to 60.6 million results. (Actually both these numbers vary from 41.1 to 41.6 million and from 60.1 to 60.6 million) on the same single IP over a few minutes.
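The flip between ~41 million and ~60 million results is easy to log if you can pull the estimated total out of each result page. A minimal sketch, assuming the 2005-era "Results 1 - 10 of about N" wording in the page source (the markup is an assumption; adjust the pattern to whatever the page actually says):

```python
import re

def result_count(html):
    """Extract the estimated total from a results-page snippet.

    Assumes text like 'of about 41,200,000' appears in the page;
    returns the count as an int, or None if no match is found.
    """
    m = re.search(r"of about ([\d,]+)", html)
    return int(m.group(1).replace(",", "")) if m else None

def distinct_counts(samples):
    """Collapse repeated observations into the distinct totals seen,
    so a flip-flopping IP shows up as two (or more) values."""
    return sorted(set(samples))
```

Fetching the same IP a dozen times and feeding the counts through `distinct_counts` would make the "first three goes, then it switches" pattern visible in the log rather than anecdotal.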
Also on these inflated results I am seeing that for PDF files the snippet now matches the search query. Previously for PDF files the snippet was the first couple of dozen words in the file.
There are some anomalies. A spam site that I reported a few weeks ago (it had been at #4 or so for several years) has had its cache date stuck at 26th Jan for the last 2 weeks. The listing used to show a fresh date every day too, but hasn't done so for the last 2 weeks either. This is in the normal www.google.com listings, by the way.
However in the "massively inflated index" that spam site has now sunk 20 or 30 places, the cache date for it is stuck at 26th Jan, but the listing still shows a new fresh date every day in that index.
My own site is cached and fresh-dated every day, as are most of the sites in the top 20 or 30 results in the normal www.google.com listings. However, in the "massively inflated index" very few of those sites show fresh dates anymore even though they continue to be newly cached on a daily basis.
There is some sort of disconnect between the cache, the index, and the fresh tagging mechanism.
(By "massively inflated index" I mean the one that returns 64 million results, rather than the "normal" version which has returned 42 million results for the last few weeks, or the "experimental" index with 41 million results).
You might ask why. Let me prove it.
Check the keyword of your choice at scr oogle and, even more important, at googlerankings (you should know how to find these sites; I can't post URLs here).
OK, you did this 100 times, didn't you? But now for the difference. Do it again within a few seconds. And again. And again. Especially on googlerankings.
Know what is happening? Position 3, position 100, position 3, position 100... and I cross-checked with several tools, including keyword analyzers and such; it's definitely like that. Google is switching IPs. There is no Google.com at the moment; there are 50. Or fewer, or more.
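The "position 3, position 100, position 3, position 100" pattern can be distinguished from ordinary rank noise with a simple heuristic: look for large jumps in both directions across consecutive observations. This is an illustrative sketch, not any tool's actual logic; the threshold of 20 places is an assumption you can tune.

```python
def is_flip_flopping(positions, threshold=20):
    """Heuristic: True if consecutive rank observations jump by more
    than `threshold` places in BOTH directions, which suggests the
    queries are being answered from two different indexes.

    positions: list of observed ranks for one keyword over time,
    e.g. [3, 100, 3, 100].
    """
    jumps = [b - a for a, b in zip(positions, positions[1:])]
    return any(j > threshold for j in jumps) and any(j < -threshold for j in jumps)
```

Small day-to-day wobble (say, 3 to 5 and back) stays below the threshold and is ignored, while the alternating 3/100 pattern described above trips it immediately.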
Let's talk about the datacenters and the funny results you are getting. You use mcdar or other tools to check them, right? You find "good" DCs which turn into "bad" DCs after a few hours, and vice versa. You believe there is a good DC that will make it to the top and "spread"? Or maybe a "bad" one. Forget that.
why?
OK, we're talking about billions of pages. Rolling forward and rolling back within hours is just impossible. We have 50 DCs; it's impossible to handle this amount of traffic, even for Google. Even with fiber-optic links, these DCs are all over the world. It's impossible.
I state here that this update is far from over; it never will be. And forget analyzing the DCs. They are switching IPs and content in the background. There is no such thing anymore as a "DC". There are just some IPs serving different results, just like google.com.
I personally will stop watching the DCs now...
If I directly access an IP such as http://216.239.39.104/ and do a particular search I get 41 million results. If I do it again, I get 41 million results again. However, on the third or fourth attempt on the same IP I suddenly get 60 million results instead, and continue to get 60 million results each time thereafter.
The previous notion of one IP being one datacentre is not true.
Our directory site has been almost impossible to find since March 2004.
Last Thursday our site reappeared in Google's new results, not exactly what we had a year before but at least we had the top 2 for our own unique name. This lasted until Sunday and then we vanished again.
The really strange thing is if you now search for our own name "unique company name" the Google directory category which lists our site is now at Number 4 and Number 5. Exactly the same page, same URL. What's going on?
Meanwhile, for example, google.com now shows "Super bowl" results without NFL official site in top 20.
[google.com...]
It seems illogical to me, and I hope that 216.239.39.104 will be the right DC.
I agree that there is no Google.com per se, but it does seem that only a handful of data centers feed the results... none of the 13s from the above post are being used as far as I can see.
Rollo,
if that is true, then maybe Google is keeping them around for reference purposes, to contrast the differences between their newer datacenters and the 13s.
Someone should be working on a Google based sci-fi about now, with an episode about the 13th datacenter.
Including the forward slash at the end? I'm seeing a few duplicates where the first is like /Computers/ and the second is /Computers
Also seeing now the effect of Yahoo bungling up the linking of their Directory recently. Mixes of the Capital letter categories with lower case ones. Gazing into crystal ball... lower PR, and dupe content issues coming up.
That's exactly the opposite of what I've been seeing for about a week now...
Even if it's maybe not possible to be sure of anything in this update, what you're saying is exactly what I'm quite sure isn't happening.
The current SERPs, including the ones on 216.239.39.104, are bad. I do a search in quotes for a 10-word text from my site, and my site ranks in 20th position – the last position.
Since December I've been rewriting all the text on my site because it was all copied by others. After 2 weeks I had to rewrite it again. With the current SERPs, what's the point of writing quality content? Someone will copy it and Google will give them the credit.
The text is mine, and some lazy spammers are ranking better than me for my own text. How can the results be called great?