Okay, Okay, Okay... So I am not alone! :> WTH Google?! Where are these SERPs coming from? Are they experimental test SERPs, testing sandboxed sites for release, or what? And as both of you have said, these SERPs are coming from 220.127.116.11 and 18.104.22.168, yet they are not showing up this way on McDar's, nor are they showing up when I use either IP in the URL bar. Quite frankly, it's very ODD!
And guess what... again, these SERPs are GONE!
Thank God I do not have clients that I would have to report this stuff to.
Please welcome Google Virtual Datacenters :-)
Could VIRTUAL be the new Infrastructure?! My sites are virtually there and then virtually GONE! :>~
"Could VIRTUAL be the new Infrastructure?! My sites are virtually there and then virtually GONE! :>~"
I wouldn't be surprised by anything about that new infrastructure thing ;-)
Well, I did try today to ping and tracert [google.sk...] once I saw the new unique serps. But I felt I was being redirected to the wrong IP all the time, because when I ran my test keywords on the IP I got, the serps were not the same as the new unique serps I saw the first time.
Now.. a question to kind fellow members who understand servers/datacenters technically:
Is it possible to hide an IP or is it possible to redirect a ping/tracert query to another IP than the real one?
Bed time. Good night and God bless.
Read My Post #23 [webmasterworld.com...]
In a load balanced server environment, it is very easy to add and remove servers from "pools", for testing, upgrades, meaning at any time, they can take a server, or group of servers out of a pool, make changes, and then toss them back in the pool. They can also place a higher priority on some servers, so that other servers only start answering queries under heavier load conditions.
A datacenter IP might have many pools of servers, all hiding behind a single ip address, and you would never know, because the response to your query, would always be returned from a single datacenter IP.
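The pool mechanics described above can be sketched in a few lines. This is a toy model under stated assumptions: the `Pool` class, round-robin policy, and server names are all invented for illustration, not anything Google is known to run.

```python
class Pool:
    """Toy load-balancer pool: servers can be pulled out for testing
    or upgrades and tossed back in at any time, while queries keep
    being answered from behind the same front IP."""

    def __init__(self, servers):
        self.servers = list(servers)
        self._next = 0

    def remove(self, name):
        # Take a server out of the pool (e.g. for an upgrade).
        self.servers.remove(name)

    def add(self, name):
        # Toss a server back into the pool.
        self.servers.append(name)

    def answer(self):
        # Simple round-robin: each query goes to the next server in line.
        server = self.servers[self._next % len(self.servers)]
        self._next += 1
        return server
```

From the outside you only ever see the pool's front IP; which server actually answered each query is invisible, which is the point being made above.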
Only Google knows for sure how their "Infrastructure" is exactly setup.
You could say Google is like a box of chocolates,,,,,,,, but I will not say it.
Back to Watching
Edited to add: these are also known as server farms. If you need more processing power to handle a larger load, you just grow more servers and add more load balancers, creating virtually unlimited capacity.
It was too late for me to add,
Picture in your mind a single IP address (datacenter, AKA load balancer), which is load balanced to 5 more load balancers, each of which supports 5 more load balancers, each of which has 50 servers answering queries. That creates a 1,250-server datacenter supporting queries for that region. (If I did my math right; I never was good at math, but I was good at designing, building and supporting these kinds of environments for some very well known corporations.)
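The arithmetic in that picture does check out. Spelling it out with the tier sizes given above:

```python
# 1 front balancer fans out to 5 second-tier balancers,
# each of those fans out to 5 third-tier balancers,
# and each third-tier balancer fronts 50 query servers.
second_tier_balancers = 5
third_tier_per_balancer = 5
servers_per_third_tier = 50

total_servers = (second_tier_balancers
                 * third_tier_per_balancer
                 * servers_per_third_tier)
print(total_servers)  # 1250
```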
Back to Watching
After a break of several days, both 22.214.171.124 and 126.96.36.199 were back doing "weird stuff" as of sometime in the last 24 hours, and then stopped again in the last hour or two.
|Is it possible to hide an IP or is it possible to redirect a ping/tracert query to another IP than the real one? |
the "real one" = the real loadbalanced IP address of said datacenter I want to test:
no it isn't possible for google to send you to a different IP. Google has to play by the rules with regards to DNS like the rest of the internet remember. Once an icmp (icmp=pings, traceroutes, path unreachables, etc) enteres their network is anyones guess.
Keep in mind, once you retrieve an IP address from a DNS lookup, either to ping Google or to fetch a webpage, the moment your request enters their network they (or anyone, for that matter) can send it anywhere, to anything. Your only barometers (really) are packet transit times and the fact that Google wants to keep traffic as local to you as possible to keep their bandwidth bills down.
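To make that point concrete, here is a toy model (the edge IP is a TEST-NET placeholder and the backend pool names are invented): DNS hands every client the same answer, but what happens once the request crosses the edge is invisible from outside.

```python
import random

EDGE_IP = "192.0.2.1"  # hypothetical: the one address DNS hands out
BACKENDS = ["pool-a", "pool-b", "pool-c"]  # invisible from outside

def dns_lookup(hostname):
    # DNS has to play by the rules: every client gets the same IP back.
    return EDGE_IP

def route_past_edge(request):
    # Once inside the network, the request can go to any backend pool;
    # from outside, your only barometer is the round-trip time.
    return random.choice(BACKENDS)

ip = dns_lookup("www.google.sk")
backend = route_past_edge({"dst": ip, "q": "test keywords"})
```

This is why a ping or tracert to the resolved IP can't tell you which pool actually built the serps you saw.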
my birthday wish from google is a GWS header that tells some basic info about DC location and SERPs versioning or something. wouldn't that be swell?
Good morning Folks
Special thanks to our kind fellow members WW_Watcher and kilonox for taking the time to provide us with that much-needed technical info.
So if I understood things correctly, there might be several servers hiding behind a single IP address of a DC.
Keeping in mind that different DCs get different data at different times (as per Matt Cutts), there is the "risk" that we wouldn't be able to really trace specific serps.
And it seems; that was exactly what happened for some of us yesterday. Personally, I saw the most wonderful serps on [google.sk...] , but I wasn't able to trace the specific server of the DC which contained those serps.
And that in turn confirms that the folks at the Plex are able to produce good serps, but might be just testing and testing all possibilities now, to pick the best of them later.
And that lead me to say:
Google is still the best..better than all the rest :-)
Wish you all a great sunny day and a happy Google Datacenters Watching (including "Virtual" DCs Watching too).
Got Ya! Google Engineers :-)
So now I have screen captured the new unique serps I was talking about yesterday but which I also see right now this morning on [google.sk...] .
When mouse hovers over cache it shows IP [188.8.131.52...]
When I run tracert or ping, I get the same IP as above.
However, the serps on [184.108.40.206...] are very similar to, though not 100% the same as, the serps on [google.sk...]
What a wonderful way to start this great day with. Trust me... life is wonderful.
Long Live.. Google Virtual Datacenters Watching :-)
No idea if it's correct, but I'm getting 220.127.116.11 for www.google.sk . What's weird is that the cache IP shows as 18.104.22.168 which makes me think that I'd best re-check the IP for my default google. Re-checking my default says that it is currently 22.214.171.124 . Hmmm...
I still think the dataset to lookout for is on 126.96.36.199.
The real question is... when?
I can confirm that www.google.sk is showing the "weird" SERPs that 188.8.131.52 and 184.108.40.206 have been showing for the last few weeks.
How do I know?
Well, I have a query that returns 20 results on most DCs and more than 900 on 72.14.207.* and now also on google.sk too.
I also ran 10 other queries that I know the results for, and all 10 confirmed the "experimental" results. I could run another 40 or 50 queries, but I think 10 is more than enough confirmation.
Expanding on the answers to Reseller's question
>Is it possible to hide an IP or is it possible to redirect a ping/tracert query to another IP than the real one?
Google could also be looking at the referring site or other request variables to cloak results. When you are asking for results at a specific IP, they may just randomize results to mess with your head.
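That kind of request-variable cloaking could look something like the sketch below. To be clear, this is purely hypothetical logic: the header names (`Host`, `Referer`) are standard HTTP, but the selection rules are invented for illustration, not anything confirmed about Google.

```python
def choose_result_set(request):
    """Pick a result set from request variables rather than from the
    index alone -- a hypothetical sketch, not Google's actual logic."""
    host = request.get("Host", "")
    if host.replace(".", "").isdigit():
        # Request addressed to a raw IP instead of a hostname:
        # serve an experimental / test result set.
        return "experimental"
    if "webmasterworld" in request.get("Referer", ""):
        # Known datacenter watcher? Randomize to mess with their head.
        return "randomized"
    return "standard"
```

Under rules like these, two people typing the same query could get different serps purely because of how they addressed the server.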
|King of all Sales|
petehall - I have been watching that one and a couple of others just like it. In our sector, many more ecommerce sites appear in top positions.
I wonder if Google is playing with splitting ecommerce and research in a similar way that Yahoo Mindset does.
"Google could also be looking at the referring site or other request variables to cloak results. When you are asking for results at a specific IP, they may just randomize results to mess with your head."
In my case, Google is mostly showing me results I like but they don't exist on any of the DCs we know!
It's therefore either virtual DCs, or DCs hiding behind the same IP while showing different serps than the rest of the DCs under that IP.
Talking about the new infrastructure: what else are they going to serve up to us poor Google Datacenter Watchers? :-)
"A" datacentre consists of (probably) several thousand PCs. You get the usage of just one of them for a small fraction of a second to serve your result. No-one ever said that all those PCs have to contain the same data, and it is quite obvious that there are many times that they do not.
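As a toy picture of that point (the machine counts are invented): if even a tenth of the machines behind one IP hold a newer index, repeated identical queries will disagree some of the time.

```python
import random

# 900 machines still on the old index, 100 already on a newer one,
# all answering behind the same datacenter IP.
replicas = ["index-v1"] * 900 + ["index-v2"] * 100

def serve(query):
    # Your query borrows one machine for a fraction of a second;
    # which one you get is out of your hands.
    return random.choice(replicas)

# Run the same query many times and note which index versions answered.
versions = {serve("test keywords") for _ in range(5000)}
```

Over enough repeats of the same query, both index versions show up, even though "the datacenter" never changed its IP.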
Unreproducible Google's Serps
Good evening Folks
My academic background is in chemistry. And in chemistry it's rather important that the processes and products a chemist develops be reproducible. Otherwise they are just useless.
If we attempt to apply chemistry standards to Google's deployment of BigDaddy, the new infrastructure, and the everflux of the serps, any junior chemist will tell you that the BigDaddy process and Google's current serps are useless: an unreproducible product of zero value.
But we know that chemistry standards can't be applied to Google and its processes and products. Therefore we can't just say that current serps are useless because they are unreproducible.
Maybe you have a term which best describes the current Google infrastructure and its serps, and for that I thank you in advance.
Good night and God bless.
I have, but it isn't utterable in polite company.
All DC's looking very similar.
Not sure I'm keen on the end result either...
Shake up imminent? I mean, this surely can't be it?
As per your request to keep you updated, SERPS changed across the test DCs and the main .com this morning in my sector. Has become much more relevant, but still a little way to go I think before it's as good as it was.
And hasn't helped me personally one bit - lol :-)
I have noticed that if:
www.google.co.uk results show that [220.127.116.11...] (for instance) is the DC via the cache link...
Replacing www.google.co.uk with 18.104.22.168 in the search string produces entirely different results for our keywords.
In our case the www.google.co.uk are significantly worse than with the raw DC address.
This is the same with any DC google.co.uk may be using.
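One mundane explanation for the "why" below (an assumption on my part, not anything confirmed): when you put the raw DC address in the search string, the Host header the server sees is the IP rather than www.google.co.uk, and the front end may select a different pool or result set on that basis. The difference is easy to see in the raw request; nothing here is actually sent anywhere, and the IP is a TEST-NET placeholder.

```python
def build_request(url_host, path="/search?q=example"):
    """Build the raw HTTP/1.1 request a browser would send for
    http://<url_host><path>. Note the Host header tracks whatever
    was typed in the URL bar."""
    return ("GET %s HTTP/1.1\r\n"
            "Host: %s\r\n"
            "\r\n") % (path, url_host)

by_name = build_request("www.google.co.uk")  # Host: www.google.co.uk
by_ip = build_request("192.0.2.1")           # Host: 192.0.2.1
```

Two requests that land on the very same machine can still be told apart this way, so "same DC, different serps" need not be a contradiction.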
Why I wonder?
I'm seeing much faster crawling by Google, some caching of new pages, and general updates in their index across most of the datacenters. Perhaps the reduction in crawl/update speed in the last month might have been linked to the new sitemaps features introduced today and the new "cache" feature used by Google's bots.
At the moment it looks like there are 2 main sets of results in the datacenters, and for my keywords they are only a little different - [22.214.171.124...] is an example of one set and the other is [126.96.36.199....] I expect within 24 hours all the datacenters will be showing one or the other.
I really like the newest set of results, though I'm still seeing a few old pages and semi-irrelevant forum pages ranking relatively highly. Anyway, I think we're going to see some significant improvements over at Google in the next couple of weeks.
And I definitely agree that Google is using different load-balancing servers that are updated separately. While this is occasionally occurring on some datacenters, most datacenters seem to update their servers at roughly the same time.
Yes, there are two main sets of results. That has been the case for at least a year now (in fact I have lost track of exactly when that first happened).
There is also the "experiment" over at 72.14.207.* too (but not there all the time though).
I don't know; usually it seems like the two sets of results converge within about 24 hours after each round of updates.
>In my case, Google is mostly showing me results I like but they don't exist on any of the DCs we know!<
Same here, Reseller. It's been that way for quite a while for me, but it began to conform to the other DCs in the past 24 hours. Could start fluxing again. I rarely get results on default Google other than 188.8.131.52.
Matt Cutts Confirms Random Google's Serps... Google's Democracy!
Good morning Folks
2 cups of warm, delicious Danish-brand cappuccino! A great start to a wonderful day. And never forget... it's great to be alive.
I'm a big Matt Cutts fan and an ancient Google lover, no doubt about it. Not to forget that I'm a big GoogleGuy fan too, ya know :-)
Being a big fan of Matt and GG means also that I read their posts and spend some time "decoding" what they say.
Last year, just after update Allegra in February, some of us on WebmasterWorld started mentioning the possibility of Google deploying Random Serps or Rotating Algos. Then we forgot all about the two terms.
Lately, I read a few very solid statements from my hero Matt confirming the fact that at present we should expect Random Serps. Matt wrote something along these lines:
- Different datacenters get different data at different times.
- Different centers can rank things differently.
And that means: search results depend on which time of day and which datacenter you hit. I.e., no consistency in Google serps any more; you get serps at random!
Well... I guess we have to live with those random serps from now on.
The good news is; your site ranking might drop within some hours of the day or on some datacenters, but it might rank well during other hours of the day and on other datacenters!
That way, every site on the web would have its own 5 minutes of fame. Equal opportunity for all sites!
Long Live Google's Random Serps!
Long Live Google's Democracy!
Wish you all a great sunny day.
I would sure hope this is not the case for Google. If this is in fact their theory... bye-bye Google! The general public will not tolerate a search engine that can't make up its mind about its results. Imagine: you just did a search for that "THING" you've been looking for, and 2 days later you come back and do the search that found it. It's gone... Hmm... MSN... Yahoo... here I come! Voila! There's the "THING" I was looking for!
Google goes hand in hand with the word "Search" according to the general public. If they start noticing inconsistencies, they are not going to pound their heads against the wall until Google returns the result they are looking for. They will simply jump ship, and Google will lose its luster for the term "Search".
All Cutts said was that different data centers get data at different times. And that they can rate sites differently.
It seems that if different data centers have different data they should be expected to show different results.
But to spin this up into some idea that google is presenting random results is pure hyperbole. It is just flux and should be expected.
"All Cutts said was that different data centers get data at different times. And that they can rate sites differently."
And that's a very accurate definition of Random Serps!
"Google goes hand in hand with the word "Search" according to the general public. If they start noticing inconsistencies, they are not going to pound their heads against the wall until Google returns the result they are looking for. They will simply jump ship, and Google will lose its luster for the term "Search"."
Well... that has been the case since November-December 2005, i.e. Google's serps have been random for around 5-6 months. Have you heard any complaints from the general public? ;-)
Google's Random Serps don't need to be something bad. Many publishers would be happy to see their sites ranking part-time instead of not ranking at all.
I'm just trying to find something positive about Google's Random Serps and adapt to them, as you might have noticed ;-)