To admit there was a capacity problem after all this time might be taken by some as an admission of culpable negligence: a failure to advise potential investors of serious technical issues during the IPO period. I don't think there will be any culpable negligence issues, because Google will not fail. However, if it did fail, I think the fact that they withheld such information would make them subject to suit, perhaps even criminally, if some of the people who profited on the IPO were the ones who concealed the capacity problem. They would only be guilty of negligent deception IF THEY FAILED.
It's kind of like you wake up in the middle of the night and there is someone in your bedroom, and they make a frightening sound, and it's dark, and you see something flash towards you, and you are so scared you shoot into the darkness, only to find later that it's a serial killer wanted in a nationwide manhunt and you are a freaking hero on talk shows everywhere; or, for a change of scene, it's the neighbor's senile grandfather and you are doing 10 to 20 in max lockdown with Bubba Joe, who likes to scratch his ass and sniff his fingers when he's not telling you how pretty your eyes are.
I don't have anything that would serve as proof of what I'm saying, but it pretty much stands to reason that if Google has been perfectly mum about the sandbox to this point, they are not going to roll out new, faster, expanded technology so quickly that it requires a public statement.
If the sandbox phenomenon is over, or in the process of ending, what would it likely look like? Would it be rolled out all at once? By topical area? By time spent in the sandbox? Alphabetically? By pages or by domains?
What will the results look like to us as they change? There must be tens of thousands of sandboxed sites that have been released since last winter. As they take their place in the higher SERPs, will there be a mad assault, or will it be more like a gradual infiltration? Should we expect to see gradual changes in every area over time, steady like an hourglass? Will we see a week of dramatic change, followed every couple of weeks by more dramatic change for a couple of months? Or will we just wake up one morning to find that Hurricane Google has rewritten the face of the internet, with major devastation in its wake and young hopeful sites seeing sunlight for the first time?
With MSN's new engine expected to go online perhaps as early as February, and Google's known fondness for upstaging MS, how much longer can they wait before they release the sandbox? The SERPs are apparently beginning to change. I've already heard from several people who claim their many-month-long-sandboxed site is out of the sandbox. Could this really be the beginning of the end?
Here is my interpretation: it is Google's engine to do with as they wish. In 2003, affiliates had Google by the tail and directed it any way they wanted, at least until Florida.
To suggest that Google is at fault for facing a tech hurdle and not disclosing it to investors is like saying NASA should have warned the public that space travel could be hazardous. Of course Google faces tech challenges every day; all tech-based websites do.
My guess is that your guess is exactly 100% correct. The only thing that might make it take longer than you project is difficulties implementing this on both the algo and the data center side.
I've been expecting something very much along the lines of what you are suggesting for quite a while; it has obviously been extremely difficult to implement the new matrix, or it would have been in place months ago.
Google's total and utter silence on this issue, aside from the odd spin they throw out to keep a semblance of denial about the problem and the upcoming solution, is, and has been, extremely eloquent: a silence that would be motivated by exactly the issues you lay out. Nice post, somebody's got their thinking cap on today...
Nice post blasphemer.
I would say Welcome to Webmasterworld [webmasterworld.com] but I get the sneaking suspicion you've read it before.
Another question. Given that Google's propensity for trying to upstage MS is no secret, will MS try and surprise Google with an earlier than planned launch?
>> Is The "Sandbox" Ending?
That depends a great deal on what, exactly, you personally interpret as being "the sandbox" (*), as well as whether or not this phenomenon is intentional, desirable, or at least harmless, as seen from Google's perspective.
Anyway, it will most probably end, sooner or later, if for no other reason than that most things end. Also, most things are replaced by, or transformed into, other things in the process.
If I were to follow the tech-problems argument for a moment, I wouldn't consider an architecture upgrade a problem outside the realm of hardware and software logistics (which would offer some serious challenges after all, as we're speaking large volumes here, and it's basically the bread and butter of said firm). It would, however, fall into the "desired" box, although perhaps "with some undesired side effects/bugs". Such a thing would be very gradual and accompanied by a lot of testing. We wouldn't really notice it for a long time, and perhaps not at all, I'd say. At the least, the engineers should work very hard on making it non-noticeable (though on this particular forum, oddities do rise to the surface quite often). FWIW, that switch might already have happened, I'd say.
OTOH, the slow accumulation of "a set of webmaster problems that seem related" (or perhaps just increased awareness) during the past year seems to support the argument that some kind of process has been going on. An architecture upgrade is not the only process I can imagine yielding such effects, though. So, to end this speculation from my side: even if we assume that we are in the middle of an architecture upgrade/switch, the interesting question is not whether "the sandbox" will end gradually or instantly.
Instead, you should ask whether this feature/phenomenon is an inherent part of the way the system is supposed to work, or not.
(a) If no, then this "thing" is a side effect, and my best bet on timing is "gradual", or perhaps in a few "jumps".
(b) If yes, then this "thing" is found in both architectures, and it will not "end" after the supposed switch.
For both scenarios, my best bet is that some pages/sites will "escape the sandbox" sooner than others, and that some will be "escaping gradually" every day for the foreseeable future.
(*) I'm not trying to troll here, only emphasizing that there are numerous different things attributed to the term "sandbox" in different threads, by different members. We've all got "a general idea" but the specifics, as well as the theories differ a lot.
November 10, as noted by neuron in msg #131 here [webmasterworld.com].
Yes, I realize the number of pages went from 4 to 8 billion; that was tough to miss. Still, from some of the messages it seemed like there might be at least some areas where people weren't seeing much difference. I was just wondering if that was indeed the case. It seems not.
The surprise from MS to G, I think, could be a desktop MSN search field in the new MS W1ndows products... that's what really scares G.
But I am sure G will answer back...
MS got into G's 'house'; G will get into MS's 'house' too...
I expect more G desktop products.
And the 'game', or 'war', is getting harder and harder....
To admit there was a capacity problem after all this time might be taken by some to be an admission of culpable negligence in their failure to advise potential investors regarding serious technical issues during their IPO period.
This makes it sound like Google is responsible for indexing the world. Capacity is always a problem and never a problem. Certainly there is no guilt on anyone's part; to suggest so is absurd.
I don't think the sandbox will ever end; it's part of their migration toward the ideas of Hilltop. It is the introduction of Hilltop ideas that makes it more difficult for new websites to rank quickly. PageRank, LocalScore, SEO: the rules have changed slightly.
I also once posted my theory about two indexes. The SERPs I see and the experience I have with a new website can all be explained by Hilltop. It may be two physical indexes or it may be virtual, but when Google does not find Hilltop results, it returns "old" results, and that is why some new sites rank on uncommon queries, not just queries that have fewer than 500,000 results as some suggest.
My sandboxed pages show up fine in Google if you search for something super specific that exists on no other page. They are in the index, they just don't get displayed in the SERPs.
If Google really wanted to keep the index smaller, they'd kick out the PR0 pages.
My sandboxed pages show up fine in Google if you search for something super specific that exists on no other page.
The capacity problem takes that into account. The PageRank calculation matrix can handle 4 billion URLs; that is the real index. Another index of 4 billion URLs is there for backfill. Your pages are in an index, just not the right one for ranking against pages that are in the right index. That is the simplified version of the theory.
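For what it's worth, the arithmetic behind the "4 billion" figure in this theory is simple: an unsigned 32-bit document ID can only distinguish about 4.29 billion URLs, so an 8-billion-page index would not fit in one such ID space. A minimal sketch of that argument (the constant and function names are mine, not anything from Google):

```python
# If document IDs are stored as unsigned 32-bit integers, the primary
# index tops out at 2^32 (about 4.29 billion) URLs -- close to the
# "4 billion" figure the theory above mentions.
MAX_32BIT_DOC_IDS = 2**32  # 4,294,967,296

def fits_in_primary_index(num_urls: int) -> bool:
    """Return True if every URL can be given a unique 32-bit document ID."""
    return num_urls <= MAX_32BIT_DOC_IDS

print(fits_in_primary_index(4_000_000_000))  # True: fits in the "real" index
print(fits_in_primary_index(8_000_000_000))  # False: overflow forces a second index
```

Under this reading, moving past the wall means either a second index for backfill (as the theory says) or a switch to wider (e.g. 64-bit) IDs.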
Does Hilltop explain how the same content on a new domain, retaining an identical set of backlinks, can have dramatically different results?
Yes it does, this has to do with content of the site itself and recognizing it as an authority site. New sites are having trouble being recognized as an "authority" for the topic on which they are hoping to rank. That is where we are seeing a delay; it is very likely that Hilltop, as deployed by Google, will not allow sites to immediately be recognized as an authority. Quite frankly, this makes a lot of sense 90% of the time.
When authority cannot be established for a query, the Hilltop-based algo cannot be used. This is why new sites rank for "oddball" queries where a corpus of information does not exist: Hilltop is bypassed in these situations. This gives the appearance of an index with two sets of results.
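The two-path behavior described above can be sketched roughly. This is a toy illustration only, with made-up data shapes ("topics" and "links" fields) and a hypothetical `rank_results` function; it is not Google's actual implementation, just the shape of the Hilltop-style fallback:

```python
def rank_results(query, index, expert_sites, min_experts=2):
    """Toy sketch of a Hilltop-style fallback.

    If enough 'expert' pages cover the query, rank candidates by how
    many experts link to them (authority). Otherwise fall back to the
    plain index ordering -- which is why 'oddball' queries, with no
    corpus of expert pages, appear to come from a different index.
    """
    # Find expert pages that cover this query's topic.
    experts = [site for site in expert_sites if query in site["topics"]]
    if len(experts) >= min_experts:
        # Hilltop path: score each candidate by expert endorsements.
        def authority_score(page):
            return sum(page["url"] in e["links"] for e in experts)
        return sorted(index, key=authority_score, reverse=True)
    # Fallback path: authority cannot be established, return "old" results.
    return index
```

A new site would sit low on the Hilltop path (no expert endorsements yet) while still surfacing unchanged on the fallback path, matching the behavior webmasters report.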
I don't buy into the theory that the Sandbox is a capacity problem.
In today's computing environment, there is no such problem as capacity. That is what I meant earlier. It is only a problem if you make it a problem. I've helped build very large databases, and they all have problems; everything is a compromise, such as speed versus complexity of queries. You can optimize a database for anything you like; you just cannot optimize for everything at the same time. Again, to say that Google is negligent is nonsense.
Granted, the Google link report is broken, but it's interesting to compare the other engines (which are pretty consistent) to the MSN beta, which produces 7-12 times more links than the others list.
The sandbox doesn't prevent new pages from being added to the index.
Others would say, "The sandbox doesn't prevent new pages from being added to an index."
New sites are having trouble being recognized as an "authority" for the topic on which they are hoping to rank
I agree with your logic, but I'm just not seeing that in the current SERPs.
Is it possible that Google is not recognizing any authority sites?
It might be that they are only listing "expert" sites.
I am therefore convinced that the Google sandbox is a deliberate ploy on their part to increase revenues. Yahoo bought Overture and their revenues skyrocketed. Investors were thrilled. They didn't care about Yahoo's "ink contaminated" results, just the top line.
Google had to respond and get their revenues up, and what better way than to force new players to buy AdWords? Don't for one minute think that the people at Google think strategically; that is reserved for privately owned companies. As a publicly traded company, the top line is as paramount as the bottom line. Their long-term thinking does not extend beyond the next shareholders' meeting.
In the short term, why should they free up sandboxed sites that are buying AdWords? That would be tantamount to shooting themselves in the foot.
As a competitor, if G had only Yahoo to contend with, it would be like a walk in the park. Yahoo is the search equivalent of K-Mart, which no competitor loses any sleep over.
However, MSN search is another kettle of fish. Firstly, search is hardly core business for them, so they have the luxury of focusing on "owning" the market in the medium to long term.
These are not the jokers they have at Yahoo. Their aim will be to become the Wal-Mart of search, and they will brook no competition. When Wal-Mart got heavily into toys, for instance, specialist toy outlets like Toys 'R' Us operating close to them were forced to close.
These are strategic people with deep pockets who do not need to focus on short term profits. They will roll out good, fresh results and aim to do what they do best, "dominate" the market.
MSN is a tsunami waiting to happen. If I were Google, I would be very nervous indeed. However, I'm sure they are not; they are resting on their laurels, hoping that MSN is nothing more than a bad dream.
In the short term, G in my view won't give two hoots about their stale results. That is, until searchers begin abandoning them in droves. Then and only then will they get into crisis mode, open the floodgates, and pray that it is not too late. In the meantime, they will be guided by short-term profits.
Don't expect the sandbox to end anytime soon.
Just my 0.02.
Here are the top ten WebmasterWorld responses to the sandbox theory:
10) The Sandbox? LOL ... doesn't exist!
9) The Sandbox? You mean the LAG?
8) The Sandbox? You mean Hilltop?
7) The Sandbox? You mean Florida?
6) The Sandbox? What a clever invention!
5) The Sandbox? Mwuahahahaha ... it exists but I'm out of it!
4) The Sandbox? Mwuahahahaha ... it exists but I never got into it!
3) The Sandbox? How can Google be so stupid as to do such thing in the first place?
2) The Sandbox? Wow, yeah, <whisper>Google must have technical problems.</whisper>
1) The Sandbox? Huh?
The number of indexes has never really supported the lack of capacity argument. Say that Google had all the capacity in the world - nothing would stop them from having two indexes, ten indexes, or even a hundred if that would serve their purpose better.
The more capacity they get, the more indexes they can possibly make. Sheer storage capacity would do it, regardless of 32 or 64 bit technology.
>> FWIW, that switch might already have happened, i'd say
(my post above, about 32-to-64)
I'm having second thoughts on this. Of course it might still have happened already, but I'm not really sure that Google would find such a shift worthwhile, at least for the time being. There are a couple of things to it:
Not that I think anyone can answer this set of questions just like that; these were just the few things that popped up instantly when I gave it a moment of thought, so no doubt the people at Google have given these matters some consideration. I'm not sure the benefits outweigh the negatives, but then again, I don't run systems of that size, and I haven't seen the numbers (if such numbers exist), so I might be entirely wrong.
- that's probably not the most important point, anyway, although these extra costs will occur every day.
I have four sandboxed sites. All are doing great in MSN Beta so I'm certain that my SEO strategy works just fine.
You're using an engine that is currently in beta to validate your SEO techniques. Talk about wishful thinking.
Their long-term thinking does not extend beyond the next shareholders' meeting.
Hmmm...ever hear of Gmail, Google labs, blogger....
They will roll out good, fresh results and aim to do what they do best, "dominate" the market.
There are plenty of examples of companies that are still around despite MS attempting to dominate the market. Ever hear of Intuit or Adobe?
It's always amusing to see someone's interpretation of things which are (in reality) only theories. MSN may come on strong if and when MS is ever able to release Longhorn with embedded search. Hell, if you want to talk about companies resting on their backsides, take a look at the dismal state of Internet Explorer. Losing 15-20 million users to Firefox may be a drop in the overall bucket, but there's no denying it's made a dent in the browser wars.
MS is not, nor has it ever been, an innovator. As long as Google (and others) can continue to innovate, MS will be at a loss, as it is now an old, slowly turning juggernaut that has never had a single original idea of its own.
If you're staking the future of your business on MS, you've got a long wait.