Very odd though, what I see has actually affected entire sectors (mildly competitive to very competitive). One of them just increased in size from 11 million results returned to 20 million results returned.
I just thought it was too much to be a coincidence.
Could still be Allegra, of course.
There's no room for new sites so they have to sit it out. Allegra and the 15th December update kicked a pile of pages into the state where only the URL was listed: no title, no snippet, no cache. (Or did it?)
The increase in results would be because half of them are in the no-title state? (Would it know what they were about from backlinks?)
This would make room for some 'sandboxed' sites?
If Google had run out of "space" in their index (and I don't actually believe that) it would be because they ran out of unique ID's for each page.
If a page is in there, title/description or no title/description, then it must have a unique ID.
Capacity in terms of space is dirt cheap. It's the unique ID per page that would cause a problem.
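For illustration only: if the index really did key each document to a fixed-width ID (the early Google architecture papers describe "docIDs", though the actual ID width has never been public), the ceiling is simple arithmetic. This is a back-of-the-envelope sketch of that reasoning, not a claim about how Google actually stores pages:

```python
# Back-of-the-envelope: how many distinct pages a fixed-width docID allows.
# The 32-bit and 64-bit widths here are assumptions for illustration;
# the real ID width inside Google's index is not public.
def id_capacity(bits: int) -> int:
    """Number of distinct IDs available at the given bit width."""
    return 2 ** bits

# A 32-bit ID caps the index at about 4.3 billion unique pages;
# widening the ID to 64 bits effectively removes the ceiling.
print(id_capacity(32))  # 4294967296
print(id_capacity(64))  # 18446744073709551616
```

So "running out of space" in this sense would be an ID-width problem, not a disk problem, which is consistent with the point above that raw storage is dirt cheap.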
I still believe there are two indexes running which get merged in certain circumstances. I have no conclusive evidence of that and it's largely based on gut instinct. It appears to me, assuming that hypothesis to be correct, that sections of the two indexes just got permanently merged.
I'm not seeing this in one sector; all my sandboxed domains came out yesterday. And so did a whole bunch of everyone else's at the same time. Hence the title on this post, which in the circumstances is probably now a little misleading.
Why would they want to run 2 indexes?
There could be any number of reasons for it, but given the effects of the sandbox, my personal view is a secondary index to act as a "cooling-down" centre for new domains as a means of preventing short-term spam.
The effect also sorts the wheat from the chaff. I only carried on plugging away at my sites as I'm passionate about them and can get good traffic without Google through viral and real-world marketing, inbound links and other SEs.
If I was less dedicated, or my sites really weren't that good or important, I would have given up long ago, and well before the site made it out into Google world.
While it could be seen in that way (quality control), the downside is of course a loss of "freshness" in the index, which, certainly 18 months to 2 years ago, appeared to be Google's primary consideration.
Many companies change policy after an IPO, however.
Could you possibly comment on whether it is site wide for you?
I am seeing older pages ranking but more recent additions still sandboxed, it's looking to me like pages come out of the sandbox as opposed to sites.
Rod (smiling for once)
I would really like to believe that the sandbox phenomenon is not the result of no.1 or 2. and would much prefer it to be the result of a 'problem'.
The ending of the 'sandbox', at least in part, will at least bring some more 'evidence' of its existence, will it not? Because if the sandbox were never to end, you could argue that a site in that state had 'something' 'wrong' with it. At least until people organised themselves into publishing a list of the affected.
The site ranked #6 out of 800,000 yesterday even though I had misspelled a keyword in the title.
However today the site is buried and not in the top 500.
It was just a stupid little web site about a very popular TV show. Not a money term.
Site wide for me across all my domains.
Reading this I just had a bad feeling. Google only recently became a registrar and they announced that they had no plans to sell domain names but would rather use it to improve search results.
What if, since Allegra, they use the contact person information for each domain to see which domains are owned by one person/entity, and apply the sandbox filter/penalty on a publisher level rather than a domain level? Looking at spam prevention it might be a very good solution. New sites from publishers who already have some authority sites on the net won't be buried anymore, and sites owned by known spammers are sandboxed until it is clear that they contain quality content.
This would shake up the webmaster world enormously, as many people now have some sites that are sandboxed and others that are not. If publisher data becomes the trigger for sandboxing, you never know when all your sites are going to be buried, or coming out of the sandbox at once, as happened to trillianjedi.
If this is true, we will probably see a new SEO technique: "contact info exchange" programs to replace the current "link exchange" programs. People selling their good name to get other sites out of the sandbox.
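The mechanics of that hypothesis are easy to picture: group domains by their registrant contact, then apply any sandbox decision to the whole group at once. A minimal sketch, assuming (purely hypothetically) that registrant data were available per domain — every domain name, email and field name below is invented for illustration:

```python
from collections import defaultdict

# Hypothetical registrant records. In reality this would come from the
# WHOIS data a registrar can see; all values here are made up.
domains = [
    {"domain": "example-widgets.com", "registrant": "alice@example.com"},
    {"domain": "widget-reviews.net",  "registrant": "alice@example.com"},
    {"domain": "unrelated-blog.org",  "registrant": "bob@example.net"},
]

# Group domains by registrant: a sandbox applied at this level would
# hit or release every domain in a group simultaneously.
by_publisher = defaultdict(list)
for record in domains:
    by_publisher[record["registrant"]].append(record["domain"])

for publisher, sites in by_publisher.items():
    print(publisher, "->", sites)
```

Which would also explain why all of one person's domains might come out of the sandbox on the same day.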
There is no evidence of the sandbox in any of the sectors where I have started sites in the last year. But I don't play in any areas that need "cleaning up".
Maybe they have decided that certain sectors have gotten cleaner, and have taken them off the list for now.
How do you know your site or pages are out of the sandbox? Do you see a big increase in visitors, or do you have good SERPs?
One usually closely follows the other, so both ;-)
It was my log files that told me - I received 4,000-odd referrals from G. on Sunday, where I'd normally get under 100.
A quick search for some key terms then verified my immediate thoughts.
It was when I realised that all of my sites were simultaneously out that I considered it was a major update, and that seems at the moment to not be the case.
Interesting hypothesis, Lammert. That could throw the cat among the pigeons.