| 10:26 am on Feb 21, 2005 (gmt 0)|
Congratulations TJ but I am afraid that I am still stuck in the mire :(
| 10:38 am on Feb 21, 2005 (gmt 0)|
Thanks for confirming it's not the case then DBW.
Very odd though, what I see has actually affected entire sectors (mildly competitive to very competitive). One of them just increased in size from 11 million results returned to 20 million results returned.
I just thought it was too much to be a coincidence.
Could still be allegra of course.
| 11:29 am on Feb 21, 2005 (gmt 0)|
Let's pretend there is a capacity issue at Google where it can only index a fixed total number of pages.
There's no room for new sites, so they have to sit it out. Allegra and the 15th December update kicked a pile of pages into the state where only the URL was listed: no title, no snippet, no cache. (Or did it?)
The increase in results would be because half of them are in the no-title state? (Would it know what they were about from backlinks?)
This would make room for some 'sandboxed' sites?
| 11:44 am on Feb 21, 2005 (gmt 0)|
Hi Grail - I don't think so.
If Google had run out of "space" in their index (and I don't actually believe that), it would be because they ran out of unique IDs for each page.
If a page is in there, title/description or no title/description, then it must have a unique ID.
Capacity in terms of space is dirt cheap. It's the unique ID per page that would cause a problem.
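The capacity argument above is just arithmetic: if every indexed page needs a unique fixed-width ID, the ID width, not disk space, caps the index. A minimal sketch, assuming (purely for illustration, this is not a known Google fact) a 32-bit document ID:

```python
# Back-of-the-envelope sketch of the "unique ID per page" argument:
# a fixed-width docid caps the index size regardless of how cheap
# disk space gets. The 32-bit figure is an assumption for
# illustration, not anything Google has confirmed.

def max_pages(id_bits: int) -> int:
    """Number of distinct IDs a fixed-width integer can represent."""
    return 2 ** id_bits

print(max_pages(32))  # 4,294,967,296 - in the same ballpark as the
                      # index sizes Google advertised around 2004
print(max_pages(64))  # ~1.8e19 - effectively unlimited
```

Which is one reason the "64-bit datacentre" question later in this thread is a natural follow-up: widening the ID removes the ceiling entirely.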
I still believe there are two indexes running which get merged in certain circumstances. I have no conclusive evidence of that and it's largely based on gut instinct. It appears to me, assuming that hypothesis to be correct, that sections of the two indexes just got permanently merged.
I'm not seeing this in one sector; all my sandboxed domains came out yesterday. And so did a whole bunch of everyone else's at the same time. Hence the title on this post, which in the circumstances is probably now a little misleading.
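The two-index hypothesis above is, as the poster says, gut instinct rather than evidence, but the mechanism it describes is easy to sketch: a main index plus a secondary holding index whose entries only surface when a merge condition fires. All names, scores and the merge flag here are invented for illustration.

```python
# Speculative sketch of the "two merged indexes" hypothesis:
# sandboxed domains live in a holding index that is only merged
# into results under certain conditions. Entirely hypothetical.

main_index = {"old-site.com": 0.9, "established.org": 0.8}
holding_index = {"new-site.com": 0.85, "fresh.net": 0.7}

def search(merge_holding: bool) -> dict:
    """Return results by score; include the holding index only when merged."""
    results = dict(main_index)
    if merge_holding:
        results.update(holding_index)
    return dict(sorted(results.items(), key=lambda kv: -kv[1]))

print(list(search(merge_holding=False)))  # established sites only
print(list(search(merge_holding=True)))   # "sandboxed" sites appear too
```

A permanent merge of sections of the two indexes, as hypothesised above, would look from the outside exactly like a pile of sites leaving the sandbox at once.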
| 12:09 pm on Feb 21, 2005 (gmt 0)|
Yes, I think my idea was probably not right, having read your explanation.
Why would they want to run 2 indexes? Are they phasing in 64-bit datacentres?
| 12:24 pm on Feb 21, 2005 (gmt 0)|
|Why would they want to run 2 indexes? |
There could be any number of reasons for it, but given the effects of the sandbox, my personal view is a secondary index to act as a "cooling-down" centre for new domains as a means of preventing short-term spam.
The effect also sorts the wheat from the chaff. I only carried on plugging away at my sites because I'm passionate about them and can get good traffic without Google through viral and real-world marketing, inbound links and other SEs.
If I was less dedicated, or my sites really weren't that good or important, I would have given up long ago, and well before the site made it out into Google world.
While it could be seen in that way (quality control) the downside is of course a loss of "freshness" in the index, which, certainly 18 months - 2 years ago, appeared to be Google's primary consideration.
Many companies change policy after an IPO, however.
| 12:24 pm on Feb 21, 2005 (gmt 0)|
YES...our 60,000 page site just emerged from the sandbox yesterday...I tried to start a new thread on it, but I guess it was considered old news...so, yes, something big might have happened on Feb. 20.
| 12:26 pm on Feb 21, 2005 (gmt 0)|
Many thanks for the tip-off. Just done lots of searches, and indeed one of my sites is now showing dozens of internal page matches on the first page of SERPs (previously nowhere).
Could you possibly comment on whether it is site wide for you?
I am seeing older pages ranking but more recent additions still sandboxed; it looks to me like pages come out of the sandbox as opposed to sites.
Rod (smiling for once)
| 12:29 pm on Feb 21, 2005 (gmt 0)|
|Could you possibly comment on whether it is site wide for you? |
Site wide for me across all my domains.
| 1:13 pm on Feb 21, 2005 (gmt 0)|
1. Many companies change policy after an IPO.
2. "cooling-down" centre
3. 64-bit data-centre roll out
I would really like to believe that the sandbox phenomenon is not the result of no. 1 or no. 2, and would much prefer it to be the result of a 'problem'.
The ending of the 'sandbox', at least in part, will at least bring some more 'evidence' of its existence, will it not? Because if the sandbox were never to end, you could argue that a site in that state had 'something' 'wrong' with it. At least until people organised themselves into publishing a list of the affected.
| 3:04 pm on Feb 21, 2005 (gmt 0)|
I created a site on 2/2/2005 and submitted it to DMOZ, where it was accepted in 2 weeks. I also added a link from an existing site.
The site ranked #6 out of 800,000 yesterday, even though I had misspelled a keyword in the title.
However today the site is buried and not in the top 500.
It was just a stupid little web site about a very popular TV show. Not a money term.
| 3:30 pm on Feb 21, 2005 (gmt 0)|
|Site wide for me across all my domains. |
Reading this I just had a bad feeling. Google only recently became a registrar and they announced that they had no plans to sell domain names but would rather use it to improve search results.
What if, since Allegra, they use the contact person information for each domain to see which domains are owned by one person/entity, and apply the sandbox filter/penalty on a publisher level rather than a domain level? From a spam-prevention angle it might be a very good solution: new sites from publishers who already have some authority sites on the net wouldn't be buried anymore, and sites owned by known spammers would be sandboxed until it is certain that they contain quality content.
This would shake up the webmaster world enormously, as many people now have some sites that are sandboxed and others that are not. If publisher data becomes the trigger for sandboxing, you never know when all your sites are going to be buried, or coming out of the sandbox at once, as happened to trillianjedi.
If this is true, we will probably see a new SEO technique: "contact info exchange" programs to replace the current "link exchange" programs. People selling their good name to get other sites out of the sandbox.
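The publisher-level hypothesis above boils down to grouping domains by registrant contact and making one trust decision per group instead of per domain. A minimal sketch, with all records, contact addresses and the "sandboxed" set invented for illustration (nothing here reflects how Google actually works):

```python
# Hypothetical sketch of publisher-level sandboxing: group domains
# by their whois contact e-mail, then apply one status to the whole
# group. All data and field names are made up for this example.

from collections import defaultdict

whois_records = [
    {"domain": "example-one.com", "contact": "owner@example.com"},
    {"domain": "example-two.net", "contact": "owner@example.com"},
    {"domain": "other-site.org",  "contact": "someone@else.com"},
]

by_publisher = defaultdict(list)
for rec in whois_records:
    by_publisher[rec["contact"]].append(rec["domain"])

# One sandbox decision per publisher, not per domain:
sandboxed_publishers = {"someone@else.com"}
for contact, domains in by_publisher.items():
    status = "sandboxed" if contact in sandboxed_publishers else "trusted"
    print(contact, status, domains)
```

Under this model, all of a publisher's domains would enter or leave the sandbox together, which is exactly the all-at-once behaviour trillianjedi reported.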
| 8:01 pm on Feb 21, 2005 (gmt 0)|
>use the contact person information for each domain
They've had this for at least 5 years. The zone files are easy to get.
| 8:15 pm on Feb 21, 2005 (gmt 0)|
A newbie question: how do you know your site or pages are out of the sandbox? Do you see a big increase in visitors, or do you get good SERPs? Thanks!
| 8:38 pm on Feb 21, 2005 (gmt 0)|
If the sandbox is applied to certain sectors, could it just be that Google changed the list of sectors?
There is no evidence of the sandbox in any of the sectors where I have started sites in the last year. But I don't play in any areas that need "cleaning up".
Maybe they have decided that certain sectors have gotten cleaner, and have taken them off the list for now.
| 8:52 pm on Feb 21, 2005 (gmt 0)|
|How do you know your site or pages is out of sandbox? Do you get a maximum increase of visitors or you have good SERPS? |
One usually closely follows the other, so both ;-)
It was my log files that told me - I received 4,000 odd referrals from G. on Sunday, where I'd normally get under 100.
A quick search for some key terms then verified my immediate thoughts.
It was when I realised that all of my sites were simultaneously out that I considered it was a major update, and that seems at the moment to not be the case.
Interesting hypothesis, Lammert. That could throw the cat among the pigeons.