Forum Moderators: open

Message Too Old, No Replies

Sandboxed Sites - Back Together?

Do they come out together or one by one?


McMohan

10:09 am on Nov 20, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Most of the new sites that I work with are still in the sandbox. I was just curious to know: do all the sandboxed sites come out of the sandbox during one fine major update, or one by one over the rolling updates?

That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)

Thanks

Mc

BeeDeeDubbleU

3:31 pm on Dec 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



We are all to blame for the problems we have now; it is not Google's fault but our own.

Speak for yourself. All of my websites are legitimate and provide useful information and services. NO Directories or scraper sites.

What you are also forgetting is that Google claims that "Google's mission is to organize the world's information and make it universally accessible and useful."

"Universally accessible?" I don't think so.

createErrorMsg

3:42 pm on Dec 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



With the advent of every chancer now building sites, new sites probably are:

1) 75% content scraped affiliates with duplicate content
2) 5% doorways to existing sites
3) 10% rehashed content with little value
4) 9% bizarre ramblings
5) 1% fresh and original


Yes, new knowledge is rare and any new knowledge can be served by qualified sites already listed.

These two comments are sweepingly moronic. More new sites are being built because more people are realizing that they can contribute something to the web, be it a content site, a business brochure, a sales letter... The key thing to remember is that this is what we want. This is the whole raison d'être of the web: to disseminate free knowledge.

To claim that it is unproblematic for only old, established sites to contribute to that dissemination is to assume that only individuals who realized they could contribute PRIOR to May 2004 (or whenever) have anything useful to say.

Are you saying that because someone did not launch their web site four months earlier than they did that they have nothing to contribute to their niche? As if timing has anything to do with knowledge? Please.

Obviously, many (probably most) existing, indexed, SERP-populating pages are regularly updated. Spend any appreciable amount of time here on WebmasterWorld and you can see that most site-runners know the importance of new content for staying at the top. But just because www.establishedWidgetSite.com updates their widget information every day, doesn't mean that www.brandNewWidgetSite.com doesn't have anything new or valuable to offer.

And it certainly doesn't mean that search users don't deserve to have access to that information. Whatever the cause of the sandbox effect - antispam (doubtful), mistake (doubtful), secondary-tertiary index (probably) - the bottom line is that searchers suffer. You go online to find as much relevant information on your search as you can. You use a search engine assuming they are making every effort to deliver those relevant results to you. Only, behind the scenes, it turns out there's a whole 'nother internet out there that isn't showing up in your browser. Disappointing, to say the least.

cEM

MHes

4:36 pm on Dec 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Are you saying that because someone did not launch their web site four months earlier than they did that they have nothing to contribute to their niche? As if timing has anything to do with knowledge? Please. "

I take your point, but it is not for a spider to decide what is good or not; it is for other humans. Google made the mistake of ranking new sites too quickly and filled the SERPs with rubbish.

" As if timing has anything to do with knowledge?"
Exactly - New sites does not mean they are any good. It is illogical to rank them above other sites that have pedigree just because they are new. Quality would suffer if you only showed new sites without the human process of 'voting'. These votes, in the form of links, need to be evaluated very carefully and this takes time.

There are practicalities that need to be accounted for:

1) There are only ten spots on the first page. If you had a policy of only showing new sites, a site would last ten minutes before another hundred new sites came online. Logic dictates you show only the best as perceived by the quality of inbound links, since you cannot possibly show all.

2) Spiders are dumb. Therefore you have to wait for people to vote and make sure those votes are real.

"the bottom line is that searchers suffer."
And they will continue to suffer unless the human voting system is rigorously applied. Older sites that are already in will drop without votes; new sites won't get in without votes. That's the situation now. The effects take time to surface, but quality will result.

" it turns out there's a whole 'nother internet out there that isn't showing up in your browser."

They are there, just very deep. As a searcher, you want sites that others have found useful and voted for, and these will usually be older sites. At present there are sites ranking well based on corrupt votes, but they will drop as the new system takes over... it's called Hilltop.

eyezshine

8:25 pm on Dec 4, 2004 (gmt 0)

10+ Year Member



"but they will drop as the new system takes over.... its called hilltop."

I have started to notice the supplemental results ranking higher than some of the regular results. My "sandboxed" sites are beginning to get a trickle of traffic from Google.

This is a good sign.

BeeDeeDubbleU

1:25 am on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It is illogical to rank them above other sites that have pedigree just because they are new.

I have had a few beers (well it is Saturday night!) but here goes.

It is not illogical to rank them above other sites if they are better than the other sites. To claim otherwise is just plain stupid. The Internet is based on democracy (I think?). Search engines should put the best sites at the top of the rankings, new or old. If they cannot do that, then they have failed. They may be making their shareholders happy, they may be making money, but they have still failed.

(Did I do OK with a few pints of Abbot ale and Guinness in me?)

Powdork

2:12 am on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



They are there, just very deep. As a searcher, you want sites that others have found useful and voted for, these will usually be older sites.
Perhaps, if you are searching for widgets. But if you search for a company, restaurant, candidate, charity, etc. by its name, even the joest of Joe surfers will hope to actually find that company, restaurant, candidate, or charity somewhere on the first page of results. When other engines do present these entities on the front page of results, it is, and should be, an embarrassment for Google.

Vec_One

2:34 am on Dec 5, 2004 (gmt 0)

10+ Year Member



It has occurred to me that Google might want to turn off the tap long enough to manually remove some spam sites. That's the closest thing to a logical explanation I've come up with. But with millions of sites and billions of pages, they surely wouldn't try.

I once tried to significantly reduce the amount of email spam our company was receiving. I spent way too much time creating server-side message rules, etc. I eventually came to the conclusion that the web is much too big to police manually. I'm sure Google figured that out long before I did.

Conclusion: There is no apparent rational reason to discriminate against newer sites.

Personally I figure there is a dampening effect on new links and the knob is turned up too high.

BeeDeeDubbleU, you did very well. :)

Imaster

5:50 am on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



MHes, do you work for Google? I haven't seen someone defend with so much enthusiasm something that is obviously defective :)

eyezshine

7:02 am on Dec 5, 2004 (gmt 0)

10+ Year Member



I still think Google has a bug. I think they are working on it, but it's taking way too long. They just need to take some cash and rebuild their entire index so it can hold at least a trillion pages, and be done with this whole fake index.

I bet MSN comes out with a way bigger index than Google's wimpy, fake 8-billion-page index when it goes live to the public. The beta is just a test with a small number of pages. The way they've been spidering makes me think they have a much bigger capacity than Google.

brixton

9:52 am on Dec 5, 2004 (gmt 0)



"I still think google has a bug. I think they are working on it but it's taking way too long. They just need to take some cash and rebuild their entire index"
:-) :-) :-) HO HO HO
is good to read some fun msg's sometimes:)

MHes

11:28 am on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>MHes, do you work for Google?

No, Google works for me :)

EarWig

11:36 am on Dec 5, 2004 (gmt 0)

10+ Year Member



"Exactly - New sites does not mean they are any good. It is illogical to rank them above other sites that have pedigree just because they are new."

But a huge number of OLD sites that appear in the SERPs are NOT pedigree.
It is logical to remove those that aren't, but this will never be achieved by an algo, only by human intervention :)
EW

JuniorOptimizer

11:51 am on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ahh, the poor surfer. Every day we hear the cries of the average surfer: "Oh no, Google is stale. No sites with Whois information of March 2004 or newer are in the index. How stale."

Oh wait, that's not a web surfer's lament, but rather a webmaster's.

BeeDeeDubbleU

12:11 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Exactly - the complaints are coming from those who are best qualified to comment!

As I said earlier, Google claims that its mission is "to organize the world's information and make it universally accessible and useful." It is not doing that now, yet it has made no statement or admission about this, so it clearly feels it is above the law in this respect.

Morph

12:23 pm on Dec 5, 2004 (gmt 0)



My site appears to have been sandboxed for the last 12 months, but only for one specific keyword. I have a heavily optimised site, with a large selection of relevant backlinks, a great deal of keyword-specific anchor text, and general on-page SEO.

I only have two or three real competitors, and the rest of the results are just single pages that mention the term once; some are even Geocities sites that rank higher than me!

I rank well for one term but not at all for the other; I've just never been ranked in Google for the second term. Seeing as there are only two sites that come close to my SEO levels and anchor text, I should be on the first page, and Yahoo, MSN, and every other search engine agrees, apart from Google it would seem. My current search engine breakdown looks like this:

Term 1

Google: 1st
Yahoo: 2nd
MSN: 2nd

Term 2

Google: Not ranking
Yahoo: 6th
MSN: 10th

I started my site in Dec '03 and I am still not visibly ranking for the keyword (at least not in the first 1,000 results in Google). This is really strange, because I have the most anchor text (for the term), backlinks, and PR of all the sites targeting it. What makes it more frustrating is that some sites that don't even mention the word, have no authority rank, and have no PR are ranking higher than me!

At first I thought my site was a good example of the so-called 'sandbox effect', but I just can't believe that Google could sandbox my site for this term for as long as it has.

JuniorOptimizer

12:45 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The theory that Google is doing this to force people into AdWords just seems too heavy-handed.

The only theory that makes any sense to me is the "age of links" theory. At some point they built a trusted database of links, and if your links are newer, your site is trusted less.

And it isn't ALL new pages that fail to rank, either, so assuming this is all planned behavior by Google seems unlikely.
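The "age of links" theory described above is easy to sketch as a toy model. To be clear, everything below is speculation: the nine-month ramp, the linear shape, and the very existence of such a filter are guesses for illustration, not anything Google has confirmed.

```python
from datetime import date

def link_weight(first_seen: date, today: date, ramp_days: int = 270) -> float:
    """Hypothetical dampening: a link contributes nothing at first and
    ramps linearly up to full weight over `ramp_days` (roughly the 6-9
    months people report for the sandbox). Purely illustrative."""
    age = (today - first_seen).days
    return min(max(age / ramp_days, 0.0), 1.0)

def score(link_dates, today):
    """A page's toy 'link score': the sum of its dampened link weights."""
    return sum(link_weight(d, today) for d in link_dates)

today = date(2004, 12, 5)
old_links = [date(2003, 1, 1)] * 5    # five aged links: full weight each
new_links = [date(2004, 10, 1)] * 5   # five two-month-old links: dampened

print(score(old_links, today))  # 5.0
print(score(new_links, today))  # well under 5.0
```

Under this toy model, five year-old links carry full weight while five two-month-old links together count for barely more than a single aged link, which from the outside would look exactly like a "sandboxed" site.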

MHes

12:53 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"to organize the world's information and make it universally accessible and useful."

This is what Google is doing; you just happen to be at the bottom... someone has to be.

BeeDeeDubbleU

1:07 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Excuse me, who said that I was at the bottom? My own sites are all doing very well, thank you. There will be turkey on the table for my family this Xmas, and any work I have done for myself in the last six months has been done safe in the knowledge that Google would not be in the equation.

My problem is that I get annoyed on behalf of my clients, who incidentally have NOT paid me for optimisation; I have no obligation to them with regard to SEO or their sites' ranking. But when I build interesting websites for others, I believe they deserve a fair crack of the whip. Currently Google is not giving them this.

"GOOGLE, ORGANIZING THE WORLD'S INFORMATION. ALL THE WAY TO 2003!"

BillyS

2:44 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



But when I build interesting websites for others I believe that they deserve a fair crack of the whip. Currently Google is not giving them this.

This points out a "first mover" advantage these older sites now have. Why did your client wait so long?

I view this as a test of wills. I am not going to hide or give up just because my site is still not ranking well. I've looked at the competition and improved on their offerings. Google will eventually recognize this.

europeforvisitors

4:42 pm on Dec 5, 2004 (gmt 0)



I am not going to hide or give up just because my site is still not ranking well. I've looked at the competition and improved on their offerings. Google will eventually recognize this.

Good thinking. IMHO, there's a tendency here to focus on the short term instead of the long view. Even if Google does have a lead time of six months for new sites (or new commercial sites, as the case may be), that's fairly inconsequential in the overall scheme of things. And in any case, it's likely that the "sandbox," if it does exist, is a temporary phenomenon rather than a permanent fixture of Google Search.

createErrorMsg

4:58 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ahh, the poor surfer. Every day we hear the cries of the average surfer: "Oh no, Google is stale. No sites with Whois information of March 2004 or newer are in the index. How stale."

Oh wait, that's not a web surfer's lament, but rather a webmaster's.

Not to be melodramatic, but a person who doesn't know they have cancer doesn't lament about not getting cancer drugs, either. Doesn't mean they don't need or deserve them.

But that's over the top. How about a less extreme example (and perhaps closer to home)...

Say a web designer makes a website for a client. The web designer, being in the business, knows all about the importance of keyword placement, semantic markup, separation of structure and style, anchor text, etc.: all the things that are general best practices for designing a page. But they don't use them. Instead, they throw together a nested-table, spaghettified, un-optimized piece of tag-soup trash and sell it to the client.

Now, on one hand we could say that the client deserves their trash site, because they should have known enough to look into how the designer makes pages. But let's assume for a moment that they DID look into it, and that all the designer's literature, all their portfolio pieces, all their references, said they design within modern standards. It says all this because, until recently, the designer in question DID use all the best practices. They really did deliver on their promises. It's just that recently they've started making inefficient tag soup.

And, of course, the client doesn't know. They get their page put online. They can load it up in IE and look at it. They think they got the optimized, standards compliant page they were promised.

The question is: doesn't the client in this case deserve a well-made page? And hasn't a wrong been done to them, their expectations not met, even though they never knew to complain about it? They were told they would get a well-made page. They are paying for a well-made page. They should get a well-made page.

In the same sense, a wrong is being done to searchers, who patronize the Google search engine because they believe/are led to believe/want to believe that it will provide them with accurate and comprehensive results for their search. If you disagree that this is what Google claims to offer, okay. We can talk about that. But if we agree that this is at least Google's implied promise, certainly searchers are being let down, whether they know it or not, whether they ever complain about it or not.

cEM

MHes

5:23 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>doesn't the client in this case deserve a well-made page?

If he paid for that ....yes

Google is free. If you pay, you get the position you want.

europeforvisitors

5:45 pm on Dec 5, 2004 (gmt 0)



Google could just as easily argue that the sandbox is designed to improve the accuracy and value of search results to the user by temporarily filtering out sites that have the highest statistical probability of being fluff.

To echo what another member posted, if there is a sandbox, it exists only because SEOs and their clients made it necessary.

Personally, I think a sandbox is the wrong approach; I'd much rather see users have a choice between searches that are weighted toward information or commerce. Having one massive, undifferentiated index may have worked in the early days of Google, but it's simply too unwieldy to be practical now that the number of pages on the Web is greater than the earth's population.

BeeDeeDubbleU

5:51 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Even if Google does have a lead time of six months for new sites (or new commercial sites, as the case may be), that's fairly inconsequential in the overall scheme of things.

Where did you get six months? I would settle for that.

Powdork

6:03 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google could just as easily argue that the sandbox is designed to improve the accuracy and value of search results to the user by temporarily filtering out sites that have the highest statistical probability of being fluff.
I remember reading that study that showed sites developed after May have a higher statistical probability of being fluff. Oh wait, that's right, I didn't read it, because it doesn't exist. In my neck of the woods it's the old sites that suck; they've been on top so long that everything on their pages is now an ad or something copied from a public-domain government site.
Let's also keep in mind that Google hasn't said the sandbox is an effort to improve results. They have not commented on their brokenness at all. If you send an email regarding a site in the sandbox, the reply will not acknowledge that such a thing exists, nor will the word be mentioned.
The sandbox affects many searches that do not have an excessive number of 'correct' results. IMO, when a search returns 500,000 results, there are typically fewer than 20 that actually answer the surfer's query. Google usually finds all of these that have been optimized (I simply mean made readable by spiders) except when they are new. And personally, I think it's gone beyond that. Six months isn't new anymore.

steveb

6:14 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"if there is a sandbox, it exists only because SEOs and their clients made it necessary."

There is maybe a 2% chance of this. The sandbox exists because Google has had a massive failure in its handling of data. The idea that the sandbox has anything to do with SEO is laughable.

All last year Google trumpeted how fresh it was... fresh tags, constant updates, etc. This was a poor idea from the very start, as "fresh" never meant "good". However, to think that Google's response to this phenomenon is to not rank new sites while ranking most new pages is to believe that Google has the brain of a two-year-old. They are in the business of being a SEARCH ENGINE. It is their job to discern the new official site of a famous person from the reams of scraper sites that go up every day. That is what they do. To think that they are intentionally holding back official sites while letting 5% of the scraper bilge through makes no sense on any level.

SEO may be responsible for Google's bizarre, dramatic downgrading of authority in its algorithm, but other than that, SEO has little to do with where we are. Google's data collapsed around February. It was noted here at the time. They were in the process of making the best SERPs they have ever had, and then "poof", it was all gone... and we go back to all anchor text (almost) all the time again.

dvduval

6:34 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



And to add to what steveb is saying, Google has remained 100% silent about the sandbox effect and has not even acknowledged its existence, but we all know there is a filter in effect. This filter may not be purposeful (as in an algorithm shift).

We all know there were times when Google had delayed updates, and we all remember how everyone freaked out when an update took more than a month. Then GoogleGuy told us about quarterly updates. We also know that the Google index has reached 8 billion pages (though that includes both the main index and the supplemental index), and that Google is using a supplemental index (which seems to be a throwback to Inktomi, btw).

Is it possible for them to calculate PageRank through both the main index and the supplemental index?
Keep in mind that to calculate PageRank they must run several iterations through the entire index. I think we are getting what I will call PageRank Lite. They just can't keep up, and they are now working through problems (which they will likely solve) but cannot tell the public, because it would affect their stock price.
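For anyone wondering why those iterations are expensive, here is a toy power-iteration sketch of PageRank over a three-page graph. This is the textbook algorithm, not Google's actual implementation; the point is only that each iteration sweeps the entire link graph, and many iterations are needed to converge, so merging a supplemental index into the computation multiplies the cost of every single pass.

```python
def pagerank(links, damping=0.85, iterations=20):
    """links: {page: [pages it links to]}. Returns {page: score}.
    Textbook power iteration; purely illustrative."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):              # each pass sweeps the whole graph
        new_rank = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:               # distribute rank along out-links
                    new_rank[q] += share
            else:                            # dangling page: spread evenly
                for q in pages:
                    new_rank[q] += damping * rank[p] / n
        rank = new_rank
    return rank

# Tiny graph: a links to b and c, b links to c, c links back to a.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

Scores sum to 1.0, and "c" (linked from both "a" and "b") outranks "b". Scaled to billions of pages, every one of those sweeps touches every link in the index, which is why update cycles stretch out.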

europeforvisitors

6:47 pm on Dec 5, 2004 (gmt 0)



And to add to what steveb is saying, Google has remained 100% silent about the sandbox effect and has not even acknowledged its existence...

Makes sense. I'll bet they wish they'd never acknowledged the existence of PageRank. :-)

Spine

6:53 pm on Dec 5, 2004 (gmt 0)

10+ Year Member



I think there are more than enough dots to connect here indicating that there is a problem, as Steveb says, with the handling of data.

I've never seen so many oddities and anomalies in the results before what started showing up this fall.

createErrorMsg

8:31 pm on Dec 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>doesn't the client in this case deserve a well-made page?

If he paid for that ....yes
Google is free. If you pay, you get the position you want.

MHes, you miss my point. I was talking about surfers, not webmasters. Surfers cannot pay to get a better, more accurate, and comprehensive SERP.

cEM

This 472 message thread spans 16 pages: 472