Forum Moderators: Robert Charlton & goodroi
If your site is less than a year old you are likely sandboxed.
I can't believe most sites under a year's age are in some sort of penalty box. Google would be useless. So, I want to know:
1. Are all sites sandboxed, or do certain traits (like affiliate links, low content) trigger it?
2. How long does it last?
3. How variable is the duration?
4. How do you know your site is being sandboxed?
5. Does the effect taper off or is it a binary thing?
6. What gets you out of the sandbox? Is it merely time or do good links or whatever speed it up?
Thanks.
Google's job isn't to provide instant gratification to Webmasters and SEOs.
No, but their job is to organise the World's information. Wouldn't new information be considered very important?
JustGuessing is on the money: Google created this situation with AdSense. It's their fault. MSN and Yahoo, who people in here are always happy to condemn, are also suffering from the effects of Google's monster. Google are contributing more than most to the destruction of the Internet with AdSense, in their greed to earn as much as they can as quickly as they can.
They could stop this if they wanted to by allowing people to turn off directories the way they allow porn to be turned off, but clearly they don't want to: these sites earn them big bucks. They could also employ a small army of people to police the net and manually remove offenders, but they choose not to.
Think about it. They can afford to assign two or three people to each of the main spam categories and get them to start removing the bad sites. I think I myself could make a big impression in the travel category in one week. Guess what? They choose not to, because the spammy sites are generating millions for them.
However, it's unlikely that Google's SERPs would be spam-free if "made for AdSense" scraper sites and the like didn't exist. It's also unlikely that Google intends the sandbox to be its long-range solution to the flood of spam.
As for employing human "quality evaluators," Google is already doing that; the jobs have been advertised at Google.com. But Google prefers scalable algorithms to a manual whack-a-mole approach, so it's likely that those quality evaluators will be used mostly to help Google's engineers, not to staunch the flood of new scraper directories and boilerplate product pages by clicking their "delete domain" buttons all day.
Finally, it's true that some--not all--useful sites will wait longer to get indexed than they would if the sandbox (and the reasons for the sandbox) didn't exist. But let's be realistic: Most genuinely useful sites, whether they fall under the heading of "information" or "commercial," aren't created with instant Google referrals in mind.
I still think you're focusing too much on the effects rather than the cause. I think Hilltop, LocalRank, etc. were implemented to increase the accuracy of G's results and filter out spam sites. The result was that SOME (certainly not all) ecommerce and affiliate sites took it on the chin. It also bought G some time to tweak its algo to do a better job of eliminating spammy results. Obviously they haven't figured it out yet. Some would argue that they will NEVER be able to eliminate spam sites effectively. But if they get their algorithm to a place they feel comfortable with, I believe the sandbox will open up. There may always be a "lag time" during which new sites must prove themselves, but at least there would be a light at the end of the tunnel.
I don't believe the algo implementation was done to herd people into the AdWords camp - I think that's just the conclusion many people are drawing. It's an understandable conclusion, but that doesn't mean it's accurate. It also doesn't mean it's NOT accurate, but I'm going to give G the benefit of the doubt.
You see here is my feeling for all those people that like the conspiracy of it all.
So you penalise webmasters who have been doing nothing but make good sites, webmasters whose time, money and effort all goes into making their sites good for visitors and search engines.
Google punishes them and says 'use AdWords' even though they have no budget and no visitors. Mmmmm.
So that being said, if your site is ranking highly and you are 'up and running' you are probably in more of a position to pay up for 'adwords'.
If I were a greedy capitalist who wanted to boost my profits, I'd rank these 'sandboxed' sites highly (only too desperate to earn money, the webmasters put AdSense on their sites), the big sites pay for AdWords, and everyone is a winner.
Maybe that is just wishful thinking. One thing is for certain: it does not make much sense to me. I am in the category of 'made lots of changes, now my site is boxed'.
If I were to continue to build the site for the next 10 years, I would then have a case against Google (i.e. against their claim that they do not manipulate the results), because 10 years later I'd have enough information to make an argument against them.
So long term, it is highly unlikely they would penalize a great site for no other reason than 'they made some changes'.
I believe it is a fight against spam, and I believe it makes no odds whether I am going to rank or not. I am going to build a great website over time, and that's that. That being said, Google will put my site into the SERPs one day.
I think we should stop trying to nail the sandbox with opinions and results. I think we should create an action plan. We could work in virtual groups and build a test scenario - we've got a lot of good, knowledgeable people, and we've got a lot of sites that we could test theories with.
Some of us investigate Hilltop and produce scientific results within this virtual team.
Some of us investigate Sandbox.
Some of us analyse sites not affected.
Some of us do this and some of us do that.
Eventually we will have compelling information and a true representation of the facts. My site is available, and my free time is available.
I also used to work with national newspapers, but I would suggest it's not worth trying to educate them. If this is a crazy, far-fetched suggestion I have put to you all, then I have only this to say:
My site is not ranking on Google for the keywords I think it should. I have an idea: if I find another 11 people similar to me and we work like mad on each site for 1 month a year (that's 11 people working around the clock to build a killer site), then within 12 months, if it's not working, we merge those sites and build an authority on '12 Search Engine Consultants sell their knowledge in one easy to read e-book'. Download now for only £20 - click our buy now button.
Any takers please PM me lol
Google punishes them and says 'use AdWords' even though they have no budget and no visitors. Mmmmm.
When exactly did Google say that?
Getting back to suidas' post (#108) about the exclusion filter:
I tested this myself and it is interesting to note that as you add more exclusions to the search the number of results returned decreases. Not sure exactly what this would indicate...
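For anyone who wants to repeat that test, the probe queries themselves are easy to generate. This is a minimal sketch; the function name and the nonsense term are my own inventions, and you would still have to run each query in Google and record the reported result counts by hand, since there is no public API for this:

```python
# Sketch of the exclusion-filter probe described above. It only builds
# the query strings; comparing the result counts for each variant is a
# manual step. The nonsense term "asdf" is arbitrary - any word that
# never appears on real pages will do.

def build_probe_queries(base_query, nonsense="asdf", max_exclusions=5):
    """Return the base query plus variants that append 1..max_exclusions
    copies of a nonsense exclusion term (e.g. "-asdf")."""
    queries = [base_query]
    for n in range(1, max_exclusions + 1):
        exclusions = " ".join(f"-{nonsense}" for _ in range(n))
        queries.append(f"{base_query} {exclusions}")
    return queries

for q in build_probe_queries("breast cancer foundation of arizona", max_exclusions=3):
    print(q)
```

If the filter theory holds, the variants with nonsense exclusions should return a noticeably different (and reportedly cleaner) result set than the plain query, even though the exclusions can't possibly match anything.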
I've said it before and I believe it bears repeating:
Several months ago I analyzed 30 sites I perform SEO on. 7 of them ranked well in Google and 23 did not. After more than a month of research into the differences between the sites (they were all built around the same template) the ONLY significant difference was that the 7 sites that ranked well in Google had also been listed in the Yahoo directory for a year or more.
Now this is not to say that Yahoo itself is "the answer" - but this certainly aligns with Hilltop. If the Google algo believes a Yahoo directory category is an "authority" on the specific topic you searched for, AND a site has a link from that Yahoo directory category, they have a good chance of ranking well on that term.
There are also numerous sites that rank well in G without being listed in Yahoo - but I bet a buck those sites count links from other "authority" sites among their inventory.
Conclusion: figure out which sites Google deems to be the authorities for your search term, and get links from them.
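To make that conclusion concrete, here is a toy sketch of the Hilltop-style reasoning: score a site by counting its inbound links from hosts already deemed authorities on the topic. The authority set and link lists below are invented for illustration; the real Hilltop algorithm (expert documents, affiliation rules) is far more involved:

```python
# Toy illustration of Hilltop-style scoring: a site's chance of ranking
# for a topic improves with links from that topic's "authority" hosts.
# All hosts and link lists below are made up for the example.

def authority_score(inbound_link_hosts, topic_authorities):
    """Count how many inbound links originate from known authority hosts."""
    return sum(1 for host in inbound_link_hosts if host in topic_authorities)

# Hypothetical authority set for a travel-related query.
travel_authorities = {"dir.yahoo.com", "dmoz.org"}

site_a = ["dir.yahoo.com", "dmoz.org", "somescraper.example"]
site_b = ["somescraper.example", "anotherscraper.example"]

print(authority_score(site_a, travel_authorities))  # prints 2
print(authority_score(site_b, travel_authorities))  # prints 0
```

On this toy model, the observation above falls out naturally: a Yahoo directory listing is simply one easy-to-spot link from a likely authority host, which is why it correlated with ranking rather than causing it.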
When exactly did Google say that?
When you email them and ask why your content, which used to rank tops for lots of keywords whilst on a previous domain, now doesn't rank in the top 1,000 even though you followed the advice of a Google representative when implementing the 301s, and you follow the guidelines for webmasters that you know the email reply is invariably going to link to.
As for employing human "quality evaluators," Google is already doing that; the jobs have been advertised at Google.com. But Google prefers scalable algorithms to a manual whack-a-mole approach, so it's likely that those quality evaluators will be used mostly to help Google's engineers, not to staunch the flood of new scraper directories and boilerplate product pages by clicking their "delete domain" buttons all day.
Europe, who mentioned quality evaluators? I suggested spam nukers, not quality evaluators. Google continues to tell us that it prefers scalable algorithms, and they continue to fail to produce the goods. Algorithms are written by humans, and algorithms are sussed by humans.
Sooner or later the penny will drop. Algorithms can be used to cope with the run-of-the-mill stuff, but if quality results are required, manual intervention is a necessity, and that is an undeniable fact.
The solution is an algorithm backed up by manual intervention. When this occurs and the spammers see that they are in grave danger of being dropped this WILL stop. The Internet is the most corrupt and unethical business that has ever existed. Only strict policing will change this.
In my searcher I cannot find you anywhere, but on my homepage you are at the top - is something wrong?
Her 'searcher' is Google and when she logs on her 'homepage' is MSN!
Google may feel comfortable ignoring us, but can they ignore people like the above? Very unwise long term.
Conclusion: figure out which sites Google deems to be the authorities for your search term, and get links from them.
Yes, precisely my experience as stated on the first page of this thread:-)
Europe, who mentioned quality evaluators? I suggested spam nukers not quality evaluators.
Somehow I can't see an HR department using a job title like "spam nukers," but why not? :-)
Sooner or later the penny will drop. Algorithms can be used to cope with the run-of-the-mill stuff, but if quality results are required, manual intervention is a necessity, and that is an undeniable fact.
Well, we know that they already do some manual intervention. I'll agree that more would be better. It probably isn't realistic to expect human spam nukers to keep up with the flood of new pages from "button pushers," though (to use a term from the Webmaster Supporters Forum).
The solution is an algorithm backed up by manual intervention. When this occurs and the spammers see that they are in grave danger of being dropped this WILL stop.
How so? Won't they just hit the "create multiple sites" button and crank out another batch of disposable domains? Especially if there's no sandbox to prevent instant profits?
The Internet is the most corrupt and unethical business that has ever existed. Only strict policing will change this.
Maybe Google needs to change its motto from "Do no evil" to "Allow no evil." :-)
Are they all from similar big money keyword categories such as travel/hotel/airline, insurance, pharmacy, finance/credit/debt?
Are there any others like books, video, music? Does legal rank up there?
I have no idea; however, are all or most of the sandboxed sites from the categories above?
I await with interest the list!
How so? Won't they just hit the "create multiple sites" button and crank out another batch of disposable domains? Especially if there's no sandbox to prevent instant profits?
They will try, but human spam nukers would surely be able to track this. I don't want to work through all the issues here, but they don't have a different IP or a dedicated server for all their domains, do they? ;)
Incidentally, as someone who believes that the Internet has the potential to be one of the most beneficial inventions of all time, being a spam nuker looks to me like the type of work that could give me much job satisfaction :)
I can just hear the cries of, "Take that, you button-pushing B******!", as another one bites the dust.
The Internet is the most corrupt and unethical business that has ever existed. Only strict policing will change this.
Just so long as it complies with whose rules?
America's? China's? Europe's? Arabia's? Hey, no, let's do it the Russian or Nigerian way...
It takes long enough for a few people to decide about new TLD names; just imagine how many "jobsworths" this would create!
If you believe that creating another supervisor will solve the problems, just check out the corruption in the European Union... now there you will find THE most corrupt and unethical organisation EVER created!
They make the former USSR look like a kindergarten and who says so? Their own auditors, that's who!
Take that you button pushing B******!
I want that job!
Now just how would that job description be phrased? The mind boggles:-)
At least with DMOZ their sandbox is human-operated. The good thing about this is you know it's only about duration - sooner or later you know your site is either a) listed or b) dropped.
With Google it's a case of the unknown. I think Google engineers, and anyone working for Google, should at least take on board the serious side of this: it affects people's lives.
It is about common decency. They should at least say, 'your site has no problems that we can see, so please be patient', or the alternative, 'your website has fundamental problems; we suggest you go back and review your work'.
What is the cost of such a policy? ARE THERE FLAWS EVEN WITH OFFERING US A PICTURE OF SOME SORT?
Google puts a lot of weight on the hand-selection process of ODP, right? So hire a bunch of spam-busters, each with a big red button.
"What may be even stranger is that none of the other SEs have chosen to use it against Google."
That's because none of the other search engines are in any better condition at the moment: MSN is in beta, and Yahoo is only indexing a small percentage of many sites' pages.
Where do you guys get the faith that Google's money makes it easy for them to solve what may very well be a really big technical issue? I'm going to assume that not one of you has ever worked in networking or done real programming. If you had, you'd know that some issues are simply extremely difficult to solve. MS has huge resources; here's a short list of things they have failed at over the last few years, with an R&D budget of about 5 billion per year:
IE security
The new windows filesystem, in development since I think before NT 4, which is when it was supposed to come out.
There are many more.
Reading these threads, I start to ask myself how many posters here actually pay any attention to anything real going on. For example, all this sandbox talk keeps rotating around spam or AdWords, and completely ignores the empirical fact that Google's index was maxed out last year. This fact was self-evident on the Google search home page every time you visited it. But still, that simple fact apparently just doesn't register. Nor does the fact that Google's index jumped from that number to twice that number. Are some SEOs simply unable to see facts? What's up with some of you guys? I'm starting to wonder if the whole short-attention-span thing from growing up with TV and video games is really messing up people's heads.
Then there's the idea that a company trying to maximize its income is somehow engaged in some 'conspiracy'. This is beyond absurd; all companies do this. If Google had not done this, their initial IPO could well have come in at literally 10 BILLION dollars less. Are any of you really so clueless as to think that a company won't work to get 10 BILLION more dollars? This level of naivety is simply stunning to me. This isn't addressed to all the posters here, of course, just the ones that seem to need to believe that Google is some fairy-tale-like company that would never act like a real company. Here's a news flash: Google is a real company. Get used to it.
IMO both Google and Yahoo are currently working with maxed-out algos, which, thanks to the continued obfuscation and confusion present on forums like this, they've managed to keep under wraps. Yahoo's situation is completely transparent, but again escapes any major scrutiny. Why would a search engine drop 90% of an existing site's pages overnight? It's because their main index is full, and instead of the sandbox route that Google is taking to deal with the issue, they are keeping it fairly random - but actually more fair than the method Google is using. From what I can see, both search engines have exactly the same issue, which is why neither one is advertising their competitor's problems.
The idea of searching for highly useful, unique content on Google seems like a distant memory (or maybe it was a dream).
The excitement came when you could learn about the subject matter by typing something in the little box and selecting from a list of great sites. If you wanted to find a supplier you clicked on a little advert box that was positioned around the organic search engine results.
Now that sounds just about perfect, but now the SERPs are, well, mmmm - not exactly what I had in mind. You see, I can't call them bad, because that would mean I had something to compare my results to. If a search engine is filtering 10-20% of results or more in selected searches, then I am not really getting the complete picture.
Demographic... what was it? Oh yeah, the 'capigraphic' nature of the Internet seems to be not living up to expectations.
It's just too bad. Maybe if search engines worked the way I had once wished for, kids could learn about medicine and maybe grow up to become doctors. Selecting factual, current information is a good concept, so why change it? Well, let me just test something; let me search for 'wonderful places to visit in the world'.
Oh my God - Google's number one listing for 'wonderful places to visit in the world' turns up a title that says 'The wonderful wizard of oz'
Brought to you by what once was the best search engine in the world. Now if that ain't worth media coverage, I don't know what is!
Agreed: knowledge of networking and software coding would give you the right skillset to make informed assumptions about what the cause could be.
And agreed, deep and dirty technical problems are usually the result of a wrong bracket or a ; in the wrong place. But we are talking about the basic mechanisms for the collection and retrieval of information.
It's quite simple: a search with -blar -blar -blar returns a clean set of results, while a normal search shows OK results. This is not a network problem, a factor of interoperability, or any kind of software glitch.
If they are testing a new piece of code - well, it's not very new if it produces the same results with our sites in them (that being the only difference). The truth is we do not know the problem; humans are getting confused as to what the problem is and what we are talking about.
We are talking about results and pinpointing the sandbox, not about anything else ;) - BTW, what's a networking and applications coder doing trying to educate the masses? Surely you should be applying to Google for a job. Then you could use an alias and give us some insights ;)
If the problem was easily fixed, I think Google would have fixed it by now.
Maybe you could simply forward Powdork's post to your reporter friend. I'm sure he/she would be able to better understand the problem when bcfaz.com is nowhere in the top 1,000 Google results for "breast cancer foundation of arizona".
Powdork said: Even the breast cancer foundation of arizona will get some traffic from Google once people start searching for breast cancer foundation of arizona -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf -asdf
LOL, I did. They were paying, I think, $11 an hour for a job that is worth at least 30-50 an hour in the real world. That's for the 'privilege of working at Google'. I think I gave them some suggestions as to where they could put that job offer, without getting graphic about it.
The main point I'm trying to make is that putting essentially 15% of the entire Internet into a sandbox is nothing other than a full-on, 100% admission of failure. I don't believe that a group of people like the Google mind trust would do something this incredibly lame, stupid, and pathetic unless they were absolutely forced to do so by circumstances that had mushroomed out of control. In other words: a total hack job, damage control.
I'm not saying I'm a networking/programming guru, by the way. I'm just saying a lot of posters here seem to think that having some money means your problems are solved; they aren't. If you had meaningful experience in either field (it doesn't need to be guru level, just experience with systems like that), you'd know it's not always a question of money. Some things just don't work right, as anyone who's worked with Windows networking can tell you, as can anyone who's done real programming.
MS has never been able to solve its major problems, with 30+ billion in cash reserves and 5 billion in yearly R&D. It's not just the money: if you can't figure out how to solve the problem, you can't figure it out. My guess is that there is some kid in high school, or in his first year at Stanford/MIT, who has figured it out. And who maybe doesn't feel like working for Google, since he also knows that Google hasn't been able to solve this particular problem. And who likes money as much as Brin and Page seem to.
What that filter is and how it works is interesting, and some good points have been brought up - especially suidas' argument, for which I think I found a counter-example that may or may not disprove it (it seems to).
What is the cost of such a policy? ARE THERE FLAWS EVEN WITH OFFERING US A PICTURE OF SOME SORT?
Absolutely spot on!
I cannot accept that a company with Google's resources cannot fix a basic algorithm problem. How many algorithms does the Windows operating system have to control? That's what I call a problem - not a simple algorithm that only has to make decisions on, what, 100 factors?
Someone said that Google was now a company and that we had to accept that, and that is true. But, and it's a big BUT, so do they.
Come clean now, Google. What's wrong with your search engine? Why are you either refusing to feature up to 20% of the world's websites, or why is your search engine not working properly? If I were a shareholder, I would want to know the answers to this.