That is to say, should one be checking to see if the sites are out of the sandbox regularly or only when they know there is a major Google update? :)
Thanks
Mc
The bottom line is that Google has dictated that, from now on, sites will not rank well unless they behave the way a normal site behaves and are well regarded by the Internet population.
Interesting theory.
But ...
... wait a minute!
How come "normal" sites that have been introduced since February are also missing?
Bottom line, for the large majority of new sites it is going to be a long hard haul to get from the bottom of the SERPS to the top, not like the old days when you could get ranked in a week.
Have a look at [google.co.uk...]
Here's an excerpt ...
Google uses PageRank™ to examine the entire link structure of the web and determine which pages are most important. It then conducts hypertext-matching analysis to determine which pages are relevant to the specific search being conducted. By combining overall importance and query-specific relevance, Google is able to put the most relevant and reliable results first.
Oh yeah? So is Google saying that virtually no sites that have been introduced during the last nine months provide relevant or reliable information?
There is no mention of new sites or new pages being treated differently from those that are established. Isn't Google's mission to deliver SERPs that are all based on their algo and all sites being treated equally?
If this situation is deliberate then, if not actually lying, they are being very economical with the truth.
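For anyone who hasn't looked at it, the textbook PageRank calculation that excerpt describes is just a power iteration over the link graph. Here's a minimal sketch of the published formula (not whatever Google actually runs) - and notice that nothing in it treats new pages differently:

```python
# Minimal textbook PageRank sketch (power iteration).
# This is the published formula, NOT Google's production system.
# graph maps each page to the list of pages it links to.

def pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}                # uniform starting score
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            if outlinks:                              # share this page's rank among its outlinks
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:                                     # dangling page: spread its rank everywhere
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# A new page ("d") is scored by exactly the same formula as an old one;
# there is no age term anywhere in it.
print(pagerank({"a": ["b", "c", "d"], "b": ["c"], "c": ["a"], "d": ["a"]}))
```

Whatever is holding new sites back has to be a layer on top of that formula, because the formula itself has no notion of age.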
I think with projects like that, the hard and really time-consuming part is expanding fields in all of the places an index may be used: all the reports, temporary files, files that get transferred to external companies, screen layouts where the field is used, etc. The more business partners you have who have to change all of their systems, screen layouts and reports to accept a new field size, the more complex the project becomes.
Pimpernel, how do you explain what I said earlier, that even new pages that don't fall within the keyword categories of an existing (well-listed) site are sandboxed?
You will be extremely hard-pressed to find somebody who will agree with you that new pages on old sites are sandboxed at all. I have launched numerous pages on an old site that were ranking very well within days. Those pages had hundreds of completely different "keyword categories".
It's impossible to generalise, full stop - what you've experienced is one thing; what another experiences is totally different.
I've got sandboxing of new pages on old sites.
It's not just specific keywords that Google seems to be using to determine what is sandboxed and what isn't - no one has figured out what they are using yet, which is why this type of thread appears every few weeks and why these threads become so long.
Am I correct in assuming that the sandbox theory is either "in" or "out", according to one's preference? And that pages (or sites - I'm not sure which) - are either in this box-thingy or not?
Well, when I step back, I find that the whole concept of Google's PageRank, its SERPS, its toolbar and everything else is based on "better than" or "not as good as" - real quantifiable measures (whether you agree with the measure or not).
I personally find it rather sad that we're wasting time discussing the sandbox like it was some sort of portcullis - you're in or you're out of the Google Castle.
Do others really believe that the massive matrix calculations that define PR are then going to be adjusted by a coin-toss? In or out? Heads you win, tails you're on page 100?
I've a Master's degree in Mathematics, but I'm finding that I'm turning into a philosopher in this debate, trying to understand what I see rather than making up "in or out" theories that are quite, quite childish.
We would, I think, be better served by trying to make sense of the contrasting and contrary things we're seeing here, instead of heaping coals onto some vast fire.
The fact that we ARE seeing contrasting and contrary things is what we should be grasping - not pretending that we aren't seeing them.
----
OK - I've had my rant and I feel better now.
Sorry about that - us Brits tend to keep our emotions bottled up far longer than is good for us...
Why not skip over this and read the next post instead....
DerekH
Pages are what is ranked, not sites.
Old sites cannot be subject to the same part of the algo that he thinks new sites are, because the data wasn't kept prior to the algo change.
Links are tracked and aged.
One question on this though, a new page on an old site would still seem to be subject to the algo - no? Yet many people have experience that suggests this is not true.
Likewise the other commonly observed attribute that sites/pages in the so-called SB can still rank well for obscure 3+ word combinations. What does DerekH think about this with his background in higher mathematics?
And one other thought: the so-called anti-spam tactics employed by G are to fight a "problem" they largely invented. The basic premise of the G algo, we are led to believe, is that links are votes. Well, then webmasters go out and get links. And anchor text in links is important, but the fact is that truly natural anchor text very often is totally unrelated to the keyword and is thus devalued (we think) by G. So now, in response to webmasters' actions to get links, G (again, we think) takes action to combat aggressive linking - which causes this problem because, of course, new pages and new sites have new links. This has resulted in G becoming unarguably stale.
"Likewise the other commonly observed attribute that sites/pages in the so-called SB can still rank well for obscure 3+ word combinations. What does DerekH think about this with his background in higher mathematics?"
Well, I'm not sure that this is anything more than the way the reverse (inverted) index that looks things up is updated and made current.
After all, in addition to the algorithms that decide results, there is the data that is fed to those algorithms. With some pages on one of my sites indexed yesterday, and some not visited since last February, the spread of currency of the data is massive. Who can say what effect the age of the last visit to one of your competitors' pages has on the weight that page is ascribed?
For a long long time I've seen my sites rank really well for one keyword and not for another, and yet for the pair to beat sites that beat me on both searches.
I don't regard that as anything more than "something" in my site doing well for an obscure combination of keywords, any more than I regard the fact that a site doing well for an obscure search means anything more important than the fact that other sites don't.
My god what a sentence that was!
What I meant is that it's easy to do well in an obscure search. That's what obscure means.
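To put "obscure means easy" concretely, here's a toy inverted index - my own illustration, nothing to do with Google's actual data structures. A query is answered by intersecting the posting lists of its terms, and every extra term shrinks the candidate set you're competing in:

```python
# Toy inverted index: my own illustration, not Google's data structures.
# Each term maps to the set of document ids containing it (a "posting list").

docs = {
    1: "blue widget dealer new york",
    2: "blue widget reviews",
    3: "widget dealer directory",
    4: "long island lawn care specialist",
}

index = {}
for doc_id, text in docs.items():
    for term in text.split():
        index.setdefault(term, set()).add(doc_id)

def search(query):
    # Intersect the posting lists of all query terms.
    lists = [index.get(term, set()) for term in query.split()]
    return set.intersection(*lists) if lists else set()

print(search("widget"))              # {1, 2, 3} - crowded, hard to win
print(search("blue widget dealer"))  # {1}       - obscure combo, one candidate
```

With three or more words the candidate set often collapses to a handful of pages, so even a page that is held back on competitive terms can top an obscure combination almost by default.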
And what I didn't say was that I don't actually have a view one way or the other about the sandbox. Some of my pages have done well, some have been wiped out; but the last thing I think is that it's something quite so black and white.
Anyway - you shouldn't ask me to justify my rant <grin> - it was just something I needed to get off my chest...
DerekH
The sandbox does not exist.
Google updates roughly every three months.
I'm talking deep update.
If you have enough seo in time for the update, you move;
if not, you stay.
I have 100s of sites between my partner and me.
We have seen this happen many times.
If your site has not moved in 6 months, then you haven't done enough seo or you are doing it wrong.
If you have time to complain on this board, chances are you haven't done enough.
THERE IS NO SANDBOX!
Google introduced new algorithms in February, and these algorithms are tough, tough anti-spam algorithms. They are based on lots of factors (a speculative sketch follows this list), like:
How quickly the links were amassed
Quality of links
How quickly pages were increased
Etc., etc.
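Nobody outside Google knows how such factors would be combined, but purely as a speculative sketch of the "aged links" idea people keep describing (every weight and threshold below is invented for illustration):

```python
# Purely speculative sketch of a link-ageing filter. The weights and
# thresholds are invented for illustration; nobody outside Google knows
# how (or whether) anything like this is actually computed.

def link_weight(link_age_days, half_strength_days=270):
    """A link contributes little at first and approaches full weight
    as it ages (here: half strength at roughly 9 months)."""
    return link_age_days / (link_age_days + half_strength_days)

def link_profile_score(link_ages_days):
    """Sum of age-damped link weights: a burst of brand-new links
    scores far less than a smaller number of old ones."""
    return sum(link_weight(age) for age in link_ages_days)

# 300 links acquired last week vs. 30 links that are two years old:
print(link_profile_score([7] * 300))   # ~7.6  - big burst, heavily damped
print(link_profile_score([730] * 30))  # ~21.9 - fewer, older links win
```

A damping curve like that would reproduce both complaints in this thread: new sites start from nothing, and a sudden burst of inbound links buys almost nothing until it ages.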
When you search for a restaurant by its name and city and Google does not return the restaurant's website, even though it is indexed, just because the site's links aren't aged to perfection, it doesn't matter what you call it. It's a reason to leave Google.
People aren't leaving Google in droves because every other aspect of the search engine is far superior to the competition. But each day the 'tough anti-spam algorithms' continue, this aspect becomes more noticeable.
"Google updates roughly every three months. I'm talking deep update."
bak70, if you think we've had a deep update in the last nine months, you may be in for a shocker soon. At least I hope so.
If you add a new page to an old site (+2 years old), and get hundreds of inbound links (from external sites) to it (for a competitive search term), this new page would not rank well at all for many months because the incoming links have not aged yet, right?
This can only be proved or disproved by real-life results that people have had doing this recently. Anybody experience this problem with a new page on their old site?
Explain to me how the following can be explained by anything other than the SB theory (I really am open to suggestions):
If I move a page from our domain with the new (possibly penalized) name to an older subdomain, it shoots up to #3 position and stays there. The PR of the linking pages are the same (PR6 index page => PR5 subpage => page in question). All outgoing links on the page were kept the same.
The only difference I can see is the newer versus older domain.
Oh, and DerekH, I'll see your M.S. and raise you a Ph.D. ;)
I also started a brand new site, and it should be either first or second page after the next update.
(It's already first page on the MSN beta.)
I feel the reason why some people do better in MSN is that it is updating more frequently than Google right now.
I haven't seen a change in the way Google updates in about 2 years. Pretty much every three months.
Again, some people might not notice these updates because popular terms are dominated by people who just do seo better. This results in the SERPs looking the same.
I have all kinds of sites so I see these smaller changes when they happen.
As far as restaurants not showing up for their specific search goes:
It will take a few updates, but if the site has some incoming links it will show up.
There was a very popular diet pill that didn't show up in the top ten for its name for almost a year.
I took a look at it and noticed the site had little seo work on it.
So it took a while.
[webmasterworld.com...]
"I took a look at it and noticed the site had little seo work on it."
That's what makes me think the sandbox is not an effort to reduce spam. These are not all spammy sites being caught up. They are sites with little or no seo, sites with quality seo, and sites that are spammy. Their common denominator is that they are new. That, and the fact that they don't show up for a unique company name.
Insanity is doing the same thing over and over and expecting a different result.
"Think about it: if everyone could do it, it wouldn't be profitable."
Anyone can do it, just not with new domains.
"I would say, uh oh, I need to learn the newest technique, or I'm done making money with websites."
I'm not interested in the newest technique, just the development of quality content. I want this to be for the long run. I have a hard time believing Google wants to reward the newest technique either.
You took it the wrong way.
For me the old techniques still work fine as they have been since I started 3 years ago.
But if what I was doing stopped working, then I would look for what the new seo criteria for Google are.
You can't make a blanket statement that new domains don't rank well.
(Well, you can, but it's your opinion.)
New domains in highly competitive areas might not rank well. It just makes sense that it would take longer to rank in a highly competitive area.
But I bet I could get top-ten listings all day long for obscure keywords with a brand new domain.
like ny widget dealer
or long island lawn care specialist
lol
2. Google is making use of a thesaurus-like function to display results. This happens when it is unable to find good results for the string typed in. For example (not actual), Indigo Widgets SLC returns results for Blue Widgets Salt Lake City. This indicates a category-like classification.
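If that observation is right, the mechanics could be as simple as synonym-based query expansion triggered by sparse results - again, my own illustration with a made-up synonym table, not anything confirmed about Google:

```python
# Illustrative synonym-based query expansion. The synonym table and the
# fallback rule are made up for this example - not anything confirmed
# about how Google actually does it.

SYNONYMS = {
    "indigo": "blue",
    "slc": "salt lake city",
}

def expand_query(query, min_results, search_fn):
    """Try the literal query first; if results are too sparse,
    retry with each term mapped through the synonym table."""
    results = search_fn(query)
    if len(results) >= min_results:
        return results
    expanded = " ".join(SYNONYMS.get(t, t) for t in query.lower().split())
    return search_fn(expanded)

def fake_search(query):
    # Stand-in for a real engine: naive substring match over a tiny corpus.
    corpus = ["blue widgets salt lake city store",
              "blue widgets salt lake city reviews"]
    return [doc for doc in corpus if query in doc]

# "indigo widgets slc" finds nothing literally, so the retry runs
# as "blue widgets salt lake city" and matches both documents.
print(expand_query("indigo widgets slc", 1, fake_search))
```

The telling part is the trigger: expansion only kicks in when the literal query returns too little, which fits it showing up exactly on strings Google can't match well.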
Some conclusions can be drawn from this and other observations:
- There is some kind of "sandbox"
- The sandbox exists at one of the algo layers of Google
- The sandbox is most visible with new sites, but is also visible when adding new pages with "unrelated" keywords to an existing website
- How to overcome this sandbox is not really known. It seems that the "sandbox" is compulsory for most sites (possibly leading sites or newspaper sites are exempt)
What Google doesn't realise in all its wisdom is that this isn't going to deter SEO. Sure, it causes a strategic shift in the way we do SEO, but it doesn't stop it. I see action shifting to the acquisition of domains already categorised under the required keywords, and to the formation of alliances between websites that rank under desired keywords.
It does appear to be a negative and undesired step by Google, which appears more and more overzealous about an overrated threat: seo.
This is about the 20th thread about the sandbox with 100 or more posts. Clearly, webmasters are not happy about it, and clearly Google is not commenting on it. This can only hurt Google in the long run.
That pretty much sums it up.