Forum Moderators: Robert Charlton & goodroi
That is, the site is ranking 200th, and Google thinks the 199 sites in front are better
as opposed to
it's ranked 200th, and the 199 sites in front are older but not actually better, simply more trusted by Google
However... I'd say some of the sure signs of getting out of the sandbox are:
* quantity of pages indexed
* frequency of crawling
* number of pages ranking for a large variety of keywords
Would love to see others' list, too.
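The crawl-frequency sign above is easy to check from your own access logs. Here's a minimal sketch that tallies Googlebot hits per day from an Apache-style combined log; the sample log lines and the user-agent match are assumptions for illustration, not a definitive log-analysis tool:

```python
import re
from collections import Counter

# Invented sample lines in Apache combined log format, for illustration only.
SAMPLE_LOG = """\
66.249.65.1 - - [02/Nov/2005:10:01:12 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.65.1 - - [02/Nov/2005:10:05:44 +0000] "GET /about.html HTTP/1.1" 200 3100 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.7 - - [02/Nov/2005:11:00:00 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/4.0"
66.249.65.2 - - [05/Nov/2005:09:30:01 +0000] "GET /news.html HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Matches the date portion of the bracketed timestamp, e.g. [02/Nov/2005:...
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(log_text):
    """Tally Googlebot requests by date -- a rough crawl-frequency signal."""
    hits = Counter()
    for line in log_text.splitlines():
        if "Googlebot" in line:
            m = DATE_RE.search(line)
            if m:
                hits[m.group(1)] += 1
    return hits

if __name__ == "__main__":
    for day, count in sorted(googlebot_hits_per_day(SAMPLE_LOG).items()):
        print(day, count)
```

In practice you'd read your real access log instead of the sample string; a jump in daily Googlebot hits and in the number of distinct URLs crawled is the kind of trend the list above is pointing at.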
But in this time -- with Google operating several scores of data centers, providing different search results by geo-location, by user agent detection, by logged-in status, by running short tests at various times of day, and who knows what else -- it is more essential than ever to watch actual traffic and to pull the search terms out of the referer. In the midst of today's SERP chaos, no "rank checker" that I know of is really up to the job with any real degree of precision.
The signs that you may be emerging from a filtered condition can first show up here -- in real traffic as recorded in your server logs.
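Pulling the search terms out of the referer is straightforward when the engine passes the query in the URL. A minimal sketch, assuming the old-style Google referer that carried the query in the "q" parameter (the example URLs are invented):

```python
from urllib.parse import urlparse, parse_qs

def search_terms(referer):
    """Return the search query from a Google referer URL, or None."""
    parsed = urlparse(referer)
    if "google" not in parsed.netloc:
        return None  # not a Google referer
    # parse_qs decodes "+"-encoded spaces; "q" holds the query terms
    return parse_qs(parsed.query).get("q", [None])[0]

print(search_terms("http://www.google.com/search?q=blue+widgets&hl=en"))
# -> blue widgets
```

Run over the Referer field of each log line, this gives you the actual terms real visitors searched for -- which is exactly the traffic signal no rank checker will show you with any precision.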
I've noticed that as trust is gained you start to rank for five-, four-, three-word searches, etc. -- in that order.
It takes time, and while you wait your site probably appears filtered for the lesser word phrases.
It also appears to me that Google uses a dictionary of competitive phrases, which forms part of this "trust" system.
Time, patience and confidence that your site is worthy of ranking highly are all required IMO.
Interested to hear anyone else's thoughts or experiences.
I have a new 65-page website (first launched in August) that became fully indexed in G within the first 3 months online.
My pages are frequently crawled by Googlebot -- almost every day -- and I see frequent cache updates in Google. In November my HP was indexed (and the cache updated) about six times in one month: on the 2nd, 5th, 14th, 15th, 22nd, and 28th.
Also, when I upload new pages, they get indexed in just 3-4 days.
I have several pages that rank in positions 1-10 in G's SERPs for a number of quality keywords -- for some keywords I hold 1st and 2nd position (yet I still have some work to do on other important quality keywords where my pages seem to rank low in G).
On the other hand... I have approx. 35 pages that have gone supplemental due to similar titles and META descriptions (I have already taken action on this issue).
It seems to me that Google let me out of the 'sandbox' at the end of October, and I do believe that Google trusts my site more and more every day.
But just like vite_rts, I'm also unsure how to tell when my pages have climbed to their natural maximum (and when I can begin to SEO my pages for real).
Maybe they have already reached their natural maximum -- but how can I be sure when they, in some cases, already rank well?
You can't separate 97 parameters of the 329-parameter G algo, and examine only the rest. Isn't that the whole point of it? :P
I agree that you're asking something interesting, but also agree that you're not phrasing the question well enough.
It may be better to ask:
How do you tell when your site's on-page factors alone ( like relevance ) WOULD indicate that it COULD rank better than the rest of the sites on the SERPs?
Answer is... once you appear on one- and two-word search result pages, you'll have at least a shot at determining the answer. Until then, your site probably doesn't have the proper off-site factors to compete. Be it PageRank or trust... it's all about... you know...
"Quality and on-topic links to your site." ;)
Or you could view your pages as an average user would, and if it looks "natural" ;) it probably has the right amount of everything already. Thinking back to what I've seen recently ( on the SERPs ), if the page is on-topic, has the words in its title, description, and at the beginning of the page, it's all set. The rest comes from whether people link to it or not -- especially with on-topic anchor text. I've seen some top 10 results that didn't have much more than the above, except that they have been referred to by many, many others as a useful resource.
...
If you're asking whether your page has gained enough off-site relevance, just not from trusted sites, that's a different question. The answer is the same. Unless you have a blue trustbar next to the PageRank bar, you won't know. Of course you could guess. And you know what, I think we get it right 90% of the time just by looking at the page of a site that has a link to ours. A guess at whether it's trusted or not :P
...
I think the whole point of trust was so that no one could answer your question. And no one could start building the next generation of silver bullet web pages with THE proper keyword density, THE proper amount of code, THE proper layout, number of links... etc. There's no such thing anymore. Trust isn't a "sandbox", it's not a single filter that you have to "pass" before getting to the old Google. It IS Google now.