
Google SEO News and Discussion Forum

Filters exist - the Sandbox doesn't. How to build Trust.
Understanding factors that restore and maintain results
Whitey
posted 4:30 am on Oct 6, 2006 (gmt 0)

A lot of discussions tend to focus on the generality of a "Sandbox", but it has long since been debunked as a useful term by Matt Cutts and many senior forum members. So I propose the Sandbox is dead :)

What does exist are filters.

What opposes those filters are good techniques and "trust" - one good member recently referred to it as "TrustRank".

An understanding of what these main filters are for, how Google applies them and the observed behaviour of Google in releasing them would be a good way for owners to better manage and refine their organic search techniques.

Maybe our good friends in the community could select a topic or several that they have some solid experience and authority in, and support it with a format that can be easily referenced. The most recent one has been largely contributed to by g1smd. Allow me to paraphrase [ and please correct me ] an example of how I think this would flow:

Duplicate Content Filter - incorrect linking

Applied: when internal links are incorrectly pointed at "/index.htm" or "/default.htm" when they should all point to "/"

Effect: Unlikely to be indexed, badly suppressed results, PR applied to wrong or duplicate pages.

Time to restore: 2-3 months from when the fix is applied

Evidence: WebmasterWorld webmaster reports
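
For anyone wanting to audit this on their own site, here is a minimal Python sketch of the kind of check being described: fetch a page and flag internal links that point at an "index.htm" / "default.htm" style URL instead of the bare "/". The site URL, the filename list, and the requests/BeautifulSoup approach are illustrative assumptions on my part, not anything from this thread or from Google.

```python
# Minimal sketch: flag internal links that point at "/index.htm" or "/default.htm"
# style URLs instead of the bare directory ("/"). The site URL and the list of
# "index" filenames are assumptions for illustration only.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SITE = "http://www.example.com/"          # hypothetical site
INDEX_NAMES = {"index.htm", "index.html", "default.htm", "default.html"}

def non_canonical_links(page_url):
    """Return internal hrefs on page_url that end in an index filename."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    flagged = []
    for a in soup.find_all("a", href=True):
        absolute = urljoin(page_url, a["href"])
        parsed = urlparse(absolute)
        if parsed.netloc != urlparse(SITE).netloc:
            continue                        # external link, ignore
        filename = parsed.path.rsplit("/", 1)[-1].lower()
        if filename in INDEX_NAMES:
            flagged.append(a["href"])       # should point at the trailing "/" instead
    return flagged

if __name__ == "__main__":
    for href in non_canonical_links(SITE):
        print("non-canonical internal link:", href)
```

Running this across a whole site (rather than one page) would just mean looping it over a crawl list; the point is only to show what "incorrectly applied internal links" look like in practice.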

Duplicate Content Filter - Meta Data

Applied: when meta descriptions and titles are too similar

Effect: results show supplemental and generally suppressed

Time to restore: a matter of days, over the next few crawls
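
A rough sketch of how you might spot this yourself before Google does: compare titles and meta descriptions pairwise with a simple similarity ratio. The sample pages and the 0.9 threshold are arbitrary assumptions for illustration, not a known Google cutoff.

```python
# Minimal sketch: flag pages whose <title> or meta description is nearly identical
# to another page's, using a simple similarity ratio. The 0.9 threshold and the
# page list are illustrative assumptions, not anything Google has published.
from difflib import SequenceMatcher
from itertools import combinations

pages = {  # hypothetical data: URL -> (title, meta description)
    "/widgets/red":  ("Red Widgets | Acme",  "Buy red widgets from Acme."),
    "/widgets/blue": ("Blue Widgets | Acme", "Buy red widgets from Acme."),
}

def too_similar(a, b, threshold=0.9):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

for (url_a, (title_a, desc_a)), (url_b, (title_b, desc_b)) in combinations(pages.items(), 2):
    if too_similar(title_a, title_b):
        print(f"near-duplicate titles: {url_a} / {url_b}")
    if too_similar(desc_a, desc_b):
        print(f"near-duplicate descriptions: {url_a} / {url_b}")
```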

How many other filters have you observed, what are their effects, what have you done to fix the problem, and what have you seen as the time to restore?

 

tedster
posted 5:30 am on Oct 6, 2006 (gmt 0)

I like your thinking here. There's nothing for a site to get away from, but there is something it needs to build up to...and then hold on to. That something is TRUST. And as in person-to-person trust, even where it is gained, if it is later betrayed, then trust is not easily recovered to former levels. It's metaphor time!

Another proposed filter: excessive use of a keyword in internal anchor text.
Effect: domain is depressed on searches for that keyword.
Time to restore: soon after the next few crawls (the first time). But if there is a second "offense", then only a more gradual, stepwise recovery.

(This is highly speculative on my part, and based on just a few cases. Could be more than a little "post hoc ergo propter hoc" error folded in!)
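
In the same speculative spirit, here is a tiny sketch of how one might at least measure whether a single keyword dominates internal anchor text. The anchor list and the 50% threshold are invented for illustration and say nothing about how Google actually scores this.

```python
# Highly speculative: a quick way to *measure* how often one keyword dominates
# internal anchor text, not a statement of how Google evaluates it. The anchor
# texts and the 0.5 threshold are made-up illustrative values.
internal_anchors = [  # hypothetical anchor texts gathered from a site crawl
    "blue widgets", "cheap blue widgets", "widgets guide", "blue widgets",
    "contact us", "blue widgets sale", "about",
]

keyword = "blue widgets"
share = sum(keyword in anchor.lower() for anchor in internal_anchors) / len(internal_anchors)
print(f"'{keyword}' appears in {share:.0%} of internal anchor texts")
if share > 0.5:   # arbitrary illustrative threshold
    print("consider varying the internal anchor text")
```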

martinibuster
posted 7:13 am on Oct 6, 2006 (gmt 0)

but there is something it needs to build up to...

I was thinking about exactly that earlier tonight. Something a Googler said or posted somewhere, along the lines of (and I'm paraphrasing from memory), "go for the longtail style less ambitious phrases. Don't try to hit a home run with a new site. Take the little steps, work around the edges with it until you're an authority."

Whitey
posted 7:30 am on Oct 6, 2006 (gmt 0)

Ouch - Tedster

excessive use of a keyword in internal anchor text

Does that include navigation items? - I'll sticky you an example

:) or :( ?

plasma
posted 8:28 am on Oct 6, 2006 (gmt 0)

excessive use of a keyword in internal anchor text.

All of my sites - usually with fewer than 100 pages - have a "sitemap on all pages" due to the CSS pulldown navigation. They all rank fine.

BeeDeeDubbleU
posted 9:22 am on Oct 6, 2006 (gmt 0)

I have a site that ranks at number one for an industry-recognised four-letter acronym. I use hundreds of internal links with this acronym and other KWs (ABCD info, ABCD links, ABCD articles, ABCD services, etc.) and it does not seem to have done me any harm.

[edited by: BeeDeeDubbleU at 9:47 am (utc) on Oct. 6, 2006]

photopassjapan
posted 10:31 am on Oct 6, 2006 (gmt 0)

How about the semi-legend of the
"Adding too many pages at once" filter?
I've read at least two threads about this lately...

Applied: Adding a number of pages between two crawls that exceeds the number already indexed for the domain... OR adding more than "n" ( about 999+ ) at once. May be triggered by additional parameters like... do those pages have affiliate links or ads on them? Are they relevant to the rest of the site? If "Yes" and "No" then this is probably it.

Effect: Entire site except the index page is held back from the SERPs for an indefinite period. Not deindexed, but a huge drop in the SERPs ( or they go to detenti...er... supplemental ). Probably to check if the pages were added only to be instantly monetized?

Time to restore: You can't force this, from how I see it. You'll need to stay at it and pay more attention. May come back within three months if the pages were legit. Getting rid of them will probably take three months also... at the least. Uploading a new sitemap without them, deleting and 404ing them all, and watching them go "historic" supplemental with your fingers crossed would probably help... I guess.

Evidence: Isn't this a synonym for the "sandbox effect"? Some posts here are definitely talking about it. I mean the sudden large update factor. Applied to new sites and old sites alike: if you update in such huge bulk, you can expect this to happen. A world-famous blogger also mentioned something like this, although that post is a blur to me ;P

Prevention: Choose to update gradually, not in bulk.

... is this right? :-)
Or am I making it up? O.o
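
Taking the guess above at face value, here is what that ratio test might look like as a quick sketch. The thresholds (more new pages than were already indexed, or roughly 999+ at once) come straight from the post's own speculation, not from anything Google has confirmed.

```python
# Rough sketch of the ratio being speculated about: compare how many new URLs
# appear between two crawls against how many were already indexed. The numbers
# and the 1.0 / 999 thresholds are the post's guess, not confirmed behaviour.
def bulk_update_flag(previously_indexed, newly_added, absolute_cap=999):
    if previously_indexed == 0:
        return newly_added > absolute_cap
    ratio = newly_added / previously_indexed
    return ratio > 1.0 or newly_added > absolute_cap

print(bulk_update_flag(previously_indexed=400, newly_added=1200))  # True: bigger than the existing site
print(bulk_update_flag(previously_indexed=5000, newly_added=300))  # False: gradual addition
```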

Whitey
posted 12:06 pm on Oct 6, 2006 (gmt 0)

BeeDeeDubbleU
I use hundreds of internal links with this acronym and other KWs

Approx how many max on any one page? and roughly the page size?

SEOPTI
posted 12:17 pm on Oct 6, 2006 (gmt 0)

Next possible filter: excessive use of keywords in folder names
Example: /great-cool-best-mega/thatsmine.html
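
As a small illustration of that pattern, here is a sketch that flags directory segments made of many hyphen-separated keywords. The three-token threshold is an arbitrary assumption, purely to show what "keyword-stuffed folders" look like.

```python
# Tiny illustration: count hyphen-separated tokens in each directory segment of
# a URL path. The threshold of 3 is an arbitrary assumption for illustration.
from urllib.parse import urlparse

def stuffed_folders(url, max_tokens=3):
    segments = [s for s in urlparse(url).path.split("/") if s][:-1]  # folders only
    return [s for s in segments if len(s.split("-")) > max_tokens]

print(stuffed_folders("http://example.com/great-cool-best-mega/thatsmine.html"))
# ['great-cool-best-mega']
```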

oldpro
posted 12:54 pm on Oct 6, 2006 (gmt 0)

What opposes those filters are good techniques and "trust" - one good member recently referred to it as "TrustRank".

Apparently so...

I have created a few websites over the past few months and all have been indexed with very good SERPs. I simply add a temporary link on my established 12 year old website and the new website is indexed within a few days.

Whatever this sandbox is ( if it even exists) it is obviously reserved for sites that trip other unknown filters as a result of dubious SEO tactics. I think the way to avoid it is just to not worry about SEO, compose content naturally and have one inbound link from a trusted site.

[edited by: tedster at 11:35 pm (utc) on Feb. 17, 2007]
[edit reason] fix quote box [/edit]

idolw
posted 1:02 pm on Oct 6, 2006 (gmt 0)

Duplicate Content Filter - Meta Data

Applied: when meta descriptions and titles are too similar

Effect: results show supplemental and generally suppressed

Time to restore: a matter of days, over the next few crawls

Yes, this one I see on one of my websites. The site ranks decently, though.

trillianjedi
posted 1:13 pm on Oct 6, 2006 (gmt 0)

as in person-to-person trust, even where it is gained, if it is later betrayed, then trust is not easily recovered to former levels.

It's my experience that a "trusted" site can get away with far more than an "untrusted" site, all else being about equal.

Which makes me believe that, once that trust has been obtained, one has to worry a lot less about canonical, duplicate content, keyword stuffing etc issues.

TJ

TheRealTerry
posted 1:29 pm on Oct 6, 2006 (gmt 0)

In my experience an older domain with a long history in the SERPs gets away with murder. A new site has to prove itself for a while before it gets any Google love, unless it takes off like a rocket with a bunch of relevant inbound links and other things that a general PR buzz can generate. I don't care what it's called - sandbox, litter box, or whatever - the bottom line seems to be that crusty old domains get love, and new ones get ignored until the popular kids start talking about them.

ulysee
posted 1:36 pm on Oct 6, 2006 (gmt 0)

Blackhats knew about "trust" long before anyone else.
Blackhats backordered expiring trusted domains and put them to work so they could redirect and dominate the SERPs.

Whitehats now have to kiss the hand of God (Google) and bow down and pray for good serp positions.

You could be a webmaster who knows nothing of seo and have a clean site and have it filtered out by Google.

You could be a player in one industry and obtain a bunch of link backs in no time for traffic and have your site filtered.

You could be a pearly white seo and go through hell for Google and you still can be filtered.

A "unnaturally" filtered web is not worth my time and effort but maybe for some of you it is.

BeeDeeDubbleU
posted 1:45 pm on Oct 6, 2006 (gmt 0)

Approx how many max on any one page? and roughly the page size?

On my site map page I have about 60 links that include the acronym and about 25 on my home page. Page sizes range from 11Kb to 56 Kb and there are about 100 pages on the site.

It's my experience that a "trusted" site can get away with far more than an "untrusted" site, all else being about equal.

and

In my experience an older domain with a long history in the SERPs gets away with murder.

I tend to agree with this. The site I quote has been around for 5 years now and I believe that it is seen as a trusted site. It has lots of links (relatively speaking) from other authority sites in the same niche.

I think that if I were to launch a new site now using the same SEO methods I may have problems.

rj87uk
posted 2:00 pm on Oct 6, 2006 (gmt 0)

It's my experience that a "trusted" site can get away with far more than an "untrusted" site, all else being about equal.

I will also chip in here and confirm this. A trusted website can get away with bad code, over-optimisation, and bad titles, among other things.

So really, one big thing to think about here is how to become trusted, slowly. We should no longer optimise new websites: build with only the user in mind, and then, once you have your brand name, tweak pages to rank better.

Following this thinking should produce better quality websites. Makes sense to me....

lexipixel
posted 3:21 pm on Oct 6, 2006 (gmt 0)

Re: "filters".... I'll name mine;

The Pagliacci Filter: this filter is applied when you spend more than you did on content development or site design to somebody who claims they will "Get you to #1 (on Google)" after the site is built, it never works and you feel like a sad clown.

The Issac Newton Filter: this filter is applied when it is detected that you are attempting to build an eliptical network of backlinks rather than a circular one.

The Circular Filter: (depricates the Galileo Filter), similar to the above, ('Newton filter'), but applied when it's detected that you were part of an old-school 'ring' and beleived you were the center of the ring.

The Rube Filter:, (also known as the Goldberg Filter), this filter is applied when you have more inbound links, paid directory listings, flash intro pages, pop-up windows, and just about everything else connected to a your site -- but your site is only one page, and has nothing to do with your domain name, title, or meta tags.

The Anti-Knuth Filter: this filter is applied when a site is found to take itself too seriously. Causes may include; no humor in content, eComm sites where every price ends in ".99" just because that's the way retailers price things, or site design built on other standard programming practices and traditional thinking methodologies.

The Closed Gates Filter: this filter is applied when your site is designed, served and supported entirely upon software originating from the city of Redmond, CA.

greenleaves
posted 3:55 pm on Oct 6, 2006 (gmt 0)

It's my experience that a "trusted" site can get away with far more than an "untrusted" site, all else being about equal.

I ditto that.

Also,

Getting too Many Links at Once

How: Getting a large number of links in a short period of time. I believe this would be aggravated if the links are not very relevant, are from low-quality sites, or are from sites that are interconnected.

Effect: Site loses rankings.

Evidence: WebmasterWorld Webmaster Reports
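
Here is a sketch of how such a spike might be detected in one's own backlink data: compare the latest period's new links against a trailing average. The window, multiplier, and sample counts are invented; whether Google applies anything like this is pure speculation.

```python
# Sketch of the "too many links at once" idea as a simple spike check: compare
# this period's new backlinks against a trailing average. The data, window and
# multiplier are invented for illustration; the real thresholds, if any, are unknown.
def link_spike(weekly_new_links, window=8, multiplier=5.0):
    """Return True if the latest week is far above the trailing average."""
    history, latest = weekly_new_links[:-1][-window:], weekly_new_links[-1]
    baseline = sum(history) / len(history) if history else 0
    return baseline > 0 and latest > multiplier * baseline

print(link_spike([12, 9, 15, 11, 10, 13, 8, 14, 400]))  # True: sudden burst
print(link_spike([12, 9, 15, 11, 10, 13, 8, 14, 20]))   # False: normal growth
```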

fishfinger
posted 4:09 pm on Oct 6, 2006 (gmt 0)

Whatever this sandbox is ( if it even exists) it is obviously reserved for sites that trip other unknown filters as a result of dubious SEO tactics.

Disagree, strongly. 'Sandbox' is about trust via links. It applies on a page by page basis. Seen and studied it across 70+ new sites over the past 2 years.

What people call the sandbox effect is just an extension of how Google normally puts pages into its primary and secondary indexes.

If you need links to get into the SERPS for a term (primary index), you won't get in that index until Google sees enough decent trusted links. An interesting factor that also comes into play here is the way that Google returns pages for three or four word terms if one of those words is highly competitive. If there are enough sites in the primary index (which relies heavily on links) then you won't feature in those results even if your title tag and on page SEO is 100% on that specific phrase. Drill down the SERPS to the point where you start seeing irrelevant sites and you won't see your page. It just doesn't meet the criteria to get in. Not enough trusted links.

Best example :

I launched a site for a client in the debt industry about a year ago. He has pages about different problems and the various solutions. One page was about bailiffs. After I had optimised the site that page ranked top three for 2 or 3 terms within weeks. He got so many calls about bailiffs he told me to back it off - people being pursued by bailiffs weren't his target market (no money!) and he didn't want that sort of business. That's why it was easy to rank - no-one else wanted those terms so it was secondary index and on page SEO was enough.

All his other pages feature(d) the word 'debt' somewhere in the title tag or in the page. Even if I searched for very long tail terms I never saw them - because all of those results were pulled from the primary index.

If you can get links from truly trusted authority sites then you can shorten or skip this 'probationary period'. I have an idea that the length of time 'probation' applies is related to the competition for those terms.

walkman
posted 4:11 pm on Oct 6, 2006 (gmt 0)

I think we're playing with words here. I always looked at the "sandbox" as lack of trust. Google thinks you're cheating with links, or have "spam" and you're toast. All your trust/link "juice" is removed so anyone can rank higher than you.

crobb305
posted 4:19 pm on Oct 6, 2006 (gmt 0)

I think we're playing with words here.

I agree. Arguing semantics. Comes down to "Building" trust or "lacking" trust, and the [variable] period of time where a new site is being monitored for either has been dubbed "sandbox". It's just a word. Some sites gain trust quickly, some sites take longer.

My feeling about link quality in building trust: Link chasing is not the way to go. I think Google could easily implement a link "duplication" filter just as they would a "content duplication" filter. It certainly seems unnatural to me if two similar, yet completely independent, sites both have 2,000 backlinks to the homepage, all of which are the exact same links. Of course, footer, header, and run-of-site links also raise red flags. But just don't obsess over a top-ranking site's backlinks and chase them. It may not get you where they are. Just a thought.
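
To make that "link duplication" idea concrete, here is a small sketch comparing two sites' backlink sources with Jaccard similarity. The domains and the 0.8 cutoff are hypothetical; this is only a thought experiment about what such a filter could measure, not a known Google signal.

```python
# Sketch of the "link duplication" idea: measure how much two sites' backlink
# source sets overlap using Jaccard similarity. Domains and threshold are made up.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

site_a_links = {"dir1.example", "blog2.example", "news3.example", "forum4.example"}
site_b_links = {"dir1.example", "blog2.example", "news3.example", "shop5.example"}

overlap = jaccard(site_a_links, site_b_links)
print(f"backlink overlap: {overlap:.0%}")
if overlap > 0.8:   # arbitrary illustrative cutoff
    print("suspiciously similar link profiles")
```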

lexipixel
posted 4:32 pm on Oct 6, 2006 (gmt 0)

One major factor I've found with the "trust issue" is "host trust" and "server trust".

I see sites on cheap hosting solutions that never rank even though everything else about the site is "trustworthy", and then I put up a site on a high-end ("trusted") hosting company's server and it gets indexed and ranks well within a week.

(BAD) factors may be: poor server directory structure (e.g. /xyz/aaa/h234/0/1/2/3/zzz/etc/ above the root of the site), slow pipe, shared IP address, server downtime, old unpatched server software, etc.
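
For anyone curious about their own host, here is a loose sketch checking a couple of the listed signals (response time, path depth, reported server software). The URL and thresholds are placeholders, and whether Google weighs any of this is unproven.

```python
# Loose sketch: check a couple of the host-quality signals listed above
# (response time, path depth below root, advertised server software). Which of
# these, if any, Google uses is speculation; URL and thresholds are placeholders.
import time
from urllib.parse import urlparse

import requests

def host_report(url, slow_seconds=2.0, max_depth=4):
    start = time.monotonic()
    resp = requests.get(url, timeout=15)
    elapsed = time.monotonic() - start
    depth = len([s for s in urlparse(url).path.split("/") if s])
    return {
        "status": resp.status_code,
        "slow": elapsed > slow_seconds,
        "deep_path": depth > max_depth,
        "server": resp.headers.get("Server", "unknown"),
    }

print(host_report("http://www.example.com/xyz/aaa/h234/0/1/2/3/zzz/page.html"))
```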

tedster
posted 5:08 pm on Oct 6, 2006 (gmt 0)

Getting too Many Links at Once

Let's talk about this by using some more common names -- automated blog spamming, forum spamming, link farming. I doubt that even a team of link monkeys could trip this filter with real, humanly acquired links. This is especially true if they were at all trained in evaluating the relevance and quality of linking sites (as any good link monkey would be, right?).

In contrast, there is a situation where a website gets major media attention (mentioned in the NY Times, for example) and there's a spontaneous explosion of new links. This kind of link growth has its own footprint, quite different from forced growth -- and it will not cause trouble. Quite the opposite: it can be a significant help.

I also think there is a "link aging" filter where the effect of some marginally trusted links is only allowed to influence search results gradually. The linking sites themselves are going through a trust evaluation.

texasville
posted 5:21 pm on Oct 6, 2006 (gmt 0)

"Filters exist - the Sandbox doesn't. How to build Trust."

I am not sure what to think about the title of this thread. Basically, if you get a good, solid, large collection of authoritative, trust-ranked links... then pretty much nothing else on your site matters. Matt Cutts' own site proves that. But for the normal site (read: non-news) you can't get these links too quickly or you trip another filter. Webmasters termed this Google phenomenon the "sandbox" because it seemed to fit.
For the past couple of years it seems there is always someone wanting to say it doesn't exist or the term shouldn't be used. For those that don't remember, Google engineers claimed they didn't design for it but became aware of it from the term being used by webmasters in forums like this. Google liked it, but had no official status for it and wouldn't confirm it. Always reminded me of the old Cold War iron curtain philosophy.
Call it what you will, it does exist. Links have to be collected slowly for new sites. Trust has to be built. The term "sandbox" fits. Now, for established sites with flaws that get knocked down in updates or "refreshes" - yeah, they tripped a filter.
But let's not deny the term "sandbox".
"A nod is as good as a wink to a blind horse."

BigDave
posted 5:43 pm on Oct 6, 2006 (gmt 0)

I think that the "too many links at once" filter probably works a bit differently than many think.

Think "trust" in relation to the link source.

If you get a lot of links from trusted sources all at once, you will be considered news. If all your links are from sites tagged as link sellers or link traders, they will be flagged as spam at worst, and given slow credit at best.

There is lots of in between ground when you get a variety of trust in your links.

Subjects that are hot on the political blogs will bring you a lot of links right now. Some from "trusted" news sources, some from bigger name blogs, some from fora, and some from personal blogs.

While there will be a lot of less trusted links, those extremely trusted links should trigger them to consider your spike a valid and important one. The trusted links going into your page will raise the trust in those other links.
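
A sketch of that weighting idea: score the new links in a spike by a rough trust value for each source, and let a few very trusted sources legitimise the burst. The trust numbers and cutoffs are invented for illustration only.

```python
# Sketch of the point above: weight the links in a spike by an assumed trust
# score for their source, so a burst anchored by highly trusted sources reads
# as "news" rather than spam. Scores and the 0.9 / 0.6 cutoffs are invented.
new_links = [  # (source, assumed trust score 0.0 - 1.0)
    ("bignewspaper.example", 0.95),
    ("popularblog.example", 0.7),
    ("personalblog1.example", 0.3),
    ("personalblog2.example", 0.3),
    ("forumthread.example", 0.2),
]

top_trust = max(score for _, score in new_links)
avg_trust = sum(score for _, score in new_links) / len(new_links)

if top_trust >= 0.9:
    print("spike anchored by highly trusted sources: treat as news")
elif avg_trust < 0.6:
    print("mostly low-trust sources: discount or flag the spike")
else:
    print("mixed trust: credit the links gradually")
```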

walkman
posted 6:10 pm on Oct 6, 2006 (gmt 0)

BigDave,
I agree, but it's risky to go and buy them. If you get sitewides from 10 blogs it seems a bit orchestrated, whereas a link from a post on each of 125 blogs is natural.

BigDave
posted 6:34 pm on Oct 6, 2006 (gmt 0)

Of course it's risky to buy them, unless you can buy them from very trusted blogs that do not generally sell links.

What might be an interesting experiment is to get a page that gets a lot of reputable news linkage. During that linking spike, buy a dozen links.

Having the links appear at the same time as the news links might dampen any filtering that is done. Then see if those bought links are still effective after the blog links have moved on to the 3rd page.

Of course, that means you have to come up with something worth getting a blog flood, then have the time to buy links while dealing with all the new traffic and readers.

annej
posted 7:56 pm on Oct 6, 2006 (gmt 0)

poor server directory structure

This is a new possibility to me. I'd be interested to hear more from others about this. Could cheap servers really be hurting some sites?

potentialgeek
posted 8:42 pm on Oct 6, 2006 (gmt 0)

I disagree with the idea that matching titles with meta descriptions (and keywords) is detrimental. My experience is the opposite and I think it's logical for the Google algo to respect the consistency. I always coordinate the title, description, and keywords. I think I learned this back when I first learned html; I believe it was one of the basic instructions from html tutorials. I often have site indices with long descriptive text links which are identical to the page title of the page the links lead to and get top 10 or even top 5 SERP in Google. Is there any reason why Google would see a page with matching title, description, and keywords as bogus?

I agree that high PR'd sites can get away with murder, so to speak. One site higher than mine for a primary keyword has a site index with about twenty different two-word phrases, one word of which is the same in EACH link!

Example:

Alpha Widgets
Beta Widgets
Gamma Widgets
. . .

Looks like awful spam but they get away with it and ironically it may even help a high PR'd site?!

p/g

Whitey
posted 9:21 pm on Oct 6, 2006 (gmt 0)

Disagree, strongly. 'Sandbox' is about trust via links. It applies on a page by page basis. Seen and studied it across 70+ new sites over the past 2 years.

Sandbox is a loose word created to cover all the unknowns relating to many types of filters. People have applied different experiences and effects under one definition. We like the name "Sandbox", like "BigDaddy" or "Florida Update", but the Sandbox is not an event. It seems to cover everything that stops our results from showing under different circumstances.

Originally, it was intended to describe sites that didn't rank in the results on release - now it seems to cover every type of filter.

Matt Cutts reportedly in answer to a question from Brett Tabke - [webmasterworld.com...]

So I think it's best to persist with saying it's a series of filters.

And how we break those filters is by building trust.

Trust is a series of practices that satisfy the algorithms, and the algorithms are satisfied by technical structures, content and genuine viewer interactions that get validated positively by Google.

Google's job is to measure these things objectively in natural search to come up with the best scenario for a result.

Because such a lot is unknown or unshared, and because people are competitive, a lot of "shady" practices occur by default. There are degrees of "shadiness" or "grey" between "black" and "white", which is why organising some facts in the areas of "grey" is a good thing.

Not only this, but some of Google's technology, at times, does not work properly or is prone to error [ nothing new in this - we all make errors ]. By identifying what should and should not be happening, it will be easier to address with the Webmaster Central "reinclusion" team.

It applies on a page by page basis

Filters can exist page by page [ e.g. meta titles and descriptions / duplicate content ] or site wide. Filters can affect networks of sites [ i.e. bad linking leading to lowered PR and KW transfer in anchor text ].

We speculate there are maybe as many as 120 or so filters [ according to some patent report I read ], but who knows - let's just say "lots".

What we need to do is break down those filters into known items and have them evidenced IMHO - otherwise we're going round in circles [ which of course may need to continue :) ]

[edited by: Whitey at 9:31 pm (utc) on Oct. 6, 2006]
