Filters exist - the Sandbox doesn't. How to build Trust.
Understanding factors that restore and maintain results
Whitey

WebmasterWorld Senior Member, a WebmasterWorld Top Contributor of All Time, 5+ Year Member
Msg#: 3110528 posted 4:30 am on Oct 6, 2006 (gmt 0)

A lot of discussions tend to focus on the generality of a "Sandbox", but it has long since been debunked as a useful term by Matt Cutts and many senior forum members. So I propose the Sandbox is dead :)

What do exist are filters.

What opposes those filters are good techniques and "trust" - one good member recently referred me to the concept as "TrustRank".

An understanding of what these main filters are for, how Google applies them, and the observed behaviour of Google in releasing them would be a good way for owners to better manage and refine their organic search techniques.

Maybe our good friends in the community could select a topic, or several, that they have solid experience and authority in, and support it with a format that can be easily referenced. The most recent one has been largely contributed to by g1smd. Allow me to paraphrase [ and please correct me ] an example of how I think this would flow:

Duplicate Content Filter - incorrect linking

Applied: when internal links incorrectly point to "/index.htm" or "/default.htm" when they should all point to "/" [ see the sketch below ]

Effect: unlikely to be indexed, badly suppressed results, PR applied to the wrong or duplicate pages.

Time to restore: 2-3 months from when the fix is applied

Evidence: WebmasterWorld webmaster reports
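For anyone wanting to check this on their own pages, here is a minimal Python sketch (mine, not from the thread) that flags internal links pointing at index/default file names instead of the root URL. The file name "homepage.html" and the variant list are illustrative assumptions.

from html.parser import HTMLParser

# File name and variant list are illustrative assumptions.
VARIANTS = ("index.htm", "index.html", "default.htm", "default.asp")

class LinkChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        # Flag links whose last path segment is a known home-page duplicate.
        if href.split("?")[0].rstrip("/").split("/")[-1].lower() in VARIANTS:
            self.flagged.append(href)

checker = LinkChecker()
with open("homepage.html", encoding="utf-8") as f:
    checker.feed(f.read())

for href in checker.flagged:
    print("Duplicate of the root URL:", href)

Any link this flags should be changed to point at "/" (or redirected there), so that PR and indexing settle on one URL.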

Duplicate Content Filter - Meta Data

Applied: when meta descriptions and titles are too similar across pages [ see the sketch below ]

Effect: results go supplemental and are generally suppressed

Time to restore: a matter of days, over the next few crawls
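As a rough way to spot this yourself, here is a minimal Python sketch (mine, not from the thread) that flags page pairs whose titles or meta descriptions are nearly identical. The 0.9 cutoff and the sample data are illustrative assumptions, not a known Google threshold.

from difflib import SequenceMatcher
from itertools import combinations

# Example data - real titles/descriptions would come from your own crawl.
pages = {
    "/red-widgets/":  ("Red Widgets - Acme", "Hand-made red widgets in all sizes."),
    "/blue-widgets/": ("Blue Widgets - Acme", "Hand-made blue widgets in all sizes."),
}

def similarity(a, b):
    # Rough similarity ratio between two strings, 0.0 to 1.0.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for (url1, (t1, d1)), (url2, (t2, d2)) in combinations(pages.items(), 2):
    if similarity(t1, t2) > 0.9 or similarity(d1, d2) > 0.9:
        print("Near-duplicate title/description:", url1, "vs", url2)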

How many other filters have you observed, what are their effects, what have you done to fix the problem, and how long have they taken to restore?

 

RichTC

WebmasterWorld Senior Member, 5+ Year Member
Msg#: 3110528 posted 12:32 am on Oct 11, 2006 (gmt 0)

Re: demand for AdWords:

Let's say a commercial site about widgets has sectors for red, white and blue widgets, etc.

In addition to "widgets" being a highly sought-after keyword, i.e. attracting loads of AdWords, it might find that "red widget", "white widget" and "blue widget" are also high-AdWords terms; it may even find that "blue pink special widgets" attracts loads of pay-per-click interest.

In this situation I find that, obviously, it's going to take a good period to rank for the prime keywords, but the semi-prime ones are still going to take ages to rank and need lots of aged backlinks.

Even if the site is an authority on widgets and has been for a few years, if it adds a new sector page about "blue widgets" - i.e. a semi-prime keyword, related but not covered before - some sort of filter kicks in to ensure that the new sector page won't rush into the top ten until some ageing process has run its course.

If you look at the SERPs in high-AdWords areas, you will find that the top ten sites listed will have had that page in Google for some time, i.e. 18 months plus.

Also, since the new Google infrastructure rolled out, you don't see new sites spring up with high PageRank, like PR7 for example, even if they are well connected. I would say that for a site to get to PR6 and upwards now, it needs aged backlinks on its side.

RichTC

WebmasterWorld Senior Member, 5+ Year Member
Msg#: 3110528 posted 12:38 am on Oct 11, 2006 (gmt 0)

Re: the dupe content issue

I don't think so; it's only more relevant in commercial areas, IMO. The affiliate adverts were sector-specific and all different, but they did not affect the rest of the content on the pages involved.

When the % of pages with affiliate adverts was reduced, the positions were recovering - I can only go by what I noticed.

Lorel

WebmasterWorld Senior Member, 10+ Year Member
Msg#: 3110528 posted 4:12 am on Oct 11, 2006 (gmt 0)

Mcskoufis - I manage a site suffering the same as yours. I've had several SEO people check the site out and they can find nothing wrong. We are on the verge of trashing the domain because we can't figure out what's wrong. No matter what we do, the rank (currently below sea level) never improves.

It's been obvious for about a year that just because these 302s don't show up in the inurl: command in Google anymore doesn't mean the 302s have disappeared. Google just removed them from view.

I've been watching (and reporting) these subdomain spammers for the last month or so. I usually find them when searching for the domain plus a major keyword that the site should be ranking for - nothing but scrapers ranking where the affected site should be.

I've checked the links in these subdomain scrapers and haven't found a link with a 302 on it yet.

Can someone explain how these spammers are 302ing sites when we can't see it?
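One way to look for the 302s directly, rather than through Google's view of them, is to fetch a suspect URL and walk the redirect chain yourself. A minimal Python sketch (my own, not Lorel's method); the URL is hypothetical and it needs the third-party requests package.

import requests  # third-party package

def show_redirect_chain(url):
    resp = requests.get(url, allow_redirects=True, timeout=10)
    for hop in resp.history:  # intermediate 3xx responses, oldest first
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(resp.status_code, resp.url)  # final destination

show_redirect_chain("http://scraper.example.com/some-page")  # hypothetical URL

If a scraper page 302s to your site only for certain user agents or referrers, a plain fetch like this may still miss it, which would be consistent with nothing showing up in a manual check.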

g1smd

WebmasterWorld Senior Member, a WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 3110528 posted 10:38 am on Oct 11, 2006 (gmt 0)

>> doesn't mean the 302s have disappeared. Google just removed them from view. <<

Yes, I saw a lot of that just a few months ago. The problem still exists. The problem still affects the site. Google SERPs hide the evidence. I haven't been looking for it recently.

Whitey

WebmasterWorld Senior Member, a WebmasterWorld Top Contributor of All Time, 5+ Year Member
Msg#: 3110528 posted 7:32 am on Oct 12, 2006 (gmt 0)

Re the "priority" referred to by myself on StarryEyed's post and my request for comments - i ended out researching it myself. This is what Google says:

Also, please note that assigning a high priority to all of the URLs on your site will not help you. Since the priority is relative, it is only used to select between URLs on your site; the priority of your pages will not be compared to the priority of pages on other sites.

[google.com...]

I think we can discount this as a factor influencing accelerated crawling or results improvement for any pages that may otherwise be stuck in limbo or in a filter.

[edited by: Whitey at 7:36 am (utc) on Oct. 12, 2006]
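For context, the priority Google describes is the optional <priority> element in an XML sitemap, a 0.0-1.0 value that is only relative between your own URLs. A minimal Python sketch of writing one (the URLs and values are illustrative assumptions):

import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

# Priority is only relative between your own URLs (per the quote above);
# the URLs and values here are illustrative.
for loc, priority in [("http://www.example.com/", "1.0"),
                      ("http://www.example.com/widgets/", "0.6")]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "priority").text = priority

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)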

Reilly

5+ Year Member
Msg#: 3110528 posted 7:38 am on Oct 12, 2006 (gmt 0)

Guys, I use <an online tool> to check dupe content.

It must be under 65%, then you are safe - that's my experience.

[edited by: tedster at 3:49 pm (utc) on Oct. 12, 2006]
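Leaving the redacted tool aside, a rough home-grown version of a "percent duplicate" check can be built from word shingles. A minimal Python sketch (my own; the 65% figure is only Reilly's rule of thumb, not a known Google threshold):

import re

def shingles(text, n=5):
    # All word 5-grams in the visible text, lowercased.
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplicate_percent(text_a, text_b):
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return 100.0 * len(a & b) / min(len(a), len(b))

page_a = "Hand made red widgets in all sizes, shipped worldwide from our factory."
page_b = "Hand made blue widgets in all sizes, shipped worldwide from our factory."
pct = duplicate_percent(page_a, page_b)
print("%.0f%% duplicate - %s" % (pct, "risky" if pct >= 65 else "probably safe"))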

Incorporated

5+ Year Member
Msg#: 3110528 posted 10:00 pm on Oct 12, 2006 (gmt 0)

We are experiencing strange Google behaviour on our Italian site. This is a poetry site which has been running for 1 year and is based on a CMS. Authors add about 50 pages/week. After 1 year, only the home page is really indexed, and for the main keyword the home page is in position 4 in Google (so we are satisfied with that).

The problem is with the 1500+ content pages.

Some pages are even PR4 (the home page is PR5), but the pages don't appear in Google results (even if I search for the exact title as the keyword).

All seems regular except the Description and Keywords meta tags, which are always the same for these 1500+ content pages.

Why is my entire site (except the home page) practically not indexed in Google?

Thanks,

V.B.
Rome, IT

g1smd

WebmasterWorld Senior Member, a WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 3110528 posted 10:17 pm on Oct 12, 2006 (gmt 0)

You answered your own question. :-)

Make the title and the meta description different on every page. Make them describe exactly what can be found on that page. If that is all that is wrong, you'll see improvements within a few weeks.
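For a CMS like the one described, this fix can usually be automated by deriving each page's title and description from the page's own content rather than from one site-wide string. A minimal Python sketch (the field names and site name are assumptions, not the poster's actual CMS):

import html

def meta_for(poem_title, author, body, site="Example Poetry Site"):
    # Unique title and description built from the page's own content.
    title = "%s - %s | %s" % (poem_title, author, site)
    # First ~150 characters of the body, cut at a word boundary.
    snippet = " ".join(body.split())[:150].rsplit(" ", 1)[0]
    description = snippet + "..."
    return ('<title>%s</title>\n<meta name="description" content="%s">'
            % (html.escape(title), html.escape(description)))

print(meta_for("Autumn Rain", "V.B.",
               "Soft rain over the rooftops of Rome, and the evening settles in..."))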

nuclei

10+ Year Member
Msg#: 3110528 posted 8:27 am on Oct 13, 2006 (gmt 0)

g1smd: Maybe he did, but I would lean more towards the CMS-imposed URLs probably having more to do with it than any meta tags.

Whitey

WebmasterWorld Senior Member, a WebmasterWorld Top Contributor of All Time, 5+ Year Member
Msg#: 3110528 posted 1:36 am on Oct 16, 2006 (gmt 0)

Has anyone observed a pattern by which Google releases sites back fully into the index ( with normal results ) that would indicate which filters were applied and in which order they were released?

I was wondering if this might help the process of identifying "filters".

We have been in limbo for a long time, and a few days ago we noticed some "weak and uncompetitive KWs" being restored to top positions, which relate to body content only.

Meta titles, internal links and backlinks have not come into play yet, so the rest appears to be still heavily filtered.

I'm not sure if this is a normal indexing/results pattern (since our duplicate content fix of 8 or so weeks ago), or a partial release from a filter or filters.

BeeDeeDubbleU

WebmasterWorld Senior Member, a WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 3110528 posted 10:09 am on Oct 16, 2006 (gmt 0)

I don't know if this helps or if it has already been covered, but I have noticed on a couple of my sites that images tend to get found first. When I see a few images getting found on sandboxed sites, I think it can mean that they will soon be released.

Whitey

WebmasterWorld Senior Member, a WebmasterWorld Top Contributor of All Time, 5+ Year Member
Msg#: 3110528 posted 10:58 pm on Oct 17, 2006 (gmt 0)

There's an interesting discussion over here [webmasterworld.com...] on the Minus 30 / -30 penalty, which in effect is a filter.

This is where a page that would ordinarily rank at position "x" for a keyword phrase now appears roughly 30 places lower. Typically, a phrase which was thought to be well supported and previously ranked at, say, No. 1 has now dropped to position 31.

Several folks also claim to have identified the symptom like this:

Evidence of a -30 [ positions ] penalty

For evidence towards a -30 penalty:
Try an "almost unique" search for a phrase from your site on Google that returns fewer than 30 results. For any such term, your site will always be at or near the BOTTOM of that SERP [ see the sketch below ].

and many folks speak of using "trust" factors to break the filters:

One suggestion is:

Factors that most commonly cause this filter to trip

duplicate content caused by a bad content management system
thin affiliates
big link campaigns
errors in HTML
keyword anchor spam

Usually, you need at least 3 of these going on big time. I HAVE heard of people recovering, and it's by fixing these issues.

If your site is one big affiliate site and has a massive duplicate-anchor site map, e.g.:

red widgets
blue widgets
green widgets
pink widgets
purple widgets
greeny blue widgets
bluey green widgets
widgety widgida widgets

then you've tripped a filter.

Nippi then goes on to share his experience of a total recovery from the filters:

How to Restore a site from the -30 filter

ref: [webmasterworld.com...]


UK_Web_Guy
It's difficult to quantify "did nothing", as you cannot actually "do nothing".

50 sites may have added links to you, which tipped you back over the not trusted/trusted limit.

Then, in the "did little" category:

a slight content change may have fixed things.
a slight update may have removed a hidden link orphaned in code you were not aware of.

Yes, it's been going on for years. I had a site tank for 3 months (+30 positions) 2 years ago; the issues then were:

a. added 300 reciprocal links.
b. home page sitemap with 400 links in it, all with one word the same.
c. hidden-ish text.

Fixed all of them, added 100 more links, removed the ones with no link back, and the site recovered.


nippi

WebmasterWorld Senior Member, 10+ Year Member
Msg#: 3110528 posted 1:56 am on Oct 18, 2006 (gmt 0)

I've put a lot of time into this problem in the last 2 years, having been hit with it 3 times.

I am of the considered opinion, though I have no proof, that it is a breach-of-trust issue with Google. You turn the SEO up too far, and Google thinks you are trying to game it. Some of the things done may not even have been intentional, and likely the recent problems have come from Google turning the trust filter up a notch.

I don't think it's a case of having done one particular thing to trip the filter; it's an overall "you've tried to trick Google" filter.

I don't believe it's a manual penalty, though it may well be, as in all cases my penalties followed large link additions and email link requests (within 6 weeks, anyway), and one of these could have been sent to Google.

I addressed issues such as:

accidental hidden links, e.g. <a href=home.php></a> (in all cases I had these)
anchor text spam (in all cases I had these: repetition of the main keyword in different combinations, e.g. blue widgets)
junk links (in all cases I had these; not off-topic, just clearly not naturally added, as so many were added in one go)
thin affiliates (only in 1 case, the most recent; the other times, no affiliates)
duplicate content (in all cases I had this problem, caused by a faulty paging class in my CMS)

I fixed the problems, and on the first two occasions the sites recovered in 3 months. On the 3rd occasion I am still waiting at 80 days, but I keep finding more problems I did not previously see, so I have not had a clean 3-month period to go by.

I suggest to anyone having problems:

1. Validate your home page.
2. View the source of your home page and copy the code into Dreamweaver. Save the file, then search for ></a> to find accidental empty links (see the sketch after this list).
3. Run GSiteCrawler; it's a great duplicate content checker.
4. Get rid of your junk(y) links.
5. Get rid of your thin affiliates.
6. Get rid of any one word repeated in more than 20 anchors on your home page.
7. Have a look for anything else that you think might be tripping the "Google does not trust this site" filter.
8. Don't panic and email abuse to Google. It won't help.
9. Don't think your site won't recover; it will.
10. Don't do nothing and wait for it to recover. It might, but it's better to clean up your act.
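Steps 2 and 6 can be automated. A minimal Python sketch (my own; the 20-repeat threshold is just nippi's rule of thumb, and "homepage.html" is an assumed file name) that finds empty anchors such as <a href=home.php></a> and counts words repeated across anchor text:

from collections import Counter
from html.parser import HTMLParser
import re

class AnchorAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_anchor = False
        self.current = ""
        self.empty = 0
        self.words = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor, self.current = True, ""

    def handle_data(self, data):
        if self.in_anchor:
            self.current += data

    def handle_endtag(self, tag):
        if tag == "a" and self.in_anchor:
            self.in_anchor = False
            text = self.current.strip()
            if not text:
                self.empty += 1  # accidental hidden/empty link
            self.words.update(re.findall(r"[a-z0-9]+", text.lower()))

audit = AnchorAudit()
with open("homepage.html", encoding="utf-8") as f:
    audit.feed(f.read())

print("Empty anchors:", audit.empty)
for word, count in audit.words.most_common(5):
    if count > 20:
        print('"%s" appears in anchors %d times - possible anchor spam' % (word, count))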

JudgeJeffries

WebmasterWorld Senior Member, 10+ Year Member
Msg#: 3110528 posted 9:58 am on Oct 18, 2006 (gmt 0)

It sounds to me as though almost anything you do to excess will get you nailed. The bad side to this/these 'filter(s)' is that it's encouraging a lot of people to build lots of basic websites that are of little use, in the hope that age alone will give them a head start in the future.

Whitey

WebmasterWorld Senior Member, a WebmasterWorld Top Contributor of All Time, 5+ Year Member
Msg#: 3110528 posted 12:39 am on Oct 20, 2006 (gmt 0)

Adam Lasnik on the -30 Penalty // How sites are filtered and released

[groups.google.com...]

Some good questions Webmasters in this situation or similar situations
can ask themselves:
- Is my site providing unique and compelling content?
- Would most consumers find my site to be more useful than others in
this space?
- Am I abiding by all of Google's Webmaster Guidelines?

If the answer to all of those questions is yes, then it's wise to
doublecheck, then be patient / continue to develop your site. And
optionally file a reinclusion request, if you've fixed issues
associated with previous guideline violations.

If the answer to any of those questions is no, then... well, you have
your work cut out for you :)

One thing stands out to me - this is not one filter. It is a series of scores / penalties that need to be satisfied to release a site through results thresholds.

The old "Sandbox" filter is just the same.

Filters can be applied manually or via an algorithm

Much has been said about this by Matt Cutts in reference to "flagging sites" for manual observation. Typical triggers for this might be:

-Large site launches
-Sudden traffic increases

Site criteria that suddenly change, i.e.

-Backlink acceleration
-Duplicate content recognition

- plus others

How this gets removed is subject to the above conceptual criteria and to guidelines followed by a Google operator. The chances are that this is a subjective assessment within some written guidelines [ IMO ].

See:

Tedster's comments here:

Adding Value to a Site - Thin and Fat Affiliates [webmasterworld.com]

To my memory, this term began flying around with intensity around June 2005, at the time that the training guidelines document Google was giving to human evaluators was leaked.

and talked about here:

Google Training Document [webmasterworld.com]

The bottom line is [ and to state the obvious ] that sites are subject to filters triggered by:

-An algorithm
-Manually [ when flagged by an algorithm or possibly a SPAM report ]

Those filters are reversed by:

-An algorithm
-Manually [ via a reinclusion request ] and a compelling representation of why the site should be released.

Statements to Google like:

- We have done everything
- Everyone says my site's fine

will likely not work

If you want your reinclusion request to stand a better chance, it's important to keep your communication focused and build it on facts [ IMO ].

We did x, y, z - be specific

Those facts must relate to the Google guidelines and Adam's comments at the top of this post - IMO

I think this is fairly simple.

The problem lies in webmasters' lack of detailed knowledge of those "fixes", and in the difficulty of communicating with Google's quality assurance operators at the "Reinclusion Desk" in a manner which is simple and brief. IMO
