
Google SEO News and Discussion Forum

New way of applying trust rank? Understanding the minus penalties

 8:47 am on Jun 9, 2008 (gmt 0)

Two sites in our network have been behaving strangely recently, and I'm wondering if they show how Google handles "Trust Rank", with entire sites falling into different minus levels.

Then I wondered, following our experiences, what had happened out there amongst site owners to break out of these levels, or fall into them.

Site 1 re-appeared at position -50 about 2 weeks ago, for all searches of unique content or targeted keywords, irrespective of what IBLs were applied, after almost 3 years of results sitting lower than -950. The IBLs were applied some 4 weeks ago.

Previously all pages responded to the site:mydomain.com command. The site had problems with duplicate content, had been hacked, and had robots.txt restrictions applied to it. It is over 5 years old.

Penalty or promotion, and why?

Site 2 had been doing very well for around 18 months, and then someone added a strong IBL pointing to the site's home page; the link's PR was 3 points higher than it is currently. Within 10 days the site collapsed in the SERPs, nowhere to be found, producing nil results, and several days later rose to -50 to -60. Some very strong pages have gone to -40. We're currently around 2 weeks into this drop and slight recovery.

This site seems to have lost trust.

I wondered, given these two examples, how Google might be viewing a site's "Trust Rank".



 2:55 pm on Jun 9, 2008 (gmt 0)

Is your question whether Google uses different minus values for different types or degrees of lost trust?
I think so, but I don't have the formula.


 10:54 pm on Jun 9, 2008 (gmt 0)

Yes Tedster, it's a whole bunch of questions really, and I'm wondering how folks out there have managed it and what they've seen.

Google was much more random in its application of "penalties" or "trust filters" in the past. A site could go anywhere in the SERPs, or indeed not appear. But now it falls [or rises] into different levels [minus penalties] site wide, and sometimes per keyword from what I can see.

This lowering or raising of trust rank, which I'm seeing, appears to be a recent thing, as recent as the introduction of the "minus penalties".

But it's not as straightforward as this. Behind this there are factors pushing up and down, and I'm wondering what folks have experienced. It would be good to get some perspective on this.

What does it take to win back trust or lose it? How long does it take? And is Google assigning trust by "degrees", i.e. -950 is very bad, -30 is a little bit of trust loss, and so on?


 11:09 pm on Jun 9, 2008 (gmt 0)

If you do not know the specific reason for the 'trust loss' how can you ever determine a plan of action to recover it?

All I see is less and less stability for virtually any site in the SERPs, and more and more 'fog' coming from our beloved Google in an attempt to explain the instabilities.


 11:13 pm on Jun 9, 2008 (gmt 0)

the specific reason for the 'trust loss'

We'll never know the formula, but we can get some idea of what issues can affect it and to what degree. Folks who are observing the rise of new sites, or have watched a site rise and fall, might be the key.


 12:38 pm on Jun 10, 2008 (gmt 0)

Some months ago we had a very high value inbound link placed on a W3C multilingual page, deep into our website. We saw all sorts of spidering issues for approximately 2 months before and after the link appeared in our backlink totals in GWT.

Prior to being aware of the link we couldn't figure out why our homepage was being devalued: Googlebot's activity slowed to a fraction of our normal 40-50k pages a day, and pages began to drop out of the index.

Our GWT for this site didn't update PR within the Crawl stats section for approximately 2 months, until recently, where it now shows the linked page as having the highest PR for April; for May it's gone back to the home page. But for an age these results were just waiting to update.

We've done nothing to change the link in any way, just stuck it out, and it appears that Google's started to trust the link again, as it's been on their website now for approximately 4 months.

But for us there was a serious issue with loss of trust. Hold tight if you can; it shouldn't happen like this, but it appears G's really struggling with high-powered one-way links into a site.


 5:38 pm on Jun 10, 2008 (gmt 0)


was the anchor text of the inbound link relevant to your subpage?
was it relevant to the source page?
did your subpage have proper navigation with proper anchor text back to the homepage/site?
was the trusted site's page ever shut out of its host domain's flow of PageRank ( navigation )?
( get archived, orphaned, moved, etc. )
did the subpage include identical content to the homepage?


[edited by: Miamacs at 5:39 pm (utc) on June 10, 2008]


 10:52 pm on Jun 10, 2008 (gmt 0)

it shouldn't happen like this but it appears G's really struggling with high power one ways linking in to a site

This is a clear case that sites linking to you can cause traffic loss.

Has your traffic recovered?

Do you believe that Google had applied this as an intentional filter?

It really looks as though G set your trust to almost nil before restoring it, and I'm thinking that high PR links can set off a check of other factors which previously had slipped under the G radar.

[edited by: Whitey at 10:55 pm (utc) on June 10, 2008]


 11:46 pm on Jun 10, 2008 (gmt 0)

The -950, -30, etc. are over-optimization penalties.
This is separate from Trust Scoring.

A page can have trust, but be 950'd.
However, I think the higher the trust, the more forgiving Google is on optimization and breaking the threshold.

Getting a high PR link to a page that had moderate trust might have triggered the -penalty if it caused a spike from the normal PR level of that category.


 11:49 pm on Jun 10, 2008 (gmt 0)

Hi all,


Total loss of Trust Rank = -950, and you have your work cut out to put it right, coupled with a long period in the bin. Various reasons can cause it, and sometimes a reinclusion request is the only answer if everything is clean and the site's still in the bin.

A small loss of trust, or rather a site looking "too good to be true", can = -30, -60, etc. IMO, but can be resolved in time.

So I would agree that the penalty level is relevant to the crime. I think in Matt's blog he wrote somewhere that "most penalties were automated", and a minor penalty, after the problem is corrected, can come back through within 30 days or so. I might stand corrected on this.

But the point is that with so many factors taken into account within the algo, a genuine webmaster doing everything white hat by the book can still trip a filter.

Most of us in this field for some time still can't resist building pages that might be over-optimised from the outset, i.e. with the keyword in the title, the meta, the URL, the H1 and H2, on the page, in italics, in the backlinks, etc. Google has to say: hang on a moment, that page is just too perfect, could it be spammy? The page itself might be OK, and the rest of the site OK, so rather than whacking it with a -950 it gets the benefit of the doubt and only a -30. Sometimes de-optimising the page can help once trust starts being restored.

None of us know the answers here, but from talking with other webmasters about this, and seeing a range of different sites with different issues over the years, my view is that it is automated and in degrees: some kind of automated formula, once all factors are taken into account, calculates whether any dampening factor should be applied or not.



 2:03 am on Jun 11, 2008 (gmt 0)

was the anchor text of the inbound link relevant to your subpage?
was it relevant to the source page?
did your subpage have proper navigation with proper anchor text back to the homepage/site?
was the trusted site's page ever shut out of its host domain's flow of PageRank ( navigation )?
( get archived, orphaned, moved, etc. )
did the subpage include identical content to the homepage?

The anchor text was relevant to the page, basically the first four words of my title.

Relevance to the source of the page, yes, but as a whole my site and theirs are in different industries.
Yes, all my pages are easily navigable and all link back to the homepage; we use our logo to link back, plus a footer link in text.

Well, the page itself has remained on the site, but yes, the link to the source page was at one point placed onto their homepage, which has a PR10. As time has gone by the news article clipping has moved off the home page and is now one level below, so the actual source page of the link is now on a tier 3 page rather than a tier 2 page.

No, the subpage has content unique from the homepage.


Traffic has recovered, basically around the time GWT finally updated itself.

As for an intentional filter, well, I'm still undecided. With the points that Miamacs has asked I can see maybe that the link was a little too optimised for Google, with the two sites in completely different industries (even though the source page was relevant), and the PR of the page linking to ours was probably 4 points higher than mine. Now it's a 7, so only two points, but when it was a tier 2 page who knows what the actual PR was, maybe a 9.

I think RichTC has a point with a "too good to be true" link, and maybe ours was a lesser filter.



 4:04 am on Jun 11, 2008 (gmt 0)

This is separate from Trust Scoring

Google seems to be more tolerant of some sites than others. Therefore I think "Trust" scoring is part of it.

Vimes - We also observed an IBL identically placed to a site in the same industry. They had no issues, but we had to ask the webmaster to take it down, which thankfully they have.

RichTC - some good points here to remember, and a great way to express it. All webmasters are tempted to push their sites to another level, and for those that do, the risk is an issue.

What are the prime areas to reinstate trust? What can be reversed, and is Google more interested in recent activity being reversed?

What's the quickest way to reverse the loss of trust?

[edited by: Whitey at 4:04 am (utc) on June 11, 2008]


 4:05 am on Jun 11, 2008 (gmt 0)

But the point is that with so many factors taken into account within the algo, a genuine webmaster doing everything white hat by the book can still trip a filter

Exactly! And pull out his/her hair trying to see what they did wrong. Days in the bin means lost income! :(


 10:54 am on Jun 11, 2008 (gmt 0)

-- Getting a high PR link to a page that had moderate trust might have triggered the -penalty if it caused a spike from the normal PR level of that category --

Sorry if this is a naive question, but why does Google penalise sites for getting links from high PR sites? A layperson would expect the reverse to be true.


 12:03 pm on Jun 11, 2008 (gmt 0)

It can be a liability to take one anecdote and try to generalize. I prefer to leave one time observations as data points, and hold off on evaluation at least until many data points seem to line up.

In the case that Whitey is observing, there's always the chance that a critical factor has not been noticed. Because no, it doesn't make sense that a link from a higher PR website would cause a LOSS of trust and a drop in ranking.

But a sudden "PageRank spike" that's out-of-step with the site's history and the normal statistical deviations for its market, could raise a flag that calls for closer inspection -- and on inspection some other "unnatural" factors might be noticed.
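To make the "out-of-step spike" idea concrete: this is a hypothetical sketch of flagging a sudden deviation from a site's own history using a basic z-score test. The function name and the three-sigma threshold are illustrative assumptions, not anything known about Google's actual algorithm.

```python
# Hypothetical illustration: flag a value (e.g. a month's new-link count)
# that is out of step with a site's own history, using a simple z-score.
# This is NOT Google's method, just the general statistical idea.
from statistics import mean, stdev

def is_unnatural_spike(history, latest, threshold=3.0):
    """Return True if `latest` deviates from `history` by more than
    `threshold` sample standard deviations."""
    if len(history) < 2:
        return False  # not enough data points to judge
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return latest != mu  # flat history: any change stands out
    return abs(latest - mu) / sigma > threshold

# A site steadily gaining 10-14 new links a month:
steady = [10, 12, 11, 13, 12, 14]
print(is_unnatural_spike(steady, 13))   # in line with history: False
print(is_unnatural_spike(steady, 120))  # sudden spike: True
```

In practice a system with Google's data could run checks like this per market segment rather than per site, which matches the "normal statistical deviations for its market" framing above.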

We had a discussion back in 2006 about Natural and Unnatural in the Google Algorithm [webmasterworld.com] - and we detailed some of our gleanings from patents and real world experiences.

There's a whole lot of data that Google can mine for "unnatural" patterns. Any number of scenarios could be in play that would not be obvious to third party research.


 12:38 pm on Jun 11, 2008 (gmt 0)

What I didn't tell anyone is that this same IBL was also pointing to another unrelated site, and everything has been going perfectly there. High PR, high rankings, and it is a complete one-off high quality link for them. The rest of the links are all intended to help the site rank for key terms, and not very high PR at all.

I guess any huge IBL would stand out if it is outside the natural pattern, but maybe it is also the combination of things that breaks the trust and puts sites into the minus club bins.

[edited by: Whitey at 12:55 pm (utc) on June 11, 2008]


 5:44 pm on Jun 11, 2008 (gmt 0)

Hi, I am helping out on a site that got penalized heavily last year; it was previously on page 1 of organic search results for almost 12 years. The site got hit by one of those proxy sites (Asian based), which basically zapped all rankings overnight. The site is indexed, but for all important phrases it is minus 30 or more in the results. I've since heard that perhaps Google has solved the problem with proxy sites, but for this one it's too late, I guess.

It was a catch-22 with the site, as it was built on "old" technology and used URLs that were not descriptive etc., but it did so well in the results for years that it was a "dilemma" to risk a redesign when there were so many URLs indexed and natural links from other sites. Now this is a moot point, and I'm almost finished with the redesign; at this point the old pages don't matter much, except that there are a lot of links to them, but they're not performing at all in terms of search.

It will be interesting when I switch to the new system to see if anything changes. The site's home page still retained its PageRank, and it ranks for the name itself, so it looks to me as if the penalty is against all the specific phrases instead of an outright ban. Very frustrating...


 12:01 am on Jun 17, 2008 (gmt 0)


You said these sites were part of a "network". Perhaps that is the reason for the penalty, i.e., the threshold for over optimization will be lower for networked sites.


 12:45 am on Jun 24, 2008 (gmt 0)

It could be an issue, but actually I think Google has some ability in understanding the relatedness and separateness of sites, from a duplicate content and cross-linking perspective.


 7:10 am on Jun 24, 2008 (gmt 0)

Hi Whitey.

I might use a checklist to address and reinstate trust for a website. In order to reinstate trust, you need to first define what it is. To me, trust is the probability that the set of documents in your website is good, well linked-to and cited. Trusted websites tend to link only to other quality trusted websites more often than not, provide sticky content, and are well linked-to themselves.

That being said, if I were to follow that definition, I would take the following steps to rebuild:

1. De-optimization of previous strong SEO tactics, if any, slowly and over time. Take a look at menu item repetition with keywords, Title / H1 matching, and how naturally written the content is.

2. Technical - Check the code for any hacks, injected links, or issues on that end. Check for scrapers and canonical issues. Check robots.txt for any issues and, while you are at it, run the homepage and other pages through a header check. Run a load time check and make sure the server uptime is high. Check to make sure there are no glaring issues in the HTML code of the pages.

3. Do consider reducing the number of outbound links from the homepage where possible, and consider reducing links in the case where you run a reciprocal directory of some kind. Again, all of this needs to be done slowly. If you are linked between websites on a network, now might be the time to consider "nofollow" on those links.

4. Do consider linking out to other quality hubs where it makes sense for the visitor and context of the information.

5. Begin slowly building trusted links to the website. There are a handful of high quality paid directories that I believe pass some trust.

6. Slowly add and improve content for the website. Find authority websites and begin looking at ways to develop links back to you from these trusted websites.

7. Link between your own documents from content where it makes sense.

8. If possible, make the hierarchy of the website organized, starting with the widest, most general topic at the home page, and trickling down through 'information silos' - categories - to the longest tail keyword themes deeper in the website. Create a breadcrumbing system so that the longer tail keywords support the next level up, which support the next level up, and so on.
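As a rough sketch of the technical checks in step 2, here's a minimal standard-library Python example: a header/load-time check for a page, and a simple scan of a robots.txt body for Disallow rules that might be blocking crawlers. The helper names (`header_check`, `blocked_paths`) are my own, and any domain you pass in would be your own.

```python
# Sketch of the "technical" audit step: HTTP header + load-time check,
# plus a simple robots.txt Disallow scan. Standard library only.
import time
import urllib.request

def header_check(url, timeout=10):
    """Fetch a URL and return (final_status_code, headers, load_time_seconds)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include body transfer in the timing
        elapsed = time.monotonic() - start
        return resp.status, dict(resp.headers), elapsed

def blocked_paths(robots_txt):
    """Extract Disallow paths from a robots.txt body (simple line scan)."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow means "allow everything"
                paths.append(path)
    return paths

sample = """User-agent: *
Disallow: /tmp/
Disallow: /private
"""
print(blocked_paths(sample))  # → ['/tmp/', '/private']
```

Calling `header_check("https://www.example.com/")` against your own homepage would surface error statuses and slow responses in one go; a Disallow rule accidentally covering important directories is exactly the kind of robots.txt issue the checklist warns about.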

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved