Forum Moderators: Robert Charlton & goodroi
Not sure you can measure TrustRank beyond common-sense guessing, but here's how it works: human editors help search engines combat search engine spam, but reviewing all content is impractical. TrustRank places a core vote of trust on a seed set of reviewed sites, helping search engines separate pages that would be considered useful from pages that would be considered spam. That trust is then attenuated as it propagates to other sites through links from the seed sites.
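The seed-and-attenuation idea above can be sketched as a biased PageRank: the teleport mass lands only on the reviewed seeds instead of being spread uniformly, so trust decays with link distance from them. This is a toy illustration on an invented four-page graph, not Google's actual implementation; the damping value and graph are assumptions.

```python
# Minimal TrustRank sketch (toy link graph, assumed damping of 0.85).
# Trust starts on hand-reviewed "seed" pages and attenuates as it
# flows along outlinks, so pages far from the seeds end up with low trust.

def trustrank(links, seeds, damping=0.85, iterations=50):
    """links: {page: [pages it links to]}; seeds: set of reviewed pages."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    # Seed distribution: all teleport mass sits on the reviewed seeds.
    d = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(d)
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Trust flowing in from pages that link to p.
            inflow = sum(trust[q] / len(links[q])
                         for q in pages if p in links.get(q, []))
            # Damping re-injects mass at the seeds, not uniformly --
            # this is the "biased PageRank" twist of TrustRank.
            new[p] = damping * inflow + (1 - damping) * d[p]
        trust = new
    return trust

graph = {"seed": ["a"], "a": ["b"], "b": ["spam"], "spam": []}
scores = trustrank(graph, seeds={"seed"})
# Trust decays with each hop away from the seed:
assert scores["seed"] > scores["a"] > scores["b"] > scores["spam"]
```

The key design choice versus plain PageRank is that single line re-injecting mass at `d[p]`: with a uniform teleport vector you get ordinary PageRank, while a seed-only vector makes every score a measure of reachability from trusted sites.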
Thanks for starting this thread. I also asked about TrustRank in the update thread. Maybe we can learn something here. Is TrustRank just a theoretical idea being discussed as a possible element in the Google ranking process, or is there evidence that it's something they have included or are in the process of including?
bigace
We are a Google News source site and over the last 2 months we have been up and down like a yo-yo.
E.g. on Aug 28 we went from getting a few good headlines a day to being the top story on everything we ran, and remaining on sub-headlines for days (traffic went up 150,000 uniques per day). Then overnight we went to never appearing on front pages or even sub pages, rather back down the list.
I mailed them and they said it's automatic, etc. But the next day it started again. Then on Sept 22 it dived and hasn't recovered since. Now sites that write up stories based on our story end up way above us. Almost like one person switched us up and then someone had a look and switched us back down.
I kinda thought it might relate somehow to SERPs, but now I think there must be a manual factor.
For instance, earlier this year they changed the way results are displayed, from the latest story from anyone to sites with more trust... I guess. We did OK out of this as quite a big site.
I also noticed for a few hours last week we went back to top lining.
Also of interest is that the story itself is given the weight. So even after we bombed on Sept 22 the stories from days before still ranked.
I also wonder if they are applying the technique to serps in some way...
Anyway, I would guess that TrustRank is based on either an editorial review or a panel of some sort, or maybe it is automatic, based on age of site, links, etc. My only caveat would be that since Sept 22 we have added new sources, writers, etc. So quality has gone up, yet our SERPs and Google News positioning are at their worst ever.
Fundamentally, it's a concept that is totally unnecessary if other algos are working properly and could get Google into a public relations nightmare if it ever transpired that money was changing hands based upon it.
Having said that, I don't rate Google brainpower as highly as others seem to do so I guess it's a coin toss as to whether it really exists or not.
Incidentally, since we are expected to believe that "Page Rank" is named after Lawrence Page, is there a man called Jedediah Trust working at the Plex these days?
Kaled.
Your reading of this is spot on. Google's idea of ranking web sites according to link popularity was conceived in a vacuum. An academic vacuum. In the real world of a market economy they've had to do some fancy tap dancing to accommodate the basic assumptions of that model.
Academic and noncommercial web sites do generally support the link popularity theory. If site A is a good one, site B will link to it out of courtesy and "pure" intentions. I'm guessing that only a relatively small percentage of those types of sites trade links. One may occasionally read in this forum how webmasters of noncommercial sites are affronted by our crass commercialism.
But the day those first two commercial site owners discovered that trading links would raise SERPs, the Google model broke down. The very act of swapping links to raise SERPs is anathema to the model. They just haven't figured a way to stop it.
Now we have people buying links-- a complete distortion of the Google link popularity model. But didn't I read in this forum a few months back that Google Guy gave a tacit approval to the notion of buying links? Was it because they haven't found a way to combat it yet and don't want to admit it?
I've been reading posts in other places offering that the new Jagger update is the beginning of the end for reciprocal links-- specifically because the reciprocal link is a bastardization of the Google link popularity model-- and that the trusted-site idea is the way Google will kill reciprocal linking. If that is so, then those webmasters who managed early on to set up their cosy little cross-linking cartels and who are stealthily buying links will stay at the top.
I just don't see how the pure link popularity model can hold in the real world.
Also, it would be very difficult in most instances for Google to know whether a link was purchased or not. For instance, you can buy your way into a directory or, if you're willing to wait a bit, get free admission. There's no way for a SE to determine just how you came by that "democratic" vote.
"Signals of Quality" is also something that Google engineers have discussed. My guess is that Signals of Quality are part of TrustRank, but I'm only guessing. Signals of quality would probably include things that are mentioned in some of Google's patents, like if a web site is hosted by a "trusted host" or the length of time that a web site's domain name is registered for.
If you have a potential advertiser that is a global company, it's hard for them to understand why they are not at the top of the results for their core business. I guess TrustRank helps solve that.
But I would like to see these things done in the algo, as I think human interference ruins the results. I personally still don't see any difference between trusted content and high PageRank, except that PageRank is algo-driven.
Sites running AdWords are human-reviewed sites which Google links to.
Sites in Froogle are human-reviewed sites which Google includes in its Froogle directory.
As I understand it, these inclusions don't confer any rank of any sort, trust or otherwise?
It seems strange that Google doesn't in some way use this ready pool of human-reviewedness.
Possibly, but I would actually interpret this as already being in use to some extent. I have already seen what I interpret to be some of its effects, if only to a small degree.
In its entirety, it could also be feasible under the following conditions:
Theming from authority sites to outbound links.
Loss of TrustRank for an authority, on a given scale, based on outbound links to unrelated, poor-choice sites (thus putting a damper on 'purchased' links). In this fashion, those who purchased links from sites in the business of selling many outbounds would inevitably be purchasing links with less credibility. Also, the number of outbound links per page, the receiving page's theme and site theme, along with the originating link site, would all play factors in the overall TrustRank score.
Manual review of those seeds over time to evaluate whether they are still authority sites. As any site can turn sour, so can authority sites, and so they must also be reviewed. This would include the possibility of de-trusting a site or admitting others upon periodic review.
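The outbound-link attenuation described above could be sketched, purely hypothetically, like this: the trust a page can pass per link is diluted by its total outbound count and discounted by the share of links pointing at off-theme or poor-choice sites. The function name, weights, and penalty formula are all invented for illustration; nobody outside Google knows any real formula.

```python
# Hypothetical sketch of the outbound-link penalty discussed above.
# A page's passable trust is diluted by how many outbound links it
# carries and discounted by how many of them point to unrelated or
# poor-choice sites. All names and weights here are assumptions.

def passed_trust(page_trust, outbound_links, bad_links, penalty=0.5):
    """Trust each outbound link would receive from a page.

    page_trust: the linking page's own trust score.
    outbound_links: total outbound links on the page.
    bad_links: how many point to unrelated / poor-choice sites.
    """
    if outbound_links == 0:
        return 0.0
    bad_ratio = bad_links / outbound_links
    # Dilute per link, then discount for keeping bad company.
    return (page_trust / outbound_links) * (1 - penalty * bad_ratio)

# A link-seller page with 100 outbounds, half off-theme, passes far
# less trust per link than a focused page with 5 clean outbounds:
seller = passed_trust(1.0, 100, 50)
focused = passed_trust(1.0, 5, 0)
assert focused > seller
```

Under this kind of scheme, links bought from high-volume sellers would carry little weight automatically, without anyone having to prove the link was paid for.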
Thoughts?
Instead, they achieved top ranks.
No, TrustRank is arriving and being implemented as we speak, imho...