|Matt Cutts video on standards for manually removing webspam|
In this video, Matt Cutts suggests a flexible but nevertheless consistent approach to dealing with web spam....
Does Google use a set standard for manually removing webspam?
Google Webmaster Help - March 4, 2013
As Matt describes it, a spammer's history does enter into it. The spam team is sympathetic to situations a webmaster might have inadvertently gotten into, while being much harder on repeat spammers and on exploits "involving malware, hacked sites, or really malicious stuff".
This seems to confirm comments that I've seen here that say, "This site was penalized just because *I* am the one who owns it."
Well, apparently Cutts is talking about manual actions. But if the spam is so blatant and obvious, why didn't the algorithm identify and penalize it in the first place, so that no manual review would be needed?
Also, why does the algorithm make so many mistakes, mis-identifying sites as spam and wrongly demoting them? It seems to me that the algorithm is the real problem. Instead of making boring videos, Cutts should start an effort to improve the spam detection parts of the algorithm.
It's good that they have organized forms of appeal.
I hope the manual reviewers have good knowledge of what counts as spam and follow a precise set of rules.
I was left with the impression that manual spam actions are ALL he and his team do. Everything he talked about (training, shadowing, reviews, etc.) is a hallmark of training a *human* to perform the task; there was not even a hint of doing anything about the algo, nor was the word even used once (although the video was as boring as ever, so I might have tuned out inadvertently :) ). Also, not in this video but in his other interactions, he often says things like "I need to check with the algo team" which makes me think that he is not invited to their meetings often and he and the rest of the spam team do not contribute much to the algo. There's more manual work, it seems, than Google would like us to believe.
|Instead of making boring videos, Cutts should start an effort to improve the spam detection parts of the algorithm |
|he often says things like "I need to check with the algo team" which makes me think that he is not invited to their meetings often |
Huh. I'd have assumed it was equivalent to "The computer can't find your file" which simply means "Someone, possibly even me, screwed up but I'm not going to tell you who because part of my job is to cover for them".
Google hires people to look through a series of websites and decide which are quality, spam, malicious, brand, etc., along with various other categories. They can do this on a website-by-website basis or even on a keywords-only basis (if it is a brand name, for example). They hire some Americans, but they also hire many non-Americans to do the job. You can see Google listings with this job description in Indonesia and Thailand especially. They do require "fluent English speakers".
They give the same websites/keywords to a number of people, and if they all come up with the same answer, then that is how it is treated. If a website is marked as spam, for example, it is labelled as such until a manual review is requested and someone from Google looks at it directly.
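The rating process described above amounts to simple consensus labeling: independent raters judge the same item, and a label sticks only when they agree. Here is a minimal sketch of that idea; the function name, label strings, and unanimity rule are my own illustrative assumptions, not anything Google has documented.

```python
from collections import Counter

def consensus_label(ratings, threshold=1.0):
    """Return a label only when enough raters agree, else None.

    ratings:   labels (e.g. "spam", "quality") from independent raters.
    threshold: fraction of raters who must agree; 1.0 means unanimous,
               matching the "if they all come up with the same answer"
               rule described above. All names here are hypothetical.
    """
    if not ratings:
        return None
    label, count = Counter(ratings).most_common(1)[0]
    return label if count / len(ratings) >= threshold else None

# Unanimous agreement yields a label; disagreement yields None,
# i.e. the item would be deferred or escalated rather than acted on.
print(consensus_label(["spam", "spam", "spam"]))     # prints: spam
print(consensus_label(["spam", "quality", "spam"]))  # prints: None
```

Lowering `threshold` (say, to 0.66) would model a majority-vote variant instead of strict unanimity.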
Sorry if this shouldn't be posted here, but I have been looking for jobs in Southeast Asia and I run across these listings all the time, so I assume the information is public.
I often wonder what fraud risk management Matt has with his webspam team.
There is corruption at the highest levels of business and government.
What management does Matt have in this area?
Is it possible for someone on the team to maliciously remove a site, and is there a possibility that some web spam is protected?
@cabbie, I imagine the websites that Matt sees and removes are ones that workers farther down the food chain have submitted as spam. They probably do have the capability to remove sites without reports from the workers below them, but I'm sure that would look fishy and would be scrutinized by their co-workers.
|which makes me think that he is not invited to their meetings often and he and the rest of the spam team do not contribute much to the algo. |
|Systems and methods for detecting hidden text and hidden links |
Inventors: Schneider; Fritz (San Francisco, CA), Cutts; Matt (Mountain View, CA)