moved from another location
Now they are looking for user feedback to help filter spam; this could get messy...
[edited by: tedster at 10:18 pm (utc) on Feb 14, 2011]
Does the link to block access appear on all search results, or just on those that Google thinks are content farms? If it's the latter, it may help webmasters, as they will find out what Google's perception of their site may be.
Regardless of what they use it for, if you block about 10 of the spammiest sites out there, the results are really good. Would love to see them make this for Firefox.
As Butler Lampson famously quipped, "All problems in computer science can be solved by another level of indirection."
Google does not want to be in the position of subjectively separating bad content from good content. Solution: Add a level of indirection; have an algorithm do it. (Yes, they control and keep secret the algorithm. Go away -- you're bothering me, kid.)
Google can't refine their algorithm to successfully identify content-farm bad content from good content (without unacceptable collateral damage). Solution: Add a level of indirection; ask users to do it.
|I wonder if it'll work with the Iron browser? It's Chrome though without Google constantly spying on you. |
So you want to try an extension which submits all its data to Google, but choose a browser because it doesn't submit all its data to Google?
As for the extension itself, I wonder what might happen if I search for my keywords and then block my competitors' sites? Then watch what the extension submits as data to Google, and see if it can be resubmitted multiple times via proxies?
This all seems so manufactured to me. Again.
Who has the resources, money and tech staff to game this system (if it can be gamed)?
How many "blocks" from users would it take to bring down a site that has a couple million pages floating in the serps vs. a smaller site of say 1,000 pages?
Bloggers can be a competitive, cliquey bunch. What happens to a high-profile cooking blogger when the mean girls get organized and start clicking "block"?
What megasites float to the top when the "poorer classes" start taking each other out?
There are a few recipe sites that AREN'T SPAM but I don't like their websites or I'm looking for something different...and they're floating in the top 10 for tonnes of queries. I block them so my searching is less cluttered. Should they really be taken out because people like me are tired of that website and want something different?
I spend quite a bit of time searching for technical phrases related to my industry that tend to pull up not only experts-exchange but other forums as well, usually with just the original question and a "bump"..
I'm looking forward to getting rid of a bunch of unnecessary results that ultimately waste my time.
Now if I could only get a way to use Chrome as an .exe that I could download from my site as needed, instead of installing it at all the offices I go to, or using a jump drive.
Google never said that the user data will do anything all on its own - and I can't see how it could. User data is going to give them a lot of data points, yes, but they're bound to combine that information with a lot of other factors.
I think the data will be tainted in a big way. We are dealing with opinions here, and guess who will have the edge: the big guys. Imagine fans hitting sporting websites and so on... Opinions, not quality.
This could easily be used to narrow the 'manual review for TOS violations' pool down to something more manageable... N 'blocks' triggers a manual review.
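A back-of-the-envelope sketch of that triage idea (the threshold, names, and data structures here are purely my own invention, not anything Google has described):

```python
# Hypothetical sketch: queue a site for manual TOS review once user
# "block" reports cross a threshold N. All names and values are invented.
from collections import Counter

BLOCK_THRESHOLD = 100  # N blocks before a human looks at the site

block_counts = Counter()
review_queue = []

def record_block(domain: str) -> None:
    """Count one user block; enqueue the domain exactly once, at the threshold."""
    block_counts[domain] += 1
    if block_counts[domain] == BLOCK_THRESHOLD:
        review_queue.append(domain)

# Simulate 150 users blocking the same site
for _ in range(150):
    record_block("example-content-farm.com")

print(review_queue)  # the domain appears once, queued at the 100th block
```

The point of the equality check (rather than `>=`) is that a site enters the review queue once, no matter how many blocks pile up afterwards.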
How long before someone comes out with a proper script/program to click the "block" button on the toolbar repeatedly through proxies or similar shenanigans? Change the toolbar ID being sent to Google, change the proxy IP, and click the block button on a competitor's site some 100 times a day. Nice and easy way to take care of SPAM (sites positioned above me).
To me this certainly looks like going down the slippery slope.
Where can I buy dislikes?
|Google never said that the user data will do anything all on its own - and I can't see how it could. User data is going to give them a lot of data points, yes, but they're bound to combine that information with a lot of other factors. |
But I do think that this tool is going to be massively misused, confusing Google even more than it is now!
The JCP issue already seems a good example of this. Without knowing what actually happened behind the scenes of that drama, a manual penalty, if one was already imposed, is not the way to go in my humble opinion...
I would say that devaluing the links (rather than imposing a manual penalty) would be the best approach, unless Google knows the truth behind the whole issue.
But getting at the truth is not Google's business, and they might not have to spend their resources on it. To be fair to all, the best remedy might have been to devalue the links and correct any algorithmic loopholes (which Google claims to have done already).
[edited by: indyank at 1:43 pm (utc) on Feb 15, 2011]
I wonder if this new Chrome extension will be as effective as the thousands of user submitted spam/malicious content reports that are ignored by Google.
JAB Creations, your question, "Any one else tired of accidentally clicking on links to experts exchange?" is EXACTLY the reason I whooped for joy when I saw that Google was supplying an extension for blocking sites in serps. So far, experts exchange is the only site I've blocked. I'll block others, no doubt, but that site is truly the only one that really ticks me off. So yay!
Wow, that's really, really lame. If anyone besides Google did this, they would be laughed off the web.
|There are a few recipe sites that AREN'T SPAM but I don't like their websites or I'm looking for something different...and they're floating in the top 10 for tonnes of queries. |
With you on that one. Searching for a recipe and all the "recipe search engines" bubble to the top - not helpful. Same too with subscription-only content: you're looking for a solution to some technical problem, or a news story, and keep hitting these subscription-only sites. It's like having a locked door behind an open door, and about as useful as a chocolate fireguard!
I don't see Google using this as a direct and "manual" factor for removing results or manually overriding an algorithm-generated SERP, but more as user feedback to corroborate algorithm-based analysis, allowing them to optimize the parameters used to detect content farms/spam/etc.
Good move if you ask me, but as @johnnie said:
|Where can I buy dislikes? |
3..2..1.. till competitors try to get their peers removed from SERPs using random proxy-submitted complaints.
Just a quick point re: experts exchange, if you scroll all the way to the bottom of the page, the solutions are there.
I'm not keen on doing Google's job for them.
I believe TheMadScientist has it nailed...
"This could easily be used to narrow the 'manual review for TOS violations' pool down to something more manageable... N 'blocks' triggers a manual review."
As long as there is a manual review, I think this could be a positive thing... but very tough to police.
This is the silliest thing I've seen since.. the last silly thing I saw.
Great... so within a week or two, I can probably head over to Fiverr, and for just $5, get a seller to get 50 people to:
- Search for "this exact phrase"
- Click the link for "this exact page"
- After 15-30 seconds, report it as junk.
This is so wide open for exploitation... I almost can't wait to try it!
Woohoo, gotta love these new SEO tools Google keeps releasing!
This is one of the best possible moves Google could make and the concerns by many are easily negated.
First off, if you log what you reject like I do, you tend to see patterns, like small clusters of similar IP addresses that spammers come from. If someone tries to block your domain for, say, the term "umbrellas", the breadth of people doing so will be limited and concentrated in things like IP addresses associated with spammers, or patterns not associated with human behavior - the kind of thing you don't even need captchas to catch.
Secondly it's obvious that Google will spend time going through the results as humans, not simply adding this as an algorithm.
Thirdly, what will most likely happen is that for the top disliked sites, the people who blacklisted them will have their information analyzed most closely, instead of Google trying to sink, say, a hundred thousand top spammy sites.
...and that's just the beginning of it all.
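To illustrate the clustering point above: a crude sketch of flagging a domain's block reports as suspicious when they bunch up in a couple of subnets instead of spreading out organically. The thresholds and /24 heuristic are my own invented example, not a known Google method:

```python
# Hypothetical sketch: block reports concentrated in very few /24
# subnets look like a proxy farm; reports spread widely look organic.
def subnet24(ip: str) -> str:
    """Collapse an IPv4 address to its /24 prefix, e.g. 203.0.113.7 -> 203.0.113."""
    return ".".join(ip.split(".")[:3])

def looks_coordinated(report_ips, min_reports=10, max_subnets=2):
    """True when many reports come from very few subnets."""
    subnets = {subnet24(ip) for ip in report_ips}
    return len(report_ips) >= min_reports and len(subnets) <= max_subnets

# 12 blocks from two adjacent subnets: concentrated, suspicious
proxy_farm = [f"203.0.113.{i}" for i in range(6)] + \
             [f"198.51.100.{i}" for i in range(6)]
# 12 blocks scattered across 12 different subnets: organic-looking
organic = [f"10.{i}.{i}.1" for i in range(12)]

print(looks_coordinated(proxy_farm))  # True
print(looks_coordinated(organic))     # False
```

A real system would obviously weigh far more signals (account age, toolbar IDs, query history), but the shape of the check is the same: measure how diverse the reporters are, not just how many there are.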
dublinmike, in regards to experts exchange: it's a deceitful site that strong-arms people into registering and participating for answers, and I think there's money involved too.
engine, like I said it would be easy to analyze the data to see what sites people dislike the most and from my perspective it would be very easy to look at various things to see who is a spammer, who is a human, what is legitimately disliked versus competitors trying to blacklist each other, etc.
Besides, when a site tries to abuse the system, like JC Penney apparently did, Google deals with the problem. Since content farms are clearly trying to abuse the system, I'll be happy to see them get removed from results. Links don't appear in results according to the page alone, but also by what Google thinks your site talks about a lot... so when you start spamming tons of junk pages with "answers", and Google notices that many high-ranking sites end up having users click on links that rank lower for the same search, you know there's a problem. Google is dealing with the problem and I'm happy about that. :)
I think it's time to stop thinking "One Set Of Search Results" and move on to "Individual Search Results". Once you do that, you can stop thinking about undermining your competition; with this, it will only be possible to hide them from yourself.
As for how "trusted" links are it will become "how trusted they are to you". In other words YOUR link clicking history will alter YOUR search results.
We're getting close, if not already there. Matt did warn it would be best to work on just one site instead of many, making sure it has traction is more important than ever.
How do I know this? I don't, it's just a hunch, but after watching Google roll out individual targeting on ads it makes sense to extend it to results. Example: If you never click on adult links you'll stop seeing adult links near the top of results but someone who clicks on them all will see more of them in his/her results.
I hope the funnel effect of seeing more of what you know won't detract from finding things you don't know about, yet.... but I'm sure they've worked that out too.
Multiple user households, or public computers, are the ultimate challenge :-)
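The hunch above can be sketched in a few lines. Everything here is invented for illustration (the scores, the field names, the boost/penalty values); it just shows the shape of per-user re-ranking where your own block and click history moves results:

```python
# Hypothetical sketch: re-rank one user's results. Domains they blocked
# sink out of sight; domains they click often float up. Values invented.
def personalize(results, clicked, blocked):
    """results: list of (domain, base_score) pairs; returns a re-ranked list."""
    def score(item):
        domain, base = item
        if domain in blocked:
            return base - 100.0                    # effectively hide it
        return base + 5.0 * clicked.get(domain, 0) # boost per past click
    return sorted(results, key=score, reverse=True)

results = [("experts-exchange.com", 9.0),
           ("stackoverflow.com", 8.0),
           ("example-blog.net", 7.5)]
ranked = personalize(results,
                     clicked={"stackoverflow.com": 3},
                     blocked={"experts-exchange.com"})
print([domain for domain, _ in ranked])
# stackoverflow.com rises to first; experts-exchange.com drops to last
```

Note the key property of this model: the penalty only exists in *your* scoring function, so blocking a competitor changes nothing for anyone else's results - which is exactly the "hide them from yourself" point.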
|If installed, the extension also sends blocked site information to Google, and we will study the resulting feedback and explore using it as a potential ranking signal for our search results. |
|I'm not keen on doing Google's job for them. |
You don't do their job. It's sort of like their spam report, where you can submit and resubmit forever with no end result seen for your efforts. Bottom line - it's another addition to spying on users.
|Who has the resources, money and tech staff to game this system (if it can be gamed)? |
Mechanical Turk does.
This kind of feedback is really easy to game, which concerns me and makes me doubt how big an influence flagging a site will actually have on the algorithm. If it has a large influence, it makes it way too easy for me to game the system and have my competitors' sites flagged as spam.
My guess is that at least initially Google will gather this information and analyze it without incorporating it in the algo. Perhaps after they have a large quantity of data and they are convinced there is little chance of widespread negative sentiment arising from rolling it out, you'll see it rolled out to select data centers, and perhaps to the public.
They may also ultimately use it to supplement the manual spam reports... i.e. if a site gets a large quantity of people flagging it as a percentage of its total traffic, it may be flagged for manual rather than algorithmic review.
Or... they could just be using it to aggregate data about what types of sites people feel are spammy, extrapolate what those sites have in common, and adjust the weights of those factors in the algo, rather than use the flagging signal in and of itself as a ranking factor.
The bottom line is that if people find your site interesting or useful, you probably don't have much to worry about. If they don't, you probably don't deserve to rank well anyway. Unfortunately, there are a lot of sites with little value out there that do rank well and earn their owners money. That's where you'll hear the loudest screams coming out of this.