It was only a matter of time...
The data on blocked sites probably helps to confirm or reinforce what the "short click" data already seemed to indicate.
I'm pretty confident that Google has been using "short click" and "long click" data for a long time. Now, since logged-in users get an option to block the site after a short click, that helps to eliminate some of the noisiness in the short click signal.
System: The following 10 messages were spliced on to this thread from: http://www.webmasterworld.com/google/4362332.htm [webmasterworld.com] by tedster - 9:41 am on Sep 15, 2011 (EDT -4)
In the past, they mentioned that they _may_ use "block this site" data in the ranking algorithm, but tucked away in the following blog post, they claim to be using it now:
|We’ve also started incorporating data about sites people have blocked into our general search ranking algorithms to help users find more high quality sites. In the future, we may experiment with data from non-US users’ blocked sites. |
They did already; I think they've just expanded it to other TLDs.
This should create some jobs.
Folks working on Mechanical Turk will probably benefit...G is good at creating jobs worldwide...
Spam away! I can't imagine it will have a large impact.
|Spam away! I can't imagine it will have a large impact. |
They definitely create more jobs...
2) share on google+
3) block in SERPS
4) jobs for teams that feed them every possible user signal, as they are moving toward directly measuring user behavior.
5) An expanded google quality team
6) More third party manual quality reviewers
Wait I am not sure I am following what this is about ("block this site"), can someone explain this to me?
Here's a thread from June this year where we discussed this feature:
"Block all listing from this site" - message in the SERPs [webmasterworld.com]
It's an option you get when you are signed in, click on a search result and then quickly return to the SERP for another choice. Google (and other search engines, for that matter) has long differentiated between these "short clicks" and "long clicks", where it seems the user became engaged with the destination page for a reasonable amount of time.
The blocking option gives them a new layer of information/signal to confirm that the short click was because the destination site seemed crappy.
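The short-click / long-click distinction described above can be sketched roughly as follows. This is purely illustrative: Google's actual thresholds, signal names, and weighting are not public, and the 180-second cutoff below simply mirrors the ~3-minute figure one poster reports observing in their own tests, not any documented value.

```python
from dataclasses import dataclass

# Assumed cutoff, based only on one poster's informal observation.
SHORT_CLICK_CUTOFF_SECONDS = 180

@dataclass
class ClickEvent:
    url: str
    dwell_seconds: float   # time between click-through and return to the SERP
    blocked_after: bool    # user pressed "Block all results from this site"

def quality_signal(event: ClickEvent) -> str:
    """Classify one click into a coarse quality signal (illustrative only)."""
    if event.blocked_after:
        # An explicit block disambiguates a short click: the user is saying
        # the page was bad, not that they were merely comparison shopping.
        return "negative-confirmed"
    if event.dwell_seconds < SHORT_CLICK_CUTOFF_SECONDS:
        return "negative-weak"   # short click: a noisy negative hint
    return "positive"            # long click: user engaged with the page

print(quality_signal(ClickEvent("example.com/a", 12.0, True)))    # negative-confirmed
print(quality_signal(ClickEvent("example.com/b", 45.0, False)))   # negative-weak
print(quality_signal(ClickEvent("example.com/c", 400.0, False)))  # positive
```

The point of the sketch is the layering: dwell time alone is ambiguous (a quick return can mean "bad page" or "just comparing"), while a block action converts a weak negative hint into a confirmed one.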
Note to self: if a competitor is just above me for a keyword - BLOCK THIS SITE.
The damage that will do to his/her website = minimal
The peace of mind of knowing they MAY have done it to you and you've evened things up = priceless.
No, I'm not proud of my point of view; in fact it's rather childish. But since that button is pure opinion, my opinion is that I should dominate the rankings. I work hard to be the best, and Google asked.
>The peace of mind of knowing they MAY have done it to you and you've evened things up = priceless.
That's funny ;-) But since you have to be logged in for it to work, don't you think this could work against you?
Sort of like the "user democracy" involved in the false Permanently Closed tags on some Google Places pages?
Closed, Says Google, but Shops’ Signs Say Open [nytimes.com]
[edited by: tedster at 1:52 pm (utc) on Sep 15, 2011]
[edit reason] Switch link to original source [/edit]
|since you have to be logged in for it to work, don't you think this could work against you? |
I'm sure there's at least some technology in place to filter out competitive blocking. How good it is, and whether the blocker can be hurt by such actions remains to be seen.
Look at it this way; now that AdSense has tanked for many, it gives those old click rings fresh work ;-) It's like a Google jobs program.
Can somebody come up with a way this is NOT a prisoner's dilemma when considering whether to block competitors sites?
Hmm... After some testing, I receive the "block this site" text without being logged in. I also tested how long a visitor must stay on a site before the message stops being displayed back in the SERPs. In my results, anything under 3 minutes will display the message when returning to Google. I also tried navigating multiple pages on a site and still received the message in the SERPs if the total visit was under 3 minutes.
That strikes me as much longer than a "short" click and return.
Does anyone know if Google is also factoring in engagement on the Gmail system (e.g. reporting as spam, unopened messages, short visits) as a block metric for the SERPs? Yet another form of user democracy.
This is going to be exploited, big time.
Everyone getting the option to block must be logged in to some Google account. That does give them one handle on protecting the system from abuse.
Actually, even if you are not logged in you can get the option to block a site by clicking to the site, then press your back button and the option will be there. You can bet that information is being used.
Sure hope the FTC is taking a VERY close look at what they are trying to do.
I think you have to be logged in; I tried it. This will be another source of FUD: "maybe your site is penalized because..."
This will probably have some weight, assuming anyone wanting to rank on Google will still have domains in a year or two: Google+ is outranking everything in a few searches I tried. It's Google+ at #1, then 2-3 articles with G+ identities, and then a very famous techie site, or the original.
|It's Google plus #1, then 2-3 articles with G+ identities and then a very famous techie site, or the original. |
I haven't seen this yet, though I'm hearing it from Brett, you, and a few others. But if this is more widespread now, it is the most insulting thing Google has done to the web. A less-than-one-year-old echoing site getting ranked by the new algos above all others! Why were the other echoing sites like Facebook kept under control all these years? Isn't this obvious manipulation by Google? It may be algorithmic manipulation, but it stinks.
If by echoing you mean mashup: the BIGGER the mashup, the BIGGER the chance the original site is fubar.
Google LOVES mashups, so say the results. They claim to want mashups with 'more', but plain mashups are doing beyond fine right now, at least at Google.
I have a couple of examples in my niche of sites getting over 500,000 unique visitors a month from Google without a stitch of original content and with no special features, in fact one of them is a 100% affiliate site. Google's got some serious work to do, fast, which is why I'd expect some splash-over onto non-spam sites getting caught up in a Google clean up.
Freedom to research, to express, and to publish; to do everything unless it is a crime.
But they could add some kind of filter to let people know that a website may be spamming, bulk-copying, or bulk-publishing content. They are very smart and hopefully might already be doing this. But do they use the PageRank system now along with 'plus'?
I hope the two synchronize and relate well. Webmasters often have complaints about Google banning them, so there should be as few rules as possible restricting content publishing.
Overall, it is the population of real users who decide what they like to visit, what they like to read, and what they think is helpful or relevant to them.
'Plus' or 'like' makes good sense, if it works out well. Intelligent, automated machines can do great things, I believe.
Sgt, I don't even want to call them mashups, because they aren't. A mashup is a site that aggregates good related content (text, videos, maps, photos, etc.) and presents it in a form that users love. However, I want them to do that only if they are allowed to aggregate such content by its original owners; otherwise it is copyright violation, in my opinion.
But I have seen several sites that mash up content from Google platforms being dumped on their SERPs in the last 1-2 years.
Google+ will only have 1 or 2 lines of text and a thumbnail from the shared page. Unless the user comments on those pages are of really high quality, which I doubt, this is a very poor echo of the original copy. Ranking them high because they are on a Google platform, and hence automatically qualify as an authoritative site, is manipulation.
There were several such echoing sites, like Digg, StumbleUpon, etc., which are almost dead in the SERPs despite being in existence for so many years. But I don't want them in the SERPs either, as they aren't the original sources.
Though the Google mouthpieces claim they want only original and unique content in their SERPs, they are choosing to show Google+ posts in the SERPs!
I agree that this is manipulation; they already do it with Google Books, and these can be shown. A lot more is waaaaayyyyy too convenient for them, especially for certain high-priced keywords. They want to force us to promote their services by using the power of search. In other words, blackmail: use/promote G+ or else...
|Google+ will only have 1 or 2 lines of text and a thumbnail from the shared page. Unless the comments on those pages from the fans are of real high quality which I doubt, this is a very poor echo of the original copy. Ranking them high because it is on a google platform and hence automatically qualifies to be an authoritative site, is manipulation. |
However, you'll probably be attacked for using non-Google-friendly language, just as it is not acceptable to call Panda a penalty. Why can't we say that Panda is a penalty? Simply because Google doesn't say it is one; they call it a promotion of other sites.
I blocked that useless site eHow months ago using this feature; I can't have been the only one. I expect they will have been affected by this?
I blocked eHow too. It does nothing but litter the Internet and waste peoples' time.
The block doesn't actually work, though, since eHow still shows up in my SERPs.