
Google Adds "Block Domain" Option To Search Results

   
9:14 pm on Mar 10, 2011 (gmt 0)

5+ Year Member



Google started with the Chrome extension Personal Blocklist to give Chrome users an option to block results in Google Search.

News posted on the Google Blog a few minutes ago reveals that Google will roll out a block-by-domain option in Google Search, first in the US and then in other parts of the world.

You’ve probably had the experience where you’ve clicked a result and it wasn’t quite what you were looking for. Many times you’ll head right back to Google. Perhaps the result just wasn’t quite right, but sometimes you may dislike the site in general, whether it’s offensive, #*$!ographic or of generally low quality.

For times like these, you’ll start seeing a new option to block particular domains from your future search results. Now when you click a result and then return to Google, you’ll find a new link next to “Cached” that reads “Block all example.com results.”


[googleblog.blogspot.com...]
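To make the mechanics concrete: the feature is essentially a per-user filter applied to the result list before the page is rendered. Here is a rough sketch in Python, purely illustrative; the result structure and function name are invented, not Google's implementation.

from urllib.parse import urlparse

# Toy sketch, not Google's code: drop results whose domain is on the
# searcher's personal blocklist before the results page is built.
def filter_blocked(results, blocked_domains):
    kept = []
    for result in results:
        host = urlparse(result["url"]).hostname or ""
        # "example.com" on the blocklist should also cover "www.example.com"
        if any(host == d or host.endswith("." + d) for d in blocked_domains):
            continue
        kept.append(result)
    return kept

# Hypothetical usage
results = [
    {"url": "http://www.example.com/page", "title": "Example page"},
    {"url": "http://other.org/article", "title": "Other article"},
]
print(filter_blocked(results, {"example.com"}))  # only the other.org result remains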
2:22 am on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Another mass data source for Google to chew on. I note that they do not "currently" plan to use this personalized data to inform the general ranking algo, but they certainly will be looking at it to see if it's useful for that kind of thing.
5:54 am on Mar 11, 2011 (gmt 0)

5+ Year Member



They like the Chrome plug-in so much that, as stated with the Panda/Farmer algo, they might as well incorporate it into search.

Looks like getting rid of the competition might be a matter of hiring a service with lots of proxies in the future.
6:35 am on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



I would think the use could be more of a 'what do we need to look for and remove' generalization than direct input ... If it's direct, only the sites (pages) users clicked block for would be affected, and that imo would be considerably less broad (not system wide) and much more easily manipulated than using the data they receive about blocked sites (pages) to verify or develop a new layer to the algo. So I tend to believe them when they say they don't have plans to implement the data directly, because rather than 'site specific' input, the data and patterns imo could be used more effectively, and with less room for manipulation, as a 'guideline' for future changes.

Also, they would put independent 3rd parties in control of their business if they used the blocked site data directly in the algo and I think they're a bit smarter than that ... They have enough trouble keeping links and content straight ... I doubt they'll let the blocked site data directly impact the algo any time in the near future.

Think of it like a spam report people don't have to fill out, my guess is that's close to how they'll be using the data...
3:20 pm on Mar 11, 2011 (gmt 0)



Well, what I don't like about this new plugin is how spammers can actually use it. I think you'd agree that most users won't block any site, or do anything besides close the page, unless it's way too spammy or they're a web guru in general.

Spammers can actually start blocking middle-level sites using this plugin to artificially increase their own rankings anytime Google starts using this data in their rankings.

Human involvement, especially in this case, could turn out to be very damaging for Google itself.
4:01 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Where this might have unforeseen consequences (or maybe the "freshness" element is intentional on the part of the plex) will be on those sites with "evergreen" content that actually don't update (or haven't updated) their content in a long time, because it always ranks well .. so why, 'til now, bother (one doesn't fix what isn't broken) .. Have some of these myself.

If the searcher sees repeatedly that for the same query (say they are researching a "how to" or looking for camera reviews on a specific model) the same site(s) come up at or near the top .. but the content is unchanged ..

They may "opt" to remove the site from results .. not because it displeases them, but because they think "yeah yeah, read that, know what it says .. don't show it to me again" .. even though they may actually consider the site a quality resource on the subject.

Most of us here know how to use advanced "operators" ( when they work ;-) ..but the average surfer may well not realise that there are many possible interpretations of "don't show me this site/page again".

Dilemma .. do you leave what ranks very well ( and always has done ) alone ..or do you tweak it ( and from time to time re-tweak it ) to avoid it appearing stale and getting "read that ..it's good ..but show me something new on the subject ..don't show me that again" ..and maybe getting caught in a future "clean up" in spite of being an authority ?

[edited by: Leosghost at 4:02 pm (utc) on Mar 11, 2011]

4:01 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Spammers can actually start blocking middle-level sites using this plugin to artificially increase their own rankings anytime Google starts using this data in their rankings.

Well maybe, but I was thinking about how you'd accomplish this, if, say, you wanted to bury someone using this method. In order to become close to statistically noticeable to Google, you'd have to do a *lot* of clicking on that block link, and you'd have to do it in some fashion that the pattern didn't look like a pattern - from all kinds of different IP numbers, locations, times of day, not in bursts, etc. Probably could be done, but it wouldn't be easy, and the minute Google detects it's being manipulated, all they have to do is stop looking at the data.
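To put that point in concrete terms, here is a toy Python sketch of the kind of pattern check described above; the thresholds and field names are invented, not anything Google has published. Blocks for a domain that pile up behind a few IPs, or inside a narrow time window, are exactly the "pattern that looks like a pattern" that would get the data thrown out.

from collections import Counter

def looks_manipulated(block_events, min_events=100,
                      max_ip_share=0.05, max_hour_share=0.25):
    # block_events: list of dicts like {"ip": "1.2.3.4", "hour": 14}
    if len(block_events) < min_events:
        return False  # too little data to call it either way

    ip_counts = Counter(e["ip"] for e in block_events)
    hour_counts = Counter(e["hour"] for e in block_events)

    # Share of all blocks coming from the single busiest IP / hour of day
    top_ip_share = ip_counts.most_common(1)[0][1] / len(block_events)
    top_hour_share = hour_counts.most_common(1)[0][1] / len(block_events)
    return top_ip_share > max_ip_share or top_hour_share > max_hour_share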
5:24 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member crobb305 is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I wish they would fix one problem at a time (i.e. Panda), before giving another tool to spammers.
5:47 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member




Looks like getting rid of the competition might be a matter of hiring a service with lots of proxies in the future.

1st thing that crossed my mind! :-D
6:04 pm on Mar 11, 2011 (gmt 0)



Dear,
we provide 5000 block site for $4.99.
Plz paypal us at


;)
6:50 pm on Mar 11, 2011 (gmt 0)

5+ Year Member



Netmeg, that actually depends on how many users are making use of the feature. If only a few use it or notice it, it may not take much at all to flag a site. I'm personally waiting for the first Google blocker application that automates the whole blocking process along with Google account creation and proxy support.
7:15 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



That would be pretty easy for Google to detect. If not many people use it, then it's not providing enough data to get very far on Google's radar. If it looks like it's being misused, they'll toss it. (Might take them a while, but they will) They don't have any reason or interest in maintaining that kind of poison.
7:20 pm on Mar 11, 2011 (gmt 0)



Netmeg, screwing competitors with phony blocks will work just like paid links do wonders for your SERPs. Some will get caught, but ...
7:31 pm on Mar 11, 2011 (gmt 0)

5+ Year Member



Not so easy to manipulate if done right. Google Profiles list other online accounts that a user owns; that could help verify a user as a real person. The activity of each user could be used too: only those with a natural activity profile in their Google account (Email, Buzz, FriendConnect displays on web sites, etc.) would be counted when blocking sites. There are many ways to figure out whether an account belongs to a real person.
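A rough sketch of that idea, with invented signals and weights just to show the shape of "only count blocks from accounts that look like real, active people":

def account_trust_score(account):
    # Hypothetical signals; none of these names come from Google.
    score = 0.0
    score += min(account.get("account_age_days", 0) / 365.0, 2.0)    # long-lived account
    score += 1.0 if account.get("linked_profiles", 0) >= 2 else 0.0  # other accounts listed on the profile
    score += min(account.get("emails_per_week", 0) / 20.0, 1.0)      # ordinary day-to-day activity
    return score

def count_block(account, threshold=1.5):
    # Only tally a block report if the account clears the trust threshold.
    return account_trust_score(account) >= threshold

print(count_block({"account_age_days": 900, "linked_profiles": 3, "emails_per_week": 10}))  # True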
7:37 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



Do you people really think they are going to use this as anything more than a 'spam report' any time in the near future? Why would they do that and allow all the manipulation you all keep talking about to even be involved?

Do you really think they haven't thought of all the ways people will try to abuse it if blocks cause any type of direct ranking change beyond the individual user?

They really can't allow it to have a direct 'widespread' rankings impact, for all the reasons for abuse you all suggest, but they can certainly use it to flag a site for review, write new algos based on the info they receive, or verify the direction of new algos during testing.
8:09 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member



They'll eventually consider building an algo that looks for commonality between blocked sites. Of course, I would too. Blocking a site can't directly result in it not appearing to others just yet because, as is pointed out above, it could be gamed.
8:59 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google explains the logic behind the feature:
"Hey! You don't like your results? Change them yourself! It's not like you're going to use another search engine. We're it, babyface. Why should we work at providing compelling search results? We distribute ads. That's what we get paid for. Could we or Yahoo done this 15 years ago? Heck no, but that was then and this is now. Now is the time for us play with other stuff. Mobile. Video. Whatever. Basic web search is soooo 20th Century."
10:40 pm on Mar 11, 2011 (gmt 0)

WebmasterWorld Senior Member dstiles is a WebmasterWorld Top Contributor of All Time 5+ Year Member



Netmeg - the average (very cheap!) botnet could easily post enough "do not show" reports using a faked UA.
3:16 pm on Mar 12, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I doubt this will get much mainstream use, because it means making people work for better results. However, if I'm wrong it could have some interesting effects. Basically webmasters will have to either provide consistently good pages and ensure the titles are accurate, or split sub-standard content onto lots of different domains to avoid getting blocked. This could be the change that does what the farmer update could not.
9:29 pm on Mar 12, 2011 (gmt 0)

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Netmeg - the average (very cheap!) botnet could easily post enough "do not show" reports using a faked UA.


For a while maybe. But Google would pick it up. If for no other reason than if it worked at first, people would get greedy and start using it to death. They always do.
10:42 pm on Mar 12, 2011 (gmt 0)

WebmasterWorld Senior Member dstiles is a WebmasterWorld Top Contributor of All Time 5+ Year Member



But would they pick it up? They haven't picked up an awful lot of exploit sites, which should be a lot easier to recognise.

From a recent report: "Google tops comparative review of malicious search results -- again"

* 34,627 malware samples found
* 1 in 1,000 search results leads to malware
* 1 in 5 search topics leads to malware
* Number 2 Search Term Leading to Malware (term removed)
10:49 pm on Mar 12, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



For a while maybe. But Google would pick it up. If for no other reason than if it worked at first, people would get greedy and start using it to death. They always do.


I can think of many ways to game the system that they will never catch if they start using that data. I won't divulge them; however, if they do start using that data, I will certainly take advantage of it and game the living heck out of it.

I would at least hope, as egotistical as Google is, that they would not do something so short-sighted.
5:52 am on Mar 15, 2011 (gmt 0)

5+ Year Member



It would be great if they added some tools that were actually useful. For example, within a search it would be great to be able to quickly block a particular domain extension from that search (without having to screw around with an advanced search).
10:00 am on Mar 15, 2011 (gmt 0)

5+ Year Member



Excellent, what a useful tool. I for one am completely sick of the results. It has got to the point that I know what I'm going to get before I've even searched, so I don't even bother!

The problem is that the same sites show up all the time. I know about wiki* and a particular large retail site; I don't need them in the results, and if I wanted them I would have gone direct to their sites. The first page is therefore returning low-quality results, even though they are high-quality sites, which makes it completely useless.

So I will be blocking a certain wiki site, a large retail site, a particular how-to site and other low-quality results.
7:13 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



A recent Wired.com interview on Panda shows how they could make use of the data; a rough sketch of the overlap comparison they mention follows the quote...

Cutts: There was an engineer who came up with a rigorous set of questions, everything from: “Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?” Questions along those lines.

Singhal: And based on that, we basically formed some definition of what could be considered low quality. In addition, we launched the Chrome Site Blocker [allowing users to specify sites they wanted blocked from their search results] earlier, and we didn’t use that data in this change. However, we compared and it was 84 percent overlap [between sites blocked by the Chrome blocker and downgraded by the update]. So that said that we were in the right direction.

Wired.com: But how do you implement that algorithmically?

Cutts: I think you look for signals that recreate that same intuition, that same experience that you have as an engineer and that users have. Whenever we look at the most blocked sites, it did match our intuition and experience, but the key is, you also have your experience of the sorts of sites that are going to be adding value for users versus not adding value for users.

Source: [wired.com...]
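Read as a simple set comparison, the 84 percent figure is just: of the sites users blocked most, what share was also downgraded by the update. A back-of-the-envelope sketch with toy data (the real site lists and methodology aren't public):

def overlap_share(blocked_sites, downgraded_sites):
    blocked = set(blocked_sites)
    return len(blocked & set(downgraded_sites)) / len(blocked)

blocked = {"siteA.example", "siteB.example", "siteC.example", "siteD.example"}
downgraded = {"siteA.example", "siteB.example", "siteC.example", "siteX.example"}
print(f"{overlap_share(blocked, downgraded):.0%}")  # 75% with these toy sets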
7:35 pm on Mar 15, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



We have quite a discussion going about that Wired article: Matt Cutts and Amit Singhal Share Insider Detail on Panda (Farm) Update [webmasterworld.com]
4:21 am on Mar 16, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



It's just goog building up more of their walled garden.
11:45 am on Mar 16, 2011 (gmt 0)




I agree this can and will be gamed, in the same way links and page rank are gamed by hacking sites. Would be pretty easy to do with botnet / hidden iframes.
8:54 pm on Mar 17, 2011 (gmt 0)

10+ Year Member



Got this one sorted, but I still can't see an option to "unblock" a domain, and I have read all the docs on the Google blog.

[edited by: Jessica97 at 9:32 pm (utc) on Mar 17, 2011]

9:20 pm on Mar 17, 2011 (gmt 0)

10+ Year Member



And also, how is one supposed to get the screen to "unblock" a blocked domain?