

Google News Archive Forum

This 125 message thread spans 5 pages.
Google says it cannot change results
ScottM




msg:83457
 2:02 am on Apr 7, 2004 (gmt 0)

"Our search results are not manipulated by hand. We're not able to make any manual changes to the results."

So much information...not sure who is telling the truth.

Please, let's keep this to a discussion about Google and not the controversy of the topic.

[news.com.com...]

 

Brett_Tabke




msg:83458
 3:12 pm on Apr 7, 2004 (gmt 0)

"Google's search results are solely determined by computer algorithms that essentially reflect the popular opinion of the Web," he said. "Our search results are not manipulated by hand. We're not able to make any manual changes to the results."

That is as close to a public admission that they are powerless to act against people gaming their system as I have ever heard.

BlackRhino




msg:83459
 3:17 pm on Apr 7, 2004 (gmt 0)

That's an interesting comment by Google. I seem to remember (it may be old age) that Google once mentioned, either at a conference or maybe via GG, that they will artificially lower a spammer's position in the index to see what techniques the spammer will use to rank again.

Brett_Tabke




msg:83460
 3:24 pm on Apr 7, 2004 (gmt 0)

So the infamous "pr0 [google.com]" was just a figment of our imagination? Or do they claim that was also machine generated? (which I would not doubt)

sem4u




msg:83461
 3:26 pm on Apr 7, 2004 (gmt 0)

Interesting quotes. So there will be no more hand removals of sites then?

treeline




msg:83462
 3:27 pm on Apr 7, 2004 (gmt 0)

What "manual changes to the results" are depends on the precise context of the statement. It wouldn't necessarily include deleting spammy sites from the index.

idoc




msg:83463
 3:28 pm on Apr 7, 2004 (gmt 0)

"Our search results are not manipulated by hand. We're not able to make any manual changes to the results."

I take that to apply to regular serps. I think they can "pr 0" a site if they want. I don't think they want to get into the business of overtly editorializing the serps. I think that *would* lead to trouble for them.

Marc_P




msg:83464
 3:29 pm on Apr 7, 2004 (gmt 0)

I think they "hand removed" a file sharing program's site from the SERPs not so long ago ...

kaled




msg:83465
 3:33 pm on Apr 7, 2004 (gmt 0)

There are lies, damn lies, statistics and official excuses.

We all know that Google CAN and DOES adjust search results by hand. In addition I believe it also does its best to comply with national laws, in China for instance.

Now, what was the name of that site?... searchqueen perhaps, or was it searchemperor, searchprince, ah yes I know searchk...

Kaled.

Brett_Tabke




msg:83466
 3:33 pm on Apr 7, 2004 (gmt 0)

Removing a site from the serps for legal reasons (zapping the site) is much different than just pushing it down somehow in the rankings. Legally, they have to be able to remove a site to comply with copyright laws. Additionally, once a site is removed, the fabric of PageRank unravels through the sites it linked to. So if a high-ranking site were removed, all the sites it linked to would also lose a bit of rank (which is what we believe happened in the sk issue).
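Brett's point about rank unraveling can be sketched with a toy power-iteration PageRank. Everything here — the four-page graph, the damping factor, the dangling-page handling — is an illustrative assumption, not Google's actual system:

```python
DAMPING = 0.85  # conventional damping factor from the original PageRank formulation

def pagerank(links, iterations=50):
    """Power-iteration PageRank. links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - DAMPING) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # a page splits its damped rank evenly among its outlinks
                share = DAMPING * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # dangling page: spread its rank evenly over everything
                for q in pages:
                    new[q] += DAMPING * rank[p] / n
        rank = new
    return rank

# "hub" is a well-linked site; its outbound link is the only thing separating a from b.
graph = {
    "hub": ["a"],
    "a": [],
    "b": [],
    "c": ["hub", "a", "b"],
}
before = pagerank(graph)

# Zap "hub" from the index entirely, as in a legal removal.
zapped = {p: [q for q in outs if q != "hub"]
          for p, outs in graph.items() if p != "hub"}
after = pagerank(zapped)
# before: a outranks b thanks to hub's link; after: that edge is gone and a == b.
```

In this toy graph, a's only advantage over b is the link from hub; once hub is zapped, that advantage vanishes — the "unraveling" effect described above.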

defanjos




msg:83467
 3:36 pm on Apr 7, 2004 (gmt 0)

What "manual changes to the results" are depends on the precise context of the statement. It wouldn't necessarily include deleting spammy sites from the index.

Agree 100% - the key word is "results"

By "results", they mean the sites that make it to the pool of sites to be displayed to the public - they say nothing about what goes on before that happens.

[edited by: defanjos at 3:39 pm (utc) on April 7, 2004]

Chndru




msg:83468
 3:39 pm on Apr 7, 2004 (gmt 0)

very nice catch, defanjos

Fiver




msg:83469
 3:40 pm on Apr 7, 2004 (gmt 0)

Hrm, my memory of a magazine interview from a while ago is that it quoted Google engineers who were watching the data stream of queries and associated cities. The article started by talking about a computer in the front office of the Google HQ that spits out queries and a geolocation.

So in the article the engineers talk about how they once noticed a query related to suicide coming from the local area. They said they checked the serps to try and make sure the best sites were coming up first for that query, and that they would change them if there was anything suspect. They implied that it may be too late for the person that just searched, but hopefully they could help the next person.

All very touching, but all very telling. Would they only be removing sites in the suicide related serp that have been practicing black hat spam techniques? Or simply making judgment calls?

/would all be much more interesting if I could find that article. It was a number of months ago.

found it. second page [wired.com...]

rogerd




msg:83470
 3:44 pm on Apr 7, 2004 (gmt 0)

It's a bit disingenuous to say that Google never manually adjusts the performance of individual sites. I guess the distinction Google would make is between penalizing a site for some kind of real or perceived manipulation (which they do), and manually dropping a site in the rankings because it is less relevant to a keyword than other sites (which they don't do, as far as I know). If they started doing the latter, there'd be no end to the complaints: "My company is #5 when I search for my company name - please fix this."

Yidaki




msg:83471
 4:04 pm on Apr 7, 2004 (gmt 0)

"Our search results are not manipulated by hand. We're not able to make any manual changes to the results."

[asp-cyber.law.harvard.edu...]

Specific Sites Excluded from Google.fr and/or Google.de: User Testing Results
Localized Google Search Result Exclusions:
The authors list below selected sites tested by users of the Google Exclusion Real-Time Testing system and found to yield different results at google.fr (France) or google.de (Germany), as compared with the ordinary google.com site.

<thesiteinquestion>.com - divergence: google.com indexes 265 pages, while google.de and google.fr index none (10/25/2002 12:45:44 PM)

Removed? Filtered? Server glitch? Liars?


A search at google.de for <the search> DOES NOT SHOW the site in question. I know - filtering is not removing ... pffffff.

[edited by: Brett_Tabke at 4:52 pm (utc) on April 7, 2004]

hutcheson




msg:83472
 4:12 pm on Apr 7, 2004 (gmt 0)

>I guess the distinction Google would make is between penalizing a site for some kind of real or perceived manipulation (which they do)

But, and this is the important bit, NOT BY HAND.

They remove sites by hand -- they have to be able to do that, and they promise to do it for specific kinds of SE-perverting.

But they don't adjust page rank or page relevance or page result order by hand. Ever.

Yes, removing one site will affect the PR of others; I presume they mean they don't remove one site just to tweak others.

What they do by hand is constantly tweak the parameters to make what they consider the best sites come up. They have to use specific test categories to evaluate their parameter adjustments, and the net effect is almost the same as if they were hand-tweaking the results IN THOSE CATEGORIES. But their purpose is to come up with parameter values that work across the board (or at least across the entire bucket of similar search queries -- many people are arguing that there are at least two buckets now).
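hutcheson's distinction — tuning a global parameter against a bucket of test queries rather than hand-ordering any one result page — can be sketched like this. The scoring function, the single weight, and the labeled test queries are all invented for illustration; this is not Google's algorithm:

```python
def score(doc, w):
    # one global knob: how much link popularity counts vs. text relevance
    return w * doc["links"] + (1 - w) * doc["relevance"]

def evaluate(w, test_queries):
    """Fraction of test queries where the editorially judged best doc ranks #1."""
    hits = 0
    for q in test_queries:
        ranked = sorted(q["docs"], key=lambda d: score(d, w), reverse=True)
        if ranked[0]["name"] == q["best"]:
            hits += 1
    return hits / len(test_queries)

# Two hypothetical test categories with a known "best" answer in each.
test_queries = [
    {"best": "authority",
     "docs": [{"name": "authority", "links": 0.9, "relevance": 0.6},
              {"name": "keyword-stuffed", "links": 0.1, "relevance": 0.95}]},
    {"best": "on-topic",
     "docs": [{"name": "on-topic", "links": 0.3, "relevance": 0.9},
              {"name": "popular-but-off-topic", "links": 0.95, "relevance": 0.2}]},
]

# Grid search over the weight: the change is global, so every query feels it at once.
best_w = max((w / 10 for w in range(11)), key=lambda w: evaluate(w, test_queries))
```

The point of the sketch: the chosen weight applies to every query simultaneously, so a tweak that fixes one test category only survives if it doesn't break the others — the opposite of hand-editing a single SERP.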

digitalv




msg:83473
 4:18 pm on Apr 7, 2004 (gmt 0)

Am I the only one who read the article, or am I missing something? They said in the article that if they received 50,000 requests they would remove it from the search engine.

All they were saying about their "inabilities" was that they couldn't alter the POSITION in the searches artificially - they didn't say they couldn't remove it.

So what are you guys talking about when you say you know Google has the ability to remove sites? OF COURSE THEY DO; THEY EVEN SAID THAT IN THE ARTICLE.

[edited by: digitalv at 4:18 pm (utc) on April 7, 2004]

Fiver




msg:83474
 4:18 pm on Apr 7, 2004 (gmt 0)

Yidaki you're right. Even a site: at .de shows no indexing of the site.

So the article I linked to implies they 'add' to the index when a SERP doesn't meet ... someone's standards, and Yidaki shows them removing a site they don't want in there.

But otherwise, nope, can't change a thing manually.

webdevsf




msg:83475
 4:22 pm on Apr 7, 2004 (gmt 0)

Love the first 1 back link for site in question.

Google needs to move on this. Hiding behind their algo is just plain disingenuous.

Maybe somebody should subscribe these guys to a link farm! That'll move 'em down!

;)

jimbobway




msg:83476
 4:23 pm on Apr 7, 2004 (gmt 0)

Manipulating the search results for "suicide" is nice and all, but what about manipulating the search results for "how to make pipe bombs"? That would be more important I would think.

Once you make an exception for one case then it can cause more controversy.

One thing I noticed is that Google doesn't remove single sites. They remove sites with certain characteristics. For instance, if you have a redirect on your site, your ranking is significantly dropped even if you have no spam whatsoever. Google does this all the time.

Didn't they just remove all gambling sites too?

Fiver




msg:83477
 4:32 pm on Apr 7, 2004 (gmt 0)

Didn't they just remove all gambling sites too?

no that's just advertising, not natural SERPS.

All they were saying about their "inabilities" was that they couldn't alter the POSITION in the searches artificially - they didn't say they couldn't remove it.

So what are you guys talking about when you're saying you know Google has the ability to remove sites. OF COURSE THEY DO, THEY EVEN SAID THAT IN THE ARTICLE.

No, what we're saying is google says it does one thing, when it's clearly doing another. Have 50k signatures been gathered in Germany, is that why it's out of the index there? Is it a justifiably moral call to manually manipulate the SERPS related to suicide? What about terrorism? How slippery is a slippery slope?

I'm not making any judgment calls, I'm in no position, but I have to say it will be interesting viewing.

MultiMan




msg:83478
 5:41 pm on Apr 7, 2004 (gmt 0)

It all just goes to show how increasingly useless the G algo has become.

The SERPs in G for my topic do the exact same thing as the claimant in the news article.

In searches for my keyword, the SERPs in G are peppered with all kinds of anti-keyword sites, even one-page sites, while G fails to show the searcher the numerous high-quality sites that are actually about the keyword. In fact, there's even a keyword.org site that is always listed at #1 or #2 despite the fact that it is an ANTI-keyword site, is way outdated and not updated, and has few links from sites that actually deal with the keyword.

I've often wondered what would happen if this kind of thing happened with a more politically correct group or keyword such as in the news article.

I know I'll be interested in seeing how this plays out.

Pikin_It_Up




msg:83479
 5:43 pm on Apr 7, 2004 (gmt 0)

ANTI-keyword?

MultiMan




msg:83480
 5:51 pm on Apr 7, 2004 (gmt 0)

ANTI-keyword?

Replace keyword with keyword. :)

Pikin_It_Up




msg:83481
 6:00 pm on Apr 7, 2004 (gmt 0)

It looks pretty hacked to me...

Marval




msg:83482
 6:14 pm on Apr 7, 2004 (gmt 0)

I believe that GG remarked a few months ago, at the beginning of an update thread, that there was some "manipulation" (I believe the example he used was the keyword that brings the most searches for a three-letter word - didn't want to post it again, Brett) where Google only wants to show G-rated results, and in the top 10 were 4 rather large "mature" sites. Those sites are still in the foreign results, but the English SERPs have been adjusted so that those sites appear around 400-650 spots down. It only affected that one keyword, as they are all still #1 for many other combinations of that word with modifiers. In this case it had to have been done manually, as an algo that looked for certain categories would have caught a few sites that still exist there.

Michael Anthony




msg:83483
 7:03 pm on Apr 7, 2004 (gmt 0)

From the news.com article...

"Weinstock has launched an online petition asking Google to remove the site from its index. He said that if Google receives 50,000 requests to remove the site, it will comply. As of late Tuesday, the petition had about 2,800 signatures. "

and

"Google spokesman David Krane said the company's search results are determined by a complex set of algorithms that measure factors such as how many sites link to a given page. The company can't and won't change the ranking for Jew Watch, regardless of how many signatures the petition attracts"

Looks like even news.com can't get their facts straight? Either that, or Weinstock talked to someone else at Google who's yet to discuss the matter with Mr. Krane.

GoogleGuy




msg:83484
 7:25 pm on Apr 7, 2004 (gmt 0)

I walked over to see David Krane and asked him about it, because I had a hunch that David was talking about the results for this particular search (the word "jew") and not our overall system. And that's the correct explanation.

To give some background: people write us all the time to say that they dislike or disagree with a particular set of search results. For example, at one point someone wrote in and claimed that one of the search results for Martin Luther King was a revisionist history and wasn't accurate. Should Google go and remove that result by hand? Who gets to decide whether a result deserves to be in the top 10? You can see where the slope gets slippery really quickly when you start bringing value judgments about the content of the site into the mix.

So historically Google has very strongly tried to follow a policy of letting our algorithmic search results stand as they are; we put our efforts much more into improving search by writing better algorithms instead of trying to fix a smaller set of searches by hand. We have a quite small set of circumstances that can result in taking manual action: things like a valid legal request (e.g. a DMCA complaint), spam and things outside our quality guidelines (e.g. off-topic porn for a person's name), and a very small amount of security-related stuff (e.g. credit card numbers on a web page). Other than that, we do our best to let our algorithms work out the results on their own. I think that's the right approach, and I think most of our users would prefer that instead of lots of hand-editing.

Does that mean every search is perfect? Of course not. With 200+ million searches a day, there will be some searches that aren't as good as they can be. But when a bad search is pointed out to us, we look to how to improve our algorithms instead of doing some one-off change. That's the principle that's coming into play here.

If you go back and read the article, you can see that idea underlying it. I did a double-take at the second paragraph of the article: "Weinstock has launched an online petition, asking Google to remove the site from its index. He said if Google receives 50,000 requests to remove the site, it will comply." I have to wonder why Weinstock would say that (if he did). That did not ring true to me at all--I can't imagine anyone in a position of responsibility at Google ever saying anything like that. I don't like the first result for this search either, but we're not going to tweak the results for "jew" by hand. Now go back and re-read the fifth and sixth paragraphs of the news.com article. I think knowing Google's philosophy and a little more background puts the quote into context. It's hard to communicate all of these ideas that I've mentioned here with complete specificity and absolutely no ambiguity in three sentences, but I think David did a great job; he was saying that we won't tweak the results for this search by hand. I asked David to make sure that's what he meant, and it was.

hunderdown




msg:83485
 7:26 pm on Apr 7, 2004 (gmt 0)

Just a quick note because digitalv and another poster seem to me to have misinterpreted a statement in the original article:

"They said in the article that if they received 50,000 requests they would remove it from the search engine." (from digitalv's post)

Google said no such thing. The guy with the petition said they would do this. But later in the article, the Google spokesperson says something different, and I would take what Google says officially to be what's relevant in this case.

DVDBurning




msg:83486
 7:43 pm on Apr 7, 2004 (gmt 0)

While I agree that we should have a right to free speech, and while I understand Google's reluctance to start playing traffic cop, there are a number of intertwined issues here.

First - Does Google filter pages or sites manually? Sure they do.

Second - Should they remove objectionable sites from their index, and how do they define "objectionable"? This is kind of a political issue, but I would suggest that it is a business issue. If the overwhelming majority of Google visitors are disgusted by these sites, Google would be well-advised to filter them out one way or another. Look at MSN... they and others advertise their filtering capabilities, selling it as a value-added feature.

Ultimately, I think Google will do better if they continue to improve their ability to listen and speak to their customers... webmasters and web surfers alike. Googleguy showing up here at WW is outstanding. But I believe in the Cluetrain Manifesto: you can't hide behind marketing slogans or rhetoric. Google doesn't want to discuss or reveal the secret sauce, but the questions and concerns won't go away until they discuss important policy elements openly.



All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved