
Google News Archive Forum

This 157-message thread spans 6 pages; this is page 4.
"Mr. Anti-Google"
Our own Everyman is on Salon!
GoogleGuy

WebmasterWorld Senior Member googleguy us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 5120 posted 5:22 pm on Aug 29, 2002 (gmt 0)

Haven't seen anyone else mention it, so I thought I'd point out that our own Everyman has an article in Salon today. The story has also been mentioned on geeknews and Slashdot.

The user comments are pretty negative, so I'll try to pull the balance back the other way. I always appreciate hearing Everyman's perspective, even though we've got different views of some things, e.g. how Google ranks internal pages from a site; I think we do a good job of that. If you haven't read Everyman's "search engines and responsibility" thread and his google-watch.org site, I encourage you to. That said, I do disagree with statements like "Eventually, a FAST-type engine should be administered by a consortium of librarians who are protected civil servants of a world government." :)

Anybody have thoughts on the Salon article?

 

brotherhood of LAN

WebmasterWorld Administrator brotherhood_of_lan us a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



 
Msg#: 5120 posted 6:37 pm on Aug 30, 2002 (gmt 0)

Me too at everything Air said

This is the way I interpret Everyman stating that PR is "undemocratic".

Relating it to another PR (proportional representation): a way of forming a government (used in Scotland) that is thought to represent power in government more precisely than the "first past the post" system. It's not ideal; it just "minimizes" the problem of putting an entity in power that is not necessarily endorsed by the majority of the people.

Set that analogy alongside what Everyman says: a PR9 site against a PR5 site, all other things being equal (like content)...PR9 wins.

It's like saying a high-PR site writes superior content, is more knowledgeable, and is just plain and simply superior.

I thought the idea of a "democratic leader" (i.e. the number one SERP) is that it is "first among equals". PR convolutes what is TRULY number 1! That would be the content.

Still, Google does the best job of empowering a SERP to #1 compared to others...and PR is what allows this to happen.

Spam doesn't help. My 0.02....seems being an anti-google person means having more issues than one! :)

przero2



 
Msg#: 5120 posted 8:06 pm on Aug 30, 2002 (gmt 0)

Google has become what Google is today because it provides what "many/most" (of course, not ALL) consider relevant search results. All this achievement in under 2-3 years speaks volumes about the practicality of their algorithm.

That being said, I would like to bring forth some issues/suggestions I have for Google:

1) Large sites (with 10,000+ pages) have difficulty getting decent PR for deep-linked internal pages (most of them will have PR1 or PR0), and that rank will be pretty useless! One suggestion: the degradation of PR across a site should be made uniform depending on the number of pages the site has. I cannot think why most of the internal pages on a well-regarded PR7 site should end up having PR1 or PR0. This, to me, is a flaw in the PageRank computation algorithm. I suggest Google have different PageRank degradation algos for internal links vs. external links on a page.
2) Google should clarify and state clearly how the information collected from the toolbar, cookies, etc. is being used. If they are mining the information and using it to personalize or better the SERPs, all such uses of the collected information should be stated explicitly.
3) Caching sites only with a clear opt-in (vs. the current opt-out) should be seriously considered.
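[Editor's note: point 1 above can be illustrated with a toy power-iteration sketch. The site shape and numbers below are hypothetical and this is the textbook PageRank recurrence, not Google's actual code: a home page feeding three section pages, each feeding ten deep pages, with every page linking back home.]

```python
# Toy PageRank via power iteration, damping factor 0.85 as in the
# original PageRank paper. Hypothetical hierarchical site graph.

def pagerank(links, d=0.85, iters=50):
    n = len(links)
    pr = {p: 1.0 / n for p in links}          # start uniform
    for _ in range(iters):
        # every page gets the (1-d)/n "teleport" share...
        new = {p: (1 - d) / n for p in links}
        # ...plus an equal split of each linking page's damped rank
        for page, outs in links.items():
            if outs:
                share = d * pr[page] / len(outs)
                for out in outs:
                    new[out] += share
        pr = new
    return pr

# home -> 3 sections -> 10 deep pages each; everything links back home
links = {"home": ["s1", "s2", "s3"]}
for s in ["s1", "s2", "s3"]:
    deep = [f"{s}_d{i}" for i in range(10)]
    links[s] = deep + ["home"]
    for page in deep:
        links[page] = ["home"]

pr = pagerank(links)
print(pr["home"], pr["s1"], pr["s1_d0"])  # home PR dwarfs deep-page PR
```

In this toy graph the home page ends up with well over ten times the rank of any deep page, which is the "degradation" described above: rank thins out at every level of internal linking unless deep pages attract external links of their own.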

nutsandbolts

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 5120 posted 9:18 pm on Aug 30, 2002 (gmt 0)

Maybe it's me - but I don't see a problem. When I allowed the advanced toolbar on my computer, I knew information would be moved to and from Google, because I agreed to the terms and conditions. I trust them almost completely with this data. Heck - they also have my credit card information, as I use their AdWords system. My home address. My Web sites.

So this all seems rather simple to me: If the advanced toolbar is scary - don't use it. If the Web page Cached options offend Webmasters - stop Google from caching your pages.
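[Editor's note: the opt-out mentioned above is a one-line robots meta tag in the page's `<head>`; Google has long documented a `noarchive` value, though check current documentation for specifics.]

```html
<head>
  <!-- Ask all engines that honor it not to serve a cached copy -->
  <meta name="robots" content="noarchive">
  <!-- Or target Google's crawler specifically -->
  <meta name="googlebot" content="noarchive">
</head>
```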

All I know is - I love the cache option when I search. I love the green Pagerank indication and I trust this company.

Of course, I would prefer a Pagerank of 9 on all my sites, top ranking on singular keywords for every domain etc etc ;)

KMxRetro

10+ Year Member



 
Msg#: 5120 posted 9:38 pm on Aug 30, 2002 (gmt 0)

Not to sound stupid, but if EVERY site of, say, more than 1,000 pages had all of its deep content stuck up at PR9, JUST for being content, then who would lead?

How would we define which is the "better" page? Or even, which is the more relevant? You'd have all pages on PR9 and we'd be back to random ordering, like other engines do.

Google is attempting to provide an ORDERED, STRAIGHT-FORWARD way of searching. They are doing it for FREE. They charge NOTHING to list you.

Where's the problem?

I find Google's listings to be extremely fair. My site launched in March and had a PR of 5 off the bat. Two months later, it was PR6, after gaining some more content and lots more links....in this update, it seems to have settled at PR7. Again, more content, more relevant inbound and outbound links.

At the moment, my index page is ranked highest, my reviews are next in line and this is because they are linked to almost as much as my index page. The rest is in a puddle at the bottom of the index. I haven't optimized them or generated links for them yet.

Where's the problem?

Sure, my PR7 *could* be a PR9, hell, it could be a PR10, but it isn't. I live with it, I work harder and harder on the site in the hope that things will improve even more.

What I DON'T do is spend time away from my site, making up stories that amount to:-
"Google didn't give my 5,001st auto-generated database page as high a PageRank as it did for a page created manually...I'm telling mom that Google stole my ice-cream and watches me when I sleep."

Get over it. Create some content of your own instead of having a database of facts and figures gathered from other publications and you might get listed higher, yeah?

mivox

WebmasterWorld Senior Member mivox us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 5120 posted 9:50 pm on Aug 30, 2002 (gmt 0)

Get over it. Create some content of your own instead of having a database of facts and figures gathered from other publications and you might get listed higher, yeah?

If the purpose of your site is to provide a central repository of hard-to-find research information and citations, it would be kind of silly to replace all the compiled research data with "content of your own".

That's rather like telling a library that the card catalog would be a lot more entertaining if it had lots of pictures, multimedia content, and removable coupons for Coca-Cola products instead of all those boring book cards. The boring book cards are the whole point of the catalog.

But, does its lack of entertainment value make the card catalog (or its computerized equivalent) a less valuable research tool? No. If someone walks into a library looking for the card catalog, it should be easy to find, because it is a valuable repository of information.

Burying a research database's content with PR0 pages is somewhat like a librarian saying, "What do you want with that boring card catalog anyhow? If you're looking for catalogs, this Eddie Bauer catalog is much more interesting to look at! We know it's better, because everyone's heard of it!"

The point of the article isn't "Hey! My site isn't ranked high enough!" (duh!) The point of the article is "Hey! Google's system is flawed, and with their incredibly dominant position in the current internet search environment, those flaws are worth looking into!"

makemetop



 
Msg#: 5120 posted 9:59 pm on Aug 30, 2002 (gmt 0)

>My site launched in March and had a PR of 5 off the bat. Two months later, it was PR6, after gaining some more content and lots more links....in this update, it seems to have settled at PR7.

My home page launched in 1999 and had high PR in the early days, then went to PR0, and is now a PR5 - I have lots of links from PR5+ pages. Internal pages are higher.

Without wishing to comment too much on the issues here - realise that Google can (and does) manipulate PageRank according to their own wishes. Nothing too democratic there. However, it is their right to determine who ranks and doesn't - so no complaints from me :)

KMxRetro

10+ Year Member



 
Msg#: 5120 posted 10:04 pm on Aug 30, 2002 (gmt 0)

"Brandt is not a disinterested party; the dispute between Daniel Brandt and Google is personal. He has spent thousands of hours building a Web site that he believes is both useful and important, and Google, in its algorithmic blindness, has given Brandt a lower page rank than he thinks he's entitled to. Brandt finds it genuinely hard to believe -- and even personally insulting -- that Google won't give him more credit."

I think that the article is very much about "my site isn't ranked high enough"

Filipe

10+ Year Member



 
Msg#: 5120 posted 10:04 pm on Aug 30, 2002 (gmt 0)

Without wishing to comment too much on the issues here - realise that Google can (and does) manipulate PageRank according to their own wishes. Nothing too democratic there.

Are you suggesting they have an agenda?

24bit

10+ Year Member



 
Msg#: 5120 posted 10:10 pm on Aug 30, 2002 (gmt 0)

I Love Google! I think they could improve by:

1. If Googlebot doesn't pick up an entire site that's currently indexed during the crawl, then it should keep it indexed another month, assuming the website's server was down or ???? That way we could eliminate the gray bar drop-out from host server problems, etc.

2. When someone's site gets a penalty, have an automated email sent out to verify this. That way people will know if they actually have one rather than scratch their heads for months.

3. It would be nice if we could see all of our backward links.

I love the toolbar, and I like just about everything else about Google.

przero2



 
Msg#: 5120 posted 10:11 pm on Aug 30, 2002 (gmt 0)

KMxRetro, Good for you that your newbie site is doing so well. Hope you did not spend tons of time spamming for links;).

Yeah, having 1,000+ pages stuck at PR9 would not make sense. But assuming the site is truly of PR9 or PR10 quality, seeing 80% of the site's pages at PR0/PR1 does not make sense either. On a PR9-quality site, I would expect that a deep-linked page has relevance and deserves to rank higher (maybe at a minimum of 3 or 4). As an example, look at Yahoo (a PR10 site), which has a huge number of PR1/PR0 pages for a number of regional/relevant pages.

As I said, Google is great. But let us not think it is perfect and needs no improvement. I am sure Googlers are constantly improving their algo, and I would be happy to see them consider scaling PR on internal pages to the total number of pages on the site.

I also really don't care if a page is hand-coded or generated using a good publishing system. What matters is the relevancy and usefulness of the page to the end user, and that is what I believe Google or any SE should care about!


mivox

WebmasterWorld Senior Member mivox us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 5120 posted 10:40 pm on Aug 30, 2002 (gmt 0)

"...Brandt finds it genuinely hard to believe -- and even personally insulting -- that Google won't give him more credit."

I think that the article is very much about "my site isn't ranked high enough"

If you read Brandt's first post in this thread responding to the article (look for "Everyman"), you'll notice that what information the article's author chose to present, and the point Brandt himself is trying to make with his new site (google-watch.org - the site that prompted the article) are two very different things.

It's a common occurrence in journalism: someone gives a very long interview on subject "A", and later reads the press piece to discover the writer spun the entire article based on the 5 minutes they spent talking about subject "B".

I'm more inclined to believe Brandt's own words about his intentions, than I am to take on faith the spin put on an interview by a writer who apparently had his own - very different - axes to grind.

Why don't you read the articles Brandt wrote himself, on his own site, before you claim to know what point he's trying to make with the whole thing?

Filipe

10+ Year Member



 
Msg#: 5120 posted 10:44 pm on Aug 30, 2002 (gmt 0)

I also really don't care if a page is hand coded or generated using good publishing systems. What matters is the relevancy and usefulness of the page to the end user and that is what I believe Google or any SE should care about!.

I speak this reluctantly, being a hand-coding purist, but this is totally true. Expanding on this point, what matters is, arguably and briefly:

- Relevancy (this is touchy business with the shortcomings of language, but as we become smarter searchers, the better we are at projecting meaning through a few words)
- Usefulness (this is very topic-specific. It could be information, which I love, or tools, or multimedia, depending on what the site's audience is)
- Ease of use (sites can have lots of good information, but it can be hard to read, hard to access, or hard to understand)

Filipe

10+ Year Member



 
Msg#: 5120 posted 10:50 pm on Aug 30, 2002 (gmt 0)

<addendum>Regardless of how Google executes it, I think that, as an attempt at an objective automated ranking system, PageRank is brilliant - much more so than any other method developed yet.</addendum>

Filipe

10+ Year Member



 
Msg#: 5120 posted 11:09 pm on Aug 30, 2002 (gmt 0)

Just to debunk, or at least post some evidence against, Mr. Brandt's claims that new sites can't be found ("if it's on a new site, you won't find it"): if you go to Google and run a search on "<snip>search phrase</snip>" (with or without quotes), the site that comes up first is "<snip>site name</snip>" (I won't post the URL).

That site was optimized by a friend of mine, and as soon as it entered Google (it had only been up 2 months prior to that) it was already ranked at #3 among its 110,000 competitors for that spot. In the 2 months since it arrived on Google, it's risen to #1.

Most would agree that, in general, it's a content-rich site, useful for people looking for <snip> what site offers</snip> - and they got there for free - they didn't have to pay Google off like they would other engines.

[edited by: heini at 11:33 pm (utc) on Aug. 30, 2002]
[edit reason] no specifics on sites please - thank you [/edit]

sean

10+ Year Member



 
Msg#: 5120 posted 11:13 pm on Aug 30, 2002 (gmt 0)

- Air -
Maybe we should just be annoyed for having a world that is what it is.

That is everything in a nutshell. If someone does not agree with fundamentals of PageRank, they are unlikely to agree with a purely democratic one-surfer-one-vote search engine.

scareduck

10+ Year Member



 
Msg#: 5120 posted 11:57 pm on Aug 30, 2002 (gmt 0)

This guy is a crank with an axe to grind. My favorite part of his rant is that Google should be a public utility. Translated, he means to take over operation of Google without giving them anything in exchange for this. It's how third-rate dictatorships operate, and for the same reasons: the guy who says what goes gets the power, the money, the chicks, all of it. He really doesn't have any serious ideas about how to index the web himself (listen to his boneheaded "it ought to be deemphasized" without supplying anything better), so of course he wants to take Google over. Why is this of interest, please?

mivox

WebmasterWorld Senior Member mivox us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 5120 posted 12:26 am on Aug 31, 2002 (gmt 0)

My favorite part of his rant is that Google should be a public utility. Translated, he means to take over operation of Google without giving them anything in exchange for this.

Nonsense. He said no such thing. He said that a non-commercial search engine administered by a consortium of librarians, answerable to an international governing body was a good idea for the future.

He didn't say he wanted to take over Google. Trying to put words in his mouth is nothing but a scare tactic, scareduck ;) (one the Salon writer does not seem to find beneath his own journalistic standards).

I've discussed this issue with him privately, before the google-watch site went live. Whether or not you agree with his concerns, he's not arguing for the overthrow/takeover/destruction/coercion-by-evil-commie-overlords of anything or anyone at any time.

Filipe

10+ Year Member



 
Msg#: 5120 posted 2:23 am on Aug 31, 2002 (gmt 0)

I want to thank all the moderators for keeping everyone in check. I personally don't agree with Mr. Brandt in the slightest, because I don't think he provides an organized argument that is sound, much less valid. I would like to see the argument laid out with the evidence supporting its premises, which, as it stands, I don't think he has done on good grounds. However, when people take things out of context (I'm guilty of it, I know) or get belligerent, we're just slowing down anything resembling a conclusion to this thread.

So all you "This guy is a @#$%@% nut" posters, I understand your position, but try not to load your argument with value judgements or else what you're trying to say just falls apart.

Interesting thread, huh?

shelleycat

10+ Year Member



 
Msg#: 5120 posted 3:02 am on Aug 31, 2002 (gmt 0)

Wow, so much interesting stuff to think about!

I have found the discussions about privacy issues particularly interesting. When I installed the google toolbar last year sometime I read the privacy notice carefully and had a full understanding of what turning on the advanced features meant. It used strong enough language (something like 'serious privacy implications') to make me read and take note. So I'm not worried about that because it was my choice. And I can turn off the advanced options any time and keep using the tool bar for searching, which is what I use it for most of the time anyway.

The caching, however, is a little different. Everyone making an original web page gets copyright over the page automatically, whether they put a notice on it or not. You have to specifically give up your rights to lose copyright protection. (I know sometimes employers retain copyright, etc., but the point is someone gets it.) However, with caching being opt-out, it appears that Google assumes you don't have copyright unless you ask for it (i.e. a no-cache tag). This is the opposite of how I've been led to believe copyright protection should work.

I've never put a no-cache notice on a page because I didn't know that Google was even caching my web pages, and I certainly didn't know it's possible to stop them (till I started reading here). I also didn't put copyright notices on my pages for a long time because I didn't know I was allowed to. However, I still didn't want someone ripping off my work. My ignorance shouldn't take away my rights.

I know there are complications such as ISP caching etc but google displays the cached page as part of their system, ie with the nice coloured logo at the top. This strikes me as an important difference. I can imagine that a switch to opt-in now would lose them a lot of cached pages and would be difficult to do. Personally I'd like to see the change.

estjohn

10+ Year Member



 
Msg#: 5120 posted 8:37 am on Aug 31, 2002 (gmt 0)

mivox: "a non-commercial search engine administered by a consortium of librarians, answerable to an international governing body"

Although I have my issues with google, that idea is completely absurd.

You would have the UN running a search engine.

pvdm

10+ Year Member



 
Msg#: 5120 posted 10:33 am on Aug 31, 2002 (gmt 0)

I'd like to add some thoughts to this very interesting thread, but there are so many things to think about. As we can't discuss it all in one thread, I'd like to consider only the PageRank aspect. Let's 'abstract' the discussion to a higher level and forget about individual situations and sites. It doesn't matter whether Mr. Brandt has a personal agenda or not. His views on PageRank reflect thoughts and criticisms similar to many others'.

Of course PageRank isn't perfect. There is a long way to go before an 'excellent' way of searching and ranking is achieved. At least PageRank has the advantage of being better than the current (large-scale) alternatives. For now. But things can change quickly these days. Where was that thread about IBM's patents on P2P search algorithms?

I am confused about contradictory statements:
1. 'PageRank favors the big sites.' Is that so? Precisely the big sites have difficulty getting their deeper pages ranked higher. Where is the favor? 10,000 pages with no real PageRank can't add much PageRank to a home page. So adding pages ad infinitum is not really a clever way to play the game long term. Of course, if one publishes a site with a billion pages, statistically you'll have more chances of seeing a page of that site than of others. But that is statistics, not PageRank as it runs now. And that's a good aspect of PR as it is now.
2. 'PageRank puts big sites at a disadvantage.' Is that so? It's normal that EACH new page starts at near zero and builds its own PR through external quality links. A big site publishing 10,000 pages from a database should not get more PageRank for each of those pages than my 5 pages published at the same time, if they had the same content. If a big site got more PR just because it has more pages, this would motivate people to build sites containing millions and billions of worthless database-generated pages. That would be absolute nonsense.

This whole discussion concerns the 'neutrality' of PageRank's calculation regarding small and big, old and new. I agree that in some cases PageRank favors established, big sites. But in other cases it favors small, optimized pages published by one individual. PageRank will get better, and in the long term I believe it will achieve better results.

Finally, I don't think PageRank is the only way to go. There are many, many other interesting variables that could be inserted into the algorithm. And the algorithm could even adapt itself to the type of search.

Semantics, thought, language, and the meaning of words are among the most complex things one can study. In my opinion, current search technology is just starting to discover how complex it really is. Giving a good computer-automated answer or list in response to a human question or search has a very, very, very long way to go, even with 100,000 multi-processing, hyper-threading computers...

I am sorry if my thoughts are not of high practical value in this thread, but I hope they open other insights.

egomaniac

10+ Year Member



 
Msg#: 5120 posted 4:09 pm on Aug 31, 2002 (gmt 0)

>He said that a non-commercial search engine administered by a consortium of librarians, answerable to an international governing body was a good idea for the future.

This is a very bad idea.

The costs of this would be huge, for one or more of three reasons: either 1) we would have to spend a massive sum of taxpayer money to out-engineer Google on search technology, 2) we would have to spend huge taxpayer dollars on a staff of international "librarians", or 3) we would have to spend massive taxpayer advertising dollars to keep an inferior search engine in the public awareness.

Without spending on #1, it would be an inferior engine and #3 would be assured. #2 is not only expensive, but it would be extremely biased towards the politics of the governing body charged with running it, because that body would be the employer of the so-called "librarians".

This is a pipe dream typical of liberal elitists who live in the clouds: "Let's fix the unfair, undemocratic search of Google with an unbiased, world-government-run search engine." Such an idea is B.S. and it won't work, because the idea is fundamentally flawed. Google works because it is like capitalism (see my earlier post in this thread). And capitalism works because it leverages true human nature.

lawman

WebmasterWorld Administrator lawman us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 5120 posted 4:22 pm on Aug 31, 2002 (gmt 0)

>>mivox: "a non-commercial search engine administered by a consortium of librarians, answerable to an international governing body"
Although I have my issues with google, that idea is completely absurd.

You would have the UN running a search engine.

estjohn:

Go back and read Mivox's post - a little more slowly this time. :)

lawman

Marcia

WebmasterWorld Senior Member marcia us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 5120 posted 3:23 am on Sep 1, 2002 (gmt 0)

Lawman: Go back and read Mivox's post - a little more slowly this time.

I read the post 4 times lawman. Very well stated, and I'm reminded of S.I. Hayakawa's classic "Language in Action", where he encouraged his students to read everything with an eye to whether it's expository writing or journalism from an objective point of view, or whether there's bias or slanting.

GoogleGuy
If you haven't read Everyman's "search engines and responsibility" thread and his google-watch.org site, I encourage you to.

As GoogleGuy suggested, I'm looking at how Daniel presented his ideas for discussion in Social responsibility of search engines [webmasterworld.com].

...social ethics of search engines and/or websites, quite distinct from the commercial aspects of SEO and webmastering.

That's what sticks in my mind as being the underlying philosophy and the motivating force.

argusdesigns

10+ Year Member



 
Msg#: 5120 posted 5:52 am on Sep 1, 2002 (gmt 0)

I tend to agree with pvdm, especially the summary. I think that PageRank is constantly improving. Although we are talking about a bot that hands out PageRank, the fundamental thing to remember is that the algos are input by humans. It is, let's say, the bridge between rapidly changing internet content and the human element. It will get better... that's why we have an eraser at the end of our pencils. ;)

Everyman



 
Msg#: 5120 posted 6:12 pm on Sep 1, 2002 (gmt 0)

I feel that Google is at an important juncture at a crucial moment in Internet history. They are in a position to make a statement about the direction of privacy and the Internet that no one would be able to ignore for many years to come. And it's almost to the point where making no statement at all is itself a powerful statement.

A couple of years ago, I first noticed that Google was perhaps the first search engine with the audacity to use cookies that expire in 2038, and over these two years other engines have started doing this also. Why not? If Google does it, and everyone loves Google, and no one criticizes Google, why should anyone use cookies that are tailored to the sensitivities of public-sector advocates?

In the same way that Google is now setting the pace by ignoring privacy issues, they could be setting the pace on ethical standards for search engines for many years to come.

Let's not discuss the cookie itself. I know the technical aspects of what a cookie can and cannot do. What interests me is Google's overall cookie policy, which in a nutshell is to set a maximum cookie with a unique ID at every opportunity. If the user already has a Google cookie, then they record this ID at every opportunity, along with everything else available at the time of that transaction.

Is this harmful? Beats me -- I don't know what the future holds. Is it conspiratorial? Probably not. I think it has more to do with something I call "the social insensitivity of nerds." Google is a very nerdy company. They don't see the world in terms of social, political, and philosophical issues. Even their public relations people, who are good at their jobs, don't have a clue about social issues. The last thing Google would do is to hire humans to evaluate and rate sites that may otherwise get crushed by PageRank. Their entire attitude is that if a software engineer with a Ph.D. cannot do it with a clever recursive tweak using vector math, then it's not worth doing at all.

I see Google's cookie policy as an indicator of Google's sensitivity to public-policy issues. As long as Google is insensitive to such issues, then I will continue to use their cookie as a demonstration of Google's insufficiency in this area. I'm curious to see how long it takes for Google to realize that sensitivity to social issues might even be good for business. For example, what would be the harm in a self-renewing 20-day cookie as opposed to a 36-year cookie, in terms of Google's bottom line? Completely negligible -- unless you assume that there's more going on than we know about, and become a conspiracy theorist.

Google should overhaul their entire approach to social issues. They should get involved in discussions -- such as various social-policy committees at the American Library Association have been doing for decades -- that air these issues.

They should designate a privacy officer. They should try to organize an advisory committee of non-nerds who can recommend policies on data retention. They need to solicit advice on their legal position with respect to new laws that may affect them, including both copyright issues (opt-in for cache copies, for example), and access to Google records by law enforcement without probable cause. They should post their conclusions and policies, and invite comment. There should be more explicit guidelines for webmasters and a penalty appeal process.

Right now, they are still in a "let's see what we can get away with" mode, and they're silent and secretive about anything that's important, whether it's their algorithms or their public policies. So far this attitude has worked for them, primarily because they have a knee-jerk, cult-like following that believes Google can do no wrong. As one of the only survivors of the nerd-crash of the last few years, Google naturally enjoys the affection of all nerds. Google proves that nerds can make a difference.

Google is so important these days, that other search engines would immediately follow suit if Google woke up to these issues.

If for some reason (I can't imagine what that reason would be) Google is completely averse to considering these issues, then this could be a clue that Google needs watching. That's the entire purpose of trying to force these issues into the public consciousness.

przero2



 
Msg#: 5120 posted 7:06 pm on Sep 1, 2002 (gmt 0)

Everyman, just going by the Salon article, without an in-depth reading of your points of view, I would have dismissed your points easily. The more I read and understood your points of view in this forum and other forums on WebmasterWorld, the more I began to appreciate them and the soundness of your thought-provoking statements and just causes. I pray to God that Google considers some of these, makes sound, socially ethical business decisions, and grows out of the childish and nerdy ways of its founders ;)

cminblues

10+ Year Member



 
Msg#: 5120 posted 7:18 pm on Sep 1, 2002 (gmt 0)

I totally agree with Everyman's last post.

A big issue with the GoogleGuys is that they really don't seem to be touched at all by issues not strictly relevant to 'algo democracy'.

I think they're a very good bunch of hackers, and this is the reason Google works so fine, hehe.

But a little bit of P[ublic] R[elations] efficiency wouldn't be a bad idea.. :)


KMxRetro

10+ Year Member



 
Msg#: 5120 posted 7:27 pm on Sep 1, 2002 (gmt 0)

To quote Everyman: "what would be the harm in a self-renewing 20-day cookie as opposed to a 36-year cookie, in terms of Google's bottom line?"

It certainly would mean nothing to Google. But with that said, it wouldn't mean anything to us either. If the cookie self-renews and I boot up my PC every 20 days for the next 36 years (and Google is still around), won't I still get a new cookie at the end of the 36 years? Or will something magically stop it?

20-day self-renewing cookies would technically give Google the opportunity to store a cookie on your machine FOREVER. And it would still hold the same data.

At the end of the day, what can Google actually get out of us? So I search for a videogame site. So what? Sure, they can target a few ads towards me, but that's about it. Unless the cookie magically gets into my wallet, takes out £550 and buys all three of the major games consoles at Google's partner site, I'm not worried.
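[Editor's note: the expiry mechanics being debated above can be sketched in a few lines. This is purely illustrative; the `PREF=ID` cookie name echoes Google's cookie of that era, but the ID values and lifetimes here are invented, not Google's actual implementation.]

```python
from datetime import datetime, timedelta, timezone

def set_cookie_header(user_id: str, lifetime_days: int, now: datetime) -> str:
    """Build a Set-Cookie header carrying a persistent user ID."""
    expires = now + timedelta(days=lifetime_days)
    return (f"Set-Cookie: PREF=ID={user_id}; "
            f"expires={expires.strftime('%a, %d-%b-%Y %H:%M:%S GMT')}")

now = datetime(2002, 9, 1, tzinfo=timezone.utc)

# A "36-year" cookie is issued once and sits on disk until ~2038.
long_lived = set_cookie_header("abc123", 36 * 365, now)

# A 20-day cookie expires soon -- but if the server re-issues it with
# the SAME ID on every visit, the tracking window is just as unbounded.
renewed = set_cookie_header("abc123", 20, now + timedelta(days=19))

print(long_lived)
print(renewed)
```

This is exactly KMxRetro's point: as long as the same ID is re-issued at each renewal, a short expiry date changes nothing about how long a user can be tracked.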

PaulPaul

10+ Year Member



 
Msg#: 5120 posted 7:32 pm on Sep 1, 2002 (gmt 0)

I also agree with przero2.

But Everyman, this is the real world, and performing an operation as massive as Google's must be completely automated. If you introduced a human aspect to the search engines, you would need a warehouse the size of three Home Depots to house all the techs checking sites for the high-traffic keywords, and you would end up charging $300 per site just to pay your staff. If not, you would end up with a DMOZ-type site, which IMO would be long dead without the help of sites like Google. I personally never search for information using DMOZ, because I know I will get "no results found" back. That said, I haven't searched for anything informative on anything other than Google in at least the last 6 months, whereas before I used DogPile and Yahoo.

Why? Because information in Google is on topic and delivered immediately, and, most importantly, new sites and web pages are added often; as we know, normally within a month if not sooner. None of this would be possible with human intervention.

Automation also inevitably leads to some sort of abuse, and technology-savvy sites sometimes learn how to hack the algo. IMO, all that can be done is to continuously tweak the algo to keep up with the latest tricks. That way, over time, the algo should keep getting better and better.

IMO, this is why Google is what it is and the other search engines are what they are. Not to sound mean, Everyman, but why not hire an SEO company or individual to help get your site higher in the SEs? I have seen much worse sites rank very high. With some work, you might have achieved the success you were looking for. Or you could do it this way, I guess ;)

Paul

estjohn

10+ Year Member



 
Msg#: 5120 posted 7:58 pm on Sep 1, 2002 (gmt 0)

Everyman, in your last post and previous posts you state a few items almost everyone would agree with. Then you leap to several statements revealing a very bizarre, wrong-headed ideological root to your arguments.

Some of what you say is agreeable, such as the need for ethics in business, including at Google. I doubt anybody disagrees with that; it's a bit of a red herring. Also, I think most people would agree companies should be socially conscious; after all, that's good business: be considerate of the community you are doing business in (although I'm not sure that should be legislated).

Based on that, you then leap to this:

"Eventually, a FAST-type engine should be administered by a consortium of librarians who are protected civil servants of a world government. Or at least they ought to belong to the American Library Association, or something similar." April 13, Everyman, Webmasterworld

Given agreement on the assumption of ethics in business, I don't see why this necessitates a search engine being under a world government or a librarians' association. Maybe you can clear that up.

I doubt your "social insensitivity of nerds" theory could be backed up with any data. Speaking anecdotally, I strongly disagree: most geeks (nerds, as you call them) I know are actually very socially aware and conscious, more so than the average citizen. You are perpetuating a very ignorant, non-socially-conscious stereotype by saying that geeks are not social and only care about their math or programming problems.

They are often more scientific and logic-minded in their approach, but I don't think that precludes being ethical or socially conscious, do you?

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved