
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 77 message thread spans 3 pages; this is page 2.
Google Buys Search Algorithm Created by Israeli Student at Australian University
Orion algo rates the texts by quality of the page and the site
BroadProspect




msg:744623
 8:25 am on Apr 9, 2006 (gmt 0)

From the Israeli newspaper: [haaretzdaily.com...]

Search engine giant Google recently acquired an advanced text search algorithm invented by Ori Alon, an Israeli student... Orion, as it is called, which Alon developed with faculty, relates only to the most relevant textual results..."For example, if you search information on the War of Independence, you'll receive a list of related words, like Etzel, Palmach, Ben-Gurion," he explained. The text will only appear on the results page if enough words relevant to the search and the link between them is reasonable.

<quote reduced to 4 sentences
See Terms of Service [webmasterworld.com]>

[edited by: tedster at 5:56 pm (utc) on April 9, 2006]
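
The behaviour the quoted article describes (expanding a query with related words, and showing a text extract only when enough of those words actually appear with a reasonable link between them) can be sketched roughly as below. The related-terms table, the threshold, and the function names are invented for illustration; the real Orion algorithm has not been published.

```python
# Hypothetical sketch of the related-word expansion described in the article.
# The RELATED table and the min_hits threshold are made up for illustration.

RELATED = {
    "war of independence": ["etzel", "palmach", "ben-gurion"],
}

def expand(query):
    """Return the query plus any related words on file for it."""
    return [query.lower()] + RELATED.get(query.lower(), [])

def show_extract(text, query, min_hits=2):
    """Show a text extract only if enough related words actually appear."""
    body = text.lower()
    hits = sum(1 for term in expand(query) if term in body)
    return hits >= min_hits
```

Under this sketch, a page mentioning both "Palmach" and "Ben-Gurion" would qualify for an extract on a "War of Independence" search, while an unrelated page would not.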

 

walkman




msg:744653
 4:07 pm on Apr 10, 2006 (gmt 0)

>> Anybody knows what patents are worth these days? I'd bet he's gotten 5 figures tops.

You must be kidding, or at least you should be. Are all patents created equal?

aleksl




msg:744654
 6:16 pm on Apr 10, 2006 (gmt 0)

>> Anybody knows what patents are worth these days? I'd bet he's gotten 5 figures tops.

walkman: You must be kidding, or at least you should be. Are all patents created equal?

No, I am not kidding. A patent is worth a lot when it can make a business money or protect a business from competition. A patent on "some algorithm", unless it is an E=mc2 or DNA-level discovery, is not worth a penny unless it can be used to generate $$$. Companies are worth millions. Some subroutine that stacks words in proper order, no matter how fast, can't on its own be worth millions, by definition.

From where I sit, this looks like Google's legal team trying to create/protect the playing field by buying out a patent (pending, I understand) that they were late to file. Kinda like what Yahoo! was doing a while back to build up its patent portfolio. Which is nothing revolutionary.

Again, I may be waaay off.

walkman




msg:744655
 6:29 pm on Apr 10, 2006 (gmt 0)

>> No, I am not kidding. A patent is worth a lot when it can make a business money or protect a business from competition. A patent on "some algorithm", unless it is an E=mc2 or DNA-level discovery, is not worth a penny unless it can be used to generate $$$. Companies are worth millions. Some subroutine that stacks words in proper order, no matter how fast, can't on its own be worth millions, by definition.

Since you know the value, I must ask: have you seen the algo in question yourself? If not, how do you know its value? Also, if you're so smart, why don't you write a "subroutine that stacks words in proper order" and start your own Google? There were plenty of search engines when G started too, so don't let that stop you; you could make tens of billions of dollars (Google is worth $120.91 billion today, at 2:15 PM ET).

digitalghost




msg:744656
 6:52 pm on Apr 10, 2006 (gmt 0)

Anyone seen the patent application number? Why do journos always fail to ask? ;)

whoisgregg




msg:744657
 7:03 pm on Apr 10, 2006 (gmt 0)

also lists other topics related to that particular keyword

Hmm, this may conflict with some Gigablast technology. From [gigablast.com...] ...

Related Topics Dynamically generated on a per query basis. (aka Gigabits)

peter andreas




msg:744658
 9:53 pm on Apr 10, 2006 (gmt 0)

Just saw this story in The Sydney Morning Herald; it looks like the University of New South Wales (in Sydney) has been paid by Google, and they are speculating about how much this guy will get.

[smh.com.au...]

walkman




msg:744659
 10:04 pm on Apr 10, 2006 (gmt 0)

Hmmm...this might actually suck:

[blogs.smh.com.au...]
"My understanding is that the end user will actually be able to see the result of your research because part of what Orion is about is returning extra information to the searcher on the return search page.

Yes, that is correct.

7. Do web publisher have to fear this because it means that searchers won't have to click on their sites to get the information?"

Whitey




msg:744660
 10:44 pm on Apr 10, 2006 (gmt 0)

Again, just for the record, there is a fair amount of collaborative support for innovation amongst high-tech "hopefuls" in Australia and NZ. The infrastructure and education standards are comparable to anywhere in the world, and it's a pleasant environment to be working in!

Participants include the various universities, state and federal governments and, of course, businesses acting with researchers and entrepreneurs, where inputs can be a combination of early funding, cost offsets and expertise.

The successful projects are often brought to market with a background of "helpers" skilled in issues surrounding commercialisation such as IP protection, legals, fundraising and management. "Market" often involves going to the US, where the big money and development is.

From time to time, mention is made of several search engine developments which may or may not be complementary to established players.

Some examples/information on support can be seen at
[atp.com.au...]
[atp-innovations.com.au...]
[ausindustry.gov.au...]

I guess relevance will always be the key to a better search engine ... what happened to "Hilltop" and the use of "DMOZ" listings?

JollyK




msg:744661
 10:59 pm on Apr 10, 2006 (gmt 0)

Who cares about the money? He gets to work at Google!

How much is THAT worth? :-)

JK

annej




msg:744662
 11:38 pm on Apr 10, 2006 (gmt 0)

The bit about returning extra results concerns me. At the same time that Google will be watching how long people stay on a page, it sounds like this algo will get people moving through webpages faster, if they visit the web page at all.

oldpro




msg:744663
 12:37 am on Apr 11, 2006 (gmt 0)

A patent can be legally challenged as well as defended. The winner in such a contest is the one with enough greenbacks to outlast the other. Google patents algos not to obtain gov't-paid mercenaries to fight their battle for them...they only do so as a basis to tie up any arising competitor in the court system indefinitely. Whether it is Google or any other patent holder...it is nothing more and nothing less. Worth has nothing to do with it.

Granted, Orion is not a search engine, but what is a search engine without an algo? It is a major piece of the puzzle...money will take care of the hardware.

With that said...I still say 'too bad the kid sold out', but if I were in his shoes I would probably have done the same thing if it were enough money to sock away and live comfortably and peacefully the rest of my life.

Most likely his Orion will never see the light of day...If Google is the collective Einstein everybody thinks it is...they will deep-six it. What benefit are perfect search results to the PPC or CPC models?

Whitey




msg:744664
 3:36 am on Apr 11, 2006 (gmt 0)

Some more revelations on this subject :

advanced text search algorithm [smh.com.au]

Today he's working out of Google HQ in Mountain View, California, with the world at his feet.

While Mr Allon gets the credit, he doesn't own the process.

That is retained by the UNSW, which can expect to hear the sweet sound of a steady flow of royalty cheques if and when Orion becomes fully integrated into the primary Google search engine.

The project was funded by a $150,000 grant from the Australian Research Council, a Federal Government body that backs promising research projects.

The title of the project that was put up for funding is: RichProlog, a System for Deducing, Inducing and Learning in the Declarative Programming Paradigm.

According to ARC records, the aim of the project is to "contribute to bridge the gap between learning and logic, theoretically and practically", "extend considerably the scope of the declarative programming paradigm" and "build a system that can be used to solve learning or discovery problems as encountered in Artificial Intelligence".


Tastatura




msg:744665
 5:30 am on Apr 11, 2006 (gmt 0)

Hmm.. I am in the camp with people who, at this moment, are not happy about this (see below). Wouldn't this be a form of 'framing'?

as walkman pointed out:

Hmmm...this might actually suck:
[blogs.smh.com.au...]
"My understanding is that the end user will actually be able to see the result of your research because part of what Orion is about is returning extra information to the searcher on the return search page.

Yes, that is correct.

7. Do web publisher have to fear this because it means that searchers won't have to click on their sites to get the information?"


digitalghost




msg:744666
 5:49 am on Apr 11, 2006 (gmt 0)

No, but a lot of what I've read in this thread is speculation. ;) I can't find the patent application or specs, but speculating that Google wants to quash the work, or frame sites, or make it unnecessary to navigate to sites of origin is a bit extreme.

Did Google breed this climate of fear? Or are webmasters a naturally fearful bunch?

jimphilli




msg:744667
 6:03 am on Apr 11, 2006 (gmt 0)

Most likely his Orion will never see the light of day...If google is the collective Einstein as everybody thinks it is...they will deep six it. What benefit are perfect search results to the PPC or CPC models?

Oldpro, I think you got it right. They probably bought the algo primarily so that their competitors would not get it. Doing so secures their market position. After they feel secure, their next question is: "what does this do to increase revenue"? If the answer is that it might decrease revenue, then this will never see the light of day. Certainly they won't make any big changes fast.

They are number one right now, and don't want to screw anything up. They make a product which the public now likes much better than any of their competitors. So any changes will be slow in coming, at best.

oodlum




msg:744668
 8:46 am on Apr 11, 2006 (gmt 0)

[heraldsun.news.com.au...]

Instead of finding pages on the net that contain keywords, then providing links, the new search engine will provide expanded text extracts which will eradicate the need to open every link.

Dr Martin, who has supervised Mr Allon and helped develop Orion, said the search engine tool would make net surfing "much easier, and much less frustrating".

"You won't have to click and see if what you're after is in this webpage, and go back and forth again and again," he said.

"This will give the information directly and immediately. It will be a great time-saver for users."

Count me worried.
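
For what it's worth, the "expanded text extracts" quoted above would amount to something like the sketch below: returning a longer passage centred on the match instead of a short snippet. The window size and function name are invented; nothing about Google's actual implementation is public.

```python
# Illustrative only: an "expanded text extract" as a longer passage centred
# on the first occurrence of the search term. The 120-character window is an
# arbitrary choice for this sketch.

def expanded_extract(text, term, chars=120):
    """Return a passage of `chars` characters around the first match."""
    i = text.lower().find(term.lower())
    if i == -1:
        return ""
    start = max(0, i - chars // 2)
    return text[start:start + chars]
```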

Whitey




msg:744669
 11:11 am on Apr 11, 2006 (gmt 0)

It will be 18 months before they finish, and then Google has to decide whether, after all this work, they will use it.

Orion Algorithm [heraldsun.news.com.au]

"You won't have to click and see if what you're after is in this webpage, and go back and forth again and again," he said.

"This will give the information directly and immediately. It will be a great time-saver for users."

Dr Martin said the project, which started in March last year, would be finished in the next 12 to 18 months

As I asked before, does anyone know what happened to the Hilltop algorithm?

What I'm suggesting is that this algorithm may never be used, may be superseded, may be strategically purchased (i.e. taken out of the marketplace), or may need extensive ongoing research.

One thing we can be sure of: Google wants better relevance, faster search, cheaper independent methods of categorisation [DMOZ died long ago] and unique, quality content.

It also wants commercial sites to be encouraged to operate through a fee-paying process, e.g. AdSense, Google Shopping Carts.

So Google's march will be relentless in seeking to achieve dominance in this area, so as to ensure it is the user's first choice.

I think the webmaster's ongoing objective is to anticipate these challenges from a content/definition point of view, notwithstanding the constant upgrades and associated problems that these new levels bring.

tulkul




msg:744670
 12:19 pm on Apr 11, 2006 (gmt 0)

[abc.net.au...]

europeforvisitors




msg:744671
 12:26 pm on Apr 11, 2006 (gmt 0)

After they feel secure, their next question is: "what does this do to increase revenue"? If the answer is that it might decrease revenue, then this will never see the light of day.

Just because many webmasters are prone to short-term thinking doesn't mean Google is equally foolish. The answer to what this (or other new search technologies) might do to increase revenue is simple: by delivering better search results, it would help to preserve and enhance Google's leadership, reputation, and market share.

flicker




msg:744672
 1:06 pm on Apr 11, 2006 (gmt 0)

>"You won't have to click and see if what you're after is in this webpage,
>and go back and forth again and again," he said

I think that's a FANTASTIC idea. The only websites it's going to disadvantage are spam websites. If you have something to sell, the surfer will be MORE likely to click through to buy it if the full description of the item or service shows on the SERPs. If you have an informational website, a surfer who's really interested in the topic will be MORE likely to click through with a longer excerpt showing the breadth of what's on your site. Among serious users (the ones who are likely to actually buy, donate, research, click ads, etc.), a longer description of the site will only help.

All this will do is make it more likely that users will click on 'real' sites instead of spam. That's good news for all real webmasters, and certainly for searchers. The only problem I foresee is that the SERP page is going to be longer, so sites in the #8 or #9 position may get even less attention than before.

tulkul




msg:744673
 1:08 pm on Apr 11, 2006 (gmt 0)

As a professional librarian with a Masters in Information Management and Retrieval, I know that many libraries and librarian associations have been complaining strongly to Google about the relevance of a lot of their search results and the amount of spam appearing.

The last reply that we received was that Google is in the process of implementing relevance to address this issue.

Google will lose money if library patrons are advised to switch to another engine, as is the case in the organisation I work with; 1 million search queries a day (on a slow day) is not to be sneezed at. Lose the visitors because the information is not relevant, and you lose your advertising client base.

It's a simple statistic: as a researcher, I don't have the time to help clients wade through thousands of irrelevant results or spam .. I'll suggest something a little different.

What a lot of people lose sight of is that a very high percentage of search engine queries are from people who are relatively information illiterate.

This also goes for the people who write the web pages.

Christopher Wynter PhD MIM

aleksl




msg:744674
 1:54 pm on Apr 11, 2006 (gmt 0)

flicker: I think that's a FANTASTIC idea. The only websites that's going to disadvantage are spam websites. If you have something to sell, the surfer will be MORE likely to click through ... a longer description of the site will only help.

Yeah, and who needs to click on those spammy ads on the right...oh wait, Google will lose billions if THAT happens.

JuniorOptimizer




msg:744675
 2:00 pm on Apr 11, 2006 (gmt 0)

Christopher, I found that term "information illiterate" interesting. For increased clarity, please tell me how you define that term.

europeforvisitors




msg:744676
 2:00 pm on Apr 11, 2006 (gmt 0)

Yeah, and who needs to click on those spammy ads on the right...oh wait, Google will lose billions if THAT happens.

What's "spammy" about paid advertising that doesn't clutter the organic search results?

SEOwebGuy




msg:744677
 2:10 pm on Apr 11, 2006 (gmt 0)

I just received word of this news this morning. There were emails in my inbox about the possibility of Google buying it, and other emails saying it was done. To find out that there are six pages written here astounds me! By the time I write this, another two will already have been written.

The new algorithm looks like it will work with keyword modifiers and then do independent searches on those modifiers. So it looks like the larger websites that offer many resources are the ones that will benefit from this new algo.

I hope so, because my web design site offers web design, hosting, domain registration, SSL certificates, etc. If that is the case with the new algo, then building larger websites that cover all areas of the 'keyword theme' by creating web pages for each 'keyword modifier' would seem to be the way to go.
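
If the "keyword modifier" guess above is right, the behaviour might look like the sketch below: one independent sub-query per modifier of a keyword theme. The modifier list is invented; the thread is only speculating about how Orion works.

```python
# Speculative sketch of "independent searches on keyword modifiers".
# The MODIFIERS list is made up for illustration.

MODIFIERS = ["hosting", "templates", "ssl certificates"]

def subqueries(theme):
    """Build one independent sub-query per modifier of a keyword theme."""
    return [f"{theme} {modifier}" for modifier in MODIFIERS]
```

A site covering every sub-query of its theme would then, on this guess, match more of the independent searches than a single-topic page.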

If anyone knows the new patent information, let us know. I am going to contact someone who is known for scouring the patent office for the latest Google patents as a hobby. If I come across anything, I will let you guys know.

Michael Rock

tulkul




msg:744678
 12:09 pm on Apr 11, 2006 (gmt 0)

If you have a look at the following article for reference
[unsw.edu.au...]

It was announced today by the University of New South Wales (Sydney, Australia), in a public statement on local television news, that Google had made a substantial financial contribution to the Orion Project and that the person who designed the algorithm and the software is now working for Google.

Amongst other things, this software ranks sites according to the frequency of the search term in context only, relates it to related categories and subcategories, and presents the results as a choice of related clusters.

This software has the facility to look at a page for stuffing, and it eliminates from the results all sites that are just harvested directories.

It also has the capacity to rank on authority according to the "intelligence" of the context, by being able to compare each result with all other results for the same phrase search.

According to the professor of the department responsible, there are other significant features of this software, in that the algorithm is able to adjust itself as the crawled sites build a more complete contextual reference picture of the 'search term' in terms of all of the published information on that subject.

With this feature, it is going to become almost impossible for SEO to predict how site changes will affect position or ranking, because the index will be self-building in real time.
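
A rough sketch of "frequency of the search term in context" as described above: count only those occurrences of the term that sit near a known context word, so raw keyword stuffing scores zero. The context words, window size and tokenisation here are invented; nothing in this sketch reflects the actual Orion code.

```python
# Illustrative only: count term occurrences that appear near a context word,
# rather than raw keyword counts. Window size and context words are made up.
import re

def in_context_count(text, term, context_words, window=3):
    """Count occurrences of `term` with a context word within `window` tokens."""
    tokens = re.findall(r"\w+", text.lower())
    hits = 0
    for i, token in enumerate(tokens):
        if token == term:
            nearby = tokens[max(0, i - window): i + window + 1]
            if any(word in nearby for word in context_words):
                hits += 1
    return hits
```

On this sketch, a stuffed page with no supporting vocabulary scores zero, while a page using the term in real sentences alongside related words scores higher.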

There was no word when Google was intending to roll out this new software, but the feeling was that it is highly possible it will be dropped on an unsuspecting world without prior announcement.

The only indication to the user will be that the search results page (and we were shown a screenshot) is laid out very differently from the present one, and shows its results as the search term in context, with sub-pages showing all of the categories in which the term appears, again with the results shown in context.

Having followed this forum for a while, I can imagine that there will be a whole lot of speculation about how to manipulate it and what effects it will have -- even when it will appear.

Knowing the work of this University, the software will probably be pretty good (I've worked and studied there).

Also knowing how the 'system' works, this roll-out may be a lot closer than people think .. and I would suspect that a lot of the small changes we've been seeing in Google have actually been testing various aspects of the software and building up the index.

By profession, I am a Librarian with a Masters in Information Management and Information Retrieval. From this background and my experiences with Library Patrons and Researchers trying to use Google, I was able to recognise a lot of what was revealed in the interview.

I also know that there has been a lot of feedback to Google from libraries and educational institutions about the relevance of search results. My last such correspondence from them suggested that "steps are currently underway to rationalise results according to relevance and main subject categories".

This may well mean that something like one of the Library Cataloging systems may be used as part of the new index, much as you would first index a book by title, author, publisher, broad category, category, subject sub-subject and then key words.

So .. gone may be the days of "PageRank" as it is known in its present form .. and gone may be the days of link relevance.

Christopher Wynter PhD MIM

[edited by: tedster at 7:32 am (utc) on April 14, 2006]

flicker




msg:744679
 3:35 pm on Apr 11, 2006 (gmt 0)

aleksl, I don't think I've ever seen a spam website in Google's paid listings. Have you? Spam websites are cheap, mass-produced, and have no useful content. They live off of free SE results and people clicking on them by accident. Making it easier for searchers to avoid them benefits everybody: the searchers, webmasters of real sites, and Google.

Anyone who has a real website should welcome the idea of a longer excerpt from their site appearing on Google search. It means that much more chance to convince a user to click on your site. That much more description of your product to put them in the mood to buy it... that much more explanation of why you know what you're talking about and why they should hire you... that much more demonstration of the depth of content your site offers. It's a great chance for the good sites to make themselves stand out over worse sites listed on the same page. And a good opportunity for SEOs to make themselves invaluable in helping customers with real websites put their best foot forward in an ethical but effective way, too!

jimphilli




msg:744680
 5:35 pm on Apr 11, 2006 (gmt 0)

The answer to what this (or other new search technologies) might do to increase revenue is simple: By delivering better search results, it would help to preserve and enhance Google's leadership, reputation, and market share.

I agree with that, but G has to be very careful to strike a balance between making a better site, and keeping their revenue stream intact. Because of this, I would be very surprised if they made any drastic sweeping changes without first testing them one at a time.

If it were MSN it would be a different story, they have much less to lose. But G is the market leader, and I think that for them to make changes slowly is far from foolish; it is the prudent way to go.

tulkul




msg:744681
 8:55 pm on Apr 11, 2006 (gmt 0)

It is wise to remember in context ...

the guy who invented and patented the Orion algorithm was an immigrant to Australia and, for him, English was a second language ... and the Culture was very different from his ethnic and religious origins.

"Junior Optimizer" asked me what I meant by the term "Information Literacy"

I agree .. it's a very interesting and complex term, and one in which Google itself has now invested a lot of time and cash ... the psychology of the searcher.

I used the phrase in terms of both "authors" and "seekers" of information.

My reply is in terms of what people come to me for .. information. Some may want to research "blue widgets" because they want to buy one; some may come to me for the same information because they want to find out the history of blue widgets; yet others may want to know the technical aspects of blue widgets.

So far, so good .. identify what the information seeker wants the information for.

Now .. is this researcher a 90-year-old granny (and there are a lot of them using the internet) looking to see how a "blue widget" is going to assist her in coping with incontinence .. or is this researcher a 15-year-old doing a school project on "the history and development of blue widgets and their application in improving the lifestyle of 90-year-old grannies" .. and there are a lot of these as well.

For the webmaster, author ... who is the target audience .. or .. are you just trying to make money on the sale of blue widgets?

Google is becoming more interested in a site that is going to be complete in providing the sort of information that will assist all demographics (cross-sections) of people researching blue widgets. This is part of what they mean by the term "authoritative".

In other words, just cramming a site with the keywords "blue widgets" will no longer have relevance over a site which covers all aspects of "blue widgetry".

These are just two possibilities; there are other demographics who want to find out how to get bulk quantities of "blue widgets" so they can supply them in their local area; yet another group wants to find out how they can advise little old ladies who come into their shop looking for information on "blue widgets".

Here we come to another area of website presentation; we conducted a survey of internet searchers using our own website and came up with a very interesting statistic ... a large number of people don't really understand what they are reading because of the words used. You DO NOT make yourself authoritative on a subject by using words and language that only a "blue widget geek" would understand. You need to be able to write in PLAIN ENGLISH so that the highest percentage of people looking for the information will be drawn to your site.

PLAIN ENGLISH has another aspect to both the author and the webmaster .. and Google is recognising this.

Many of us take English for granted as we learned it as babies from parents who spoke English from parents who ...

BUT, in today's society, English is a second language for more than 50% of people. If you describe "blue widgets" in "blue widget geek-centred language" so you seem like an expert in "blue widgets", what is your site going to tell an immigrant from lower Calathumpia who learned English from reading the travel guide or by conversing with the checkout chick down at the local supermarket?

AND then .. what if there are religious implications concerning the application and use of "blue widgets" .. they might be OK in terms of the '#*$!-ers' but the 'bbb' people might be highly offended.

An .edu TLD is going to present "blue widgets" in a very different manner than a .com if I am looking for information on a drug, surgical procedure or clinical application.

As I handle an enquiry from a patron, I have to take all of these things into account when someone is standing in front of me looking for "blue widgets".

When I research 'blue widgets' for the patron, I will click on a link that is going to offer "blue widgets" in the context of the patron ... and it is not uncommon for me to drill down through five or six pages until a Google result provides me with the information in context.

Google is starting to recognise this. Their algorithm must change to reflect the information in the manner that humans are requesting it .. not in terms of the webmaster author who wants to trick the SE to get their site on the first result page.

In plain, simple terms ... Google has become very conscious of the usage patterns and psychology of searchers, because they KNOW that if they do not provide what people are looking for, someone else will; they will lose their audience, their base will go down, and they will lose revenue because fewer and fewer people will see their advertisements.

Exactly the same strategy will also start to apply to advertising; it must fit into the context of selling INFORMATION.

Google (and all of the other SEs) are only in it because they can sell INFORMATION; they are not interested in selling "blue widgets" to other "blue widget" geeks, who make up less than 0.1% of the people who are looking for information about 'blue widgets'.

So, they are looking to sort the information they provide by relevance to the largest group of people who are typing 'blue widgets' into the search box, categorising it in the terms of reference I've outlined here ... and more.

This is what Orion is all about ...

activeco




msg:744682
 10:43 pm on Apr 11, 2006 (gmt 0)

What a lot of people lose sight of is that a very
high percentage of search engine queries are from
people who are relatively information illiterate.

Right.
And the world turns around those very same people.
Search engines are not exceptions. The search results are aimed at them.

P.S. I was too fast. I just read your last sentences, which basically say the same thing.

Kirby




msg:744683
 11:41 pm on Apr 11, 2006 (gmt 0)

Their algorithm must change to reflect the information in the manner that humans are requesting it .. not in terms of the webmaster author who wants to trick the SE to get their site on the first result page.

Very good point, Christopher.

Bank tellers used to be taught that the best way to detect counterfeit money was to know the real stuff so well that the fakes stood out.

If Google improves how they determine relevance, then the flip side is a reduction in spam.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved