New Google Patent - About reranking results
Another WebmasterWorld Exclusive!
msgraph
msg:1232430
1:40 pm on Feb 25, 2003 (gmt 0)

Ranking search results by reranking the results based on local inter-connectivity [patft.uspto.gov]

Inventors: Krishna Bharat [searchwell.com]
Assignee: Google, Inc.

A re-ranking component in the search engine then refines the initially returned document rankings so that documents that are frequently cited in the initial set of relevant documents are preferred over documents that are less frequently cited within the initial set.
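Reading just that abstract, the core loop might look something like the following minimal Python sketch of the idea, not Google's implementation. The `initial_results` list and `links_to` link graph are invented names, and the patent itself blends the new local score with the old ranking rather than simply re-sorting:

# A minimal sketch of the abstract's idea, assuming we already have the
# initial ranked results and a link graph restricted to those documents.

def local_rerank(initial_results, links_to):
    """Prefer documents frequently cited within the initial result set.

    initial_results: list of document ids, best first, from the
                     normal ranking.
    links_to:        dict mapping a document id to the set of ids
                     it links to.
    """
    result_set = set(initial_results)

    def local_citations(doc):
        # Count links to `doc` coming only from OTHER documents
        # inside the initial result set.
        return sum(
            1
            for other in result_set
            if other != doc and doc in links_to.get(other, set())
        )

    # Sort by local citation count, breaking ties with the original
    # order so uncited documents keep their old relative ranking.
    original_pos = {doc: i for i, doc in enumerate(initial_results)}
    return sorted(
        initial_results,
        key=lambda doc: (-local_citations(doc), original_pos[doc]),
    )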

 

msgraph
msg:1232460
8:00 pm on Feb 25, 2003 (gmt 0)

>>>paragraph which states that all his inventions belong to Google

Well, then that means it is theirs, doesn't it?

Even if what he says isn't directly related to Google, that doesn't mean it can't be applied to their results. This patent calls for reranking results based on query dependency, which can mean "take a Google query and rerank it."

xerxes
msg:1232461
9:20 pm on Feb 25, 2003 (gmt 0)

So, what is the consensus? What should we be looking at in addition to all the other factors we deal with? Local Rank. How do we best adjust to it? Any thoughts? Paragraph 0037 seems to indicate that if a page has received a significant increase in visitors during a given period of time, like a week, then that page will rise to the top of the reshuffled pages. Is that how you see it? What can we do to affect LR in a positive way?

gopi
msg:1232462
12:03 am on Feb 26, 2003 (gmt 0)

SEO will become more complicated if this patent is implemented (I don't think it's in Google yet).

So basically, to get into the top 10 you would need control over the linking of at least 10-20 sites among the top 100. To push one site, you would need to own 10-20 sites on the same subject, all with "traditional" high PR and all hosted separately...

I can clearly see sophisticated linking empires emerging in the future :)

mat_bastian
msg:1232463
3:35 am on Feb 26, 2003 (gmt 0)

I think it only becomes more complicated if you've been collecting links with a buckshot mentality. For those of us who have been focusing on truly relevant keyword links all along, it only puts us ahead of the curve for once...

The only thing I'm trying to get a handle on is: would this give topical reciprocal links the highest stature? I would think that is sort of anti-web, and it eliminates the branching out that I assume makes for a more natural linking pattern.

xerxes
msg:1232464
4:37 am on Feb 26, 2003 (gmt 0)

OK, I get what Gopi is saying! I already advised my client to purchase independently hosted new sites in the same type of business. And any new clients I get will be required to begin with 10 sites, not one. If they say no, out the door they go. Gopi is right: staying at the front is going to require what he describes IF Google implements the patent. No mirror sites for me; each will have to be completely independent. If this LR patent is implemented, I want to be ready. If customers do not understand, I do not want them. This is going to be really tough if it comes down the pike.

yankee
msg:1232465
4:52 am on Feb 26, 2003 (gmt 0)

"If customers do not understand, I do not want them. "

Send them to me, I'll take them.

xerxes
msg:1232466
2:15 pm on Feb 26, 2003 (gmt 0)

Ha, yes, you can have the ones who sit here for two and a half hours wanting to change the color of the print from one shade of blue to "just a tiny bit lighter, please..." They will doubtless fail to understand the new patent IF Google decides to use it... I have no time for such people. Glad you do!

vitaplease
msg:1232467
2:38 pm on Feb 26, 2003 (gmt 0)

Let's forget customers for a while ;)

Back to the patent and its workings: never mind when the patent was filed, and never mind whether the reranking can be done on the fly computationally or is precomputed on a limited scale for the most-used search queries; there is some sense to this stuff.

I just really wonder how they would weed out the artificial reciprocal linkage stuff.

I would start with discounting links from links/resources pages.

Marcos
msg:1232468
2:55 pm on Feb 26, 2003 (gmt 0)

This could explain a thing or two.

From the point of view of a niche ISP, such a system would be a nightmare. Hosting companies use a limited number of Class C networks. If the ISP markets its services to, say, widget producers, and gets a fairly big chunk of the market, this system may end up consigning most of its customers to Google irrelevancy: only one site will show, while the rest of the competitors, sitting in the same Class C network and unaware of the problem, would be mistakenly flagged as "same host or an affiliate" and consequently delisted.

I'm afraid niche ISPs may have become collateral damage in Google's war against spam.
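To make that worry concrete, here is a toy Python version of the "same Class C" test as Marcos reads it. The helper names and IP addresses are invented for illustration, and the patent may only discount affiliated votes rather than delist results outright:

def class_c_prefix(ip):
    # First three octets of a dotted-quad IPv4 address,
    # e.g. "216.46.7.10" -> "216.46.7".
    return ".".join(ip.split(".")[:3])

def filter_affiliated(results):
    # Keep only the best-ranked result from each Class C network;
    # results is a best-first list of (document_id, ip_address) pairs.
    seen_prefixes = set()
    kept = []
    for doc, ip in results:
        prefix = class_c_prefix(ip)
        if prefix in seen_prefixes:
            continue  # same Class C as an earlier hit: treated as affiliated
        seen_prefixes.add(prefix)
        kept.append((doc, ip))
    return kept

# Every widget producer hosted at the same niche ISP shares 216.46.7.x,
# so only the first one survives:
results = [
    ("widgets-r-us.example", "216.46.7.10"),
    ("widget-pro.example", "216.46.7.11"),    # dropped
    ("other-host.example", "198.51.100.5"),
    ("widget-king.example", "216.46.7.12"),   # dropped
]
print(filter_affiliated(results))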

Grumpus
msg:1232469
3:09 pm on Feb 26, 2003 (gmt 0)

They can be pretty liberal with how they choose to set up the filter, as it's just one part of the ranking picture and is designed to increase the relevance of a page based upon another relevant page's vote for it. There's no penalty for crosslinking; there's just no bonus.

It would be easy and effective to just look at it this way:

Page A comes up in the subset of B but no others.
Page B comes up in the subset of A but no others.

Therefore, page A and B are crosslinked and they are each removed from each other's LR calculation. (This could be easily done over a range of many sites that all link to each other, but not others that are in the main set.)
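A quick Python sketch of that mutual-subset test; the `subset_of` mapping is an invented stand-in for whatever data Google actually keeps:

def find_crosslinked_pairs(subset_of):
    # subset_of maps each page id to the set of page ids in its
    # LocalRank subset (the pages voting for it).
    # First invert it: appears_in[p] = pages whose subset contains p.
    appears_in = {}
    for page, subset in subset_of.items():
        for member in subset:
            appears_in.setdefault(member, set()).add(page)

    pairs = set()
    for a, hosts in appears_in.items():
        if len(hosts) == 1:
            (b,) = hosts
            # A appears only in B's subset and B only in A's: treat
            # the pair as crosslinked and drop each from the other's
            # LR calculation.
            if appears_in.get(b) == {a}:
                pairs.add(frozenset((a, b)))
    return pairs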

Bear in mind that even though they've been using this since November or December of last year, no one has really optimized for it. Since there's no way of visually verifying exactly what the LR factor is for any page on any search, they could very well do many things that people wouldn't even suspect or notice (e.g. ANY pages that appear in each other's subset don't get counted, or whatever).

NOTE: The patent (according to image 3) says that the filtering is done only on an IP level. That could very well be the case, but I doubt it. The patent outlines the technology's foundation, not necessarily the current implementation of it.

G.

kapow
msg:1232470
3:31 pm on Feb 26, 2003 (gmt 0)

> if a page has received a significant increase in visitors during a given period of time, like a week, then that page will rise to the top of the reshuffled pages.

So how are visitors counted? Surely the only thing Google can check is click-throughs from Google. I doubt they would be allowed to check your web logs.

takagi
msg:1232471
4:00 pm on Feb 26, 2003 (gmt 0)

Counting visitors using your web log is not an option. Just a guess: the Google Toolbar with "Advanced Features" (toolbar PageRank) enabled.

It would not be a violation of the Toolbar Privacy Policy if Google used that information for reranking.
