| 8:00 pm on Feb 25, 2003 (gmt 0)|
>>>paragraph which states that all his inventions belong to Google
Well, then that means it's theirs, doesn't it?
If what he says isn't directly related to Google, that doesn't mean it can't be applied to their results. This patent calls for the reranking of results based on query dependency, which can mean "take a Google query and rerank it."
| 9:20 pm on Feb 25, 2003 (gmt 0)|
So, what is the consensus? What should we be looking at in addition to all the other factors we deal with? Local Rank. How do we adjust to it the best way? Any thoughts? Paragraph 0037 seems to indicate that if a page has received a significant increase in visitors during a given period of time, like a week, then that page will rise to the top of the reshuffled pages. Is that how you see it? What can we do to affect LR in a positive way, please?
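To make the reranking idea concrete, here is a minimal sketch of what the patent's LocalRank step could look like: take the ordinary top results for a query, count the "votes" each page receives from other pages in that same result set (ignoring same-host links), and blend that LocalScore with the original ranking score. Everything here — the function names, the 50/50 blending weight, and the normalization — is an illustrative assumption, not Google's actual implementation.

```python
def local_rank(results, links, hosts, weight=0.5):
    """Hypothetical LocalRank-style rerank of a result set.

    results: list of (url, old_score) tuples, ordered by old_score.
    links:   set of (from_url, to_url) pairs among the result set.
    hosts:   dict mapping url -> host (same-host links are discounted).
    weight:  assumed blending factor between old score and LocalScore.
    """
    urls = [u for u, _ in results]
    in_set = set(urls)
    local = {}
    for u in urls:
        # Votes from OTHER result-set pages hosted elsewhere.
        voters = {v for (v, t) in links
                  if t == u and v in in_set and hosts[v] != hosts[u]}
        local[u] = len(voters)
    max_local = max(local.values()) or 1   # avoid division by zero
    max_old = max(s for _, s in results) or 1
    # Blend the normalized old score with the normalized LocalScore.
    return sorted(
        results,
        key=lambda p: (1 - weight) * (p[1] / max_old)
                      + weight * (local[p[0]] / max_local),
        reverse=True,
    )
```

On this reading, a lower-ranked page that many other result-set pages link to can jump over pages with higher original scores — which is exactly the "reshuffling" being discussed.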
| 12:03 am on Feb 26, 2003 (gmt 0)|
SEO will become more complicated if this patent is implemented (I don't think it's in Google yet).
So basically, to get into the top 10 you would need control over the linking of at least 10-20 of the top 100 sites. So to push one site you should own 10-20 sites on the same subject, all with "traditional" high PR and each hosted separately...
I can clearly see sophisticated linking empires emerging in the future :)
| 3:35 am on Feb 26, 2003 (gmt 0)|
I think it only becomes more complicated if you've been collecting links with a buckshot mentality. For those of us who have been focusing on truly relevant keyword links all along, it only puts us ahead of the curve for once...
The only thing I'm trying to get a handle on is: would this give topical reciprocal links the highest stature? I would think that is sort of anti-web, and serves to eliminate the branching out that I would assume makes for a more natural linking pattern.
| 4:37 am on Feb 26, 2003 (gmt 0)|
OK, I get what Gopi is saying! I already advised my client to purchase independently hosted new sites in the same type of business. And any new clients I get will be required to begin with 10 sites, not one. If they say no, out the door they go. Gopi is right. Staying at the front is going to require what Gopi is describing IF Google implements the patent. No mirror sites for me; each will have to be completely independent. If this patent LR deal is implemented, I want to be ready. If customers do not understand, I do not want them. This is going to be really tough if it comes down the pike.
| 4:52 am on Feb 26, 2003 (gmt 0)|
"If customers do not understand, I do not want them. "
Send them to me, I'll take them.
| 2:15 pm on Feb 26, 2003 (gmt 0)|
Ha, yes, you can have the ones who sit here for 2 and 1/2 hours wanting to change the color of the print from one shade of blue to "just a tiny bit lighter, please..." They doubtless will fail to understand the new patent IF Google decides to use it... I have no time for such people. Glad you do!
| 2:38 pm on Feb 26, 2003 (gmt 0)|
Let's forget customers for a while ;)
Back to the patent and its workings: never mind when the patent was filed, and never mind whether it can be done on the fly computationally or is precomputed on a limited scale for the most-used search queries; there is some sense to this stuff.
I just really wonder how they would weed out the artificial reciprocal linkage.
I would start by discounting links from links/resources pages.
| 2:55 pm on Feb 26, 2003 (gmt 0)|
This could explain a thing or two.
From the point of view of a niche ISP, such a system would be a nightmare. Hosting companies use a limited number of class C networks. If an ISP markets its services to, say, widget producers, and gets a fairly big chunk of that market, this system may end up sending most of its customers into Google irrelevancy: only one site will show, while the rest of the competitors, sitting in the same class C network and unaware of the problem, would be mistakenly flagged as "same host or an affiliated" and consequently delisted.
I'm afraid niche ISPs may have become collateral damage in Google's war against spam.
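The affiliation test being worried about here is easy to state: two hosts count as "affiliated" if their IPs fall in the same class C block, i.e. they share the first three octets of a dotted-quad IPv4 address. A trivial sketch (the function name is mine, not the patent's):

```python
def same_class_c(ip_a, ip_b):
    """True if two IPv4 addresses share a class C block (first 3 octets).

    This mirrors the patent's host-affiliation heuristic as discussed in
    the thread; it is an illustration, not Google's code.
    """
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]
```

Under that test, every customer a hosting company puts on the same /24 would vote-discount every other customer on it, which is the niche-ISP problem described above.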
| 3:09 pm on Feb 26, 2003 (gmt 0)|
They can be pretty liberal with how they choose to set up the filter, as it's just one part of the ranking picture and is designed to increase the relevance of a page based upon another relevant page's vote for it. There's no penalty for crosslinking; there's just no bonus.
It would be easy and effective to just look at it this way:
Page A comes up in the subset of B but no others.
Page B comes up in the subset of A but no others.
Therefore, page A and B are crosslinked and they are each removed from each other's LR calculation. (This could be easily done over a range of many sites that all link to each other, but not others that are in the main set.)
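The A/B case above reduces to a simple rule: if two result-set pages link to each other, neither link counts as a LocalRank vote. A minimal sketch of that filter (a simplification of the poster's "but no others" condition, and purely illustrative — the patent itself only describes host/IP-level filtering):

```python
def filter_crosslinks(links):
    """Drop reciprocal link pairs from a result set's link graph.

    links: set of (from_url, to_url) pairs among the result set.
    Returns the set with both directions of any mutual pair removed,
    so crosslinked pages no longer vote for each other.
    """
    return {(a, b) for (a, b) in links if (b, a) not in links}
```

Extending this to rings of three or more sites that all link to each other would mean detecting cliques in the link graph rather than just mutual pairs, which is presumably why no one outside Google can verify exactly where the line is drawn.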
Bear in mind that even though they've been using this since November or December of last year, no one has really optimized for it. Since there's no way of visually verifying exactly what the LR factor is for any page on any search, they could very well do many things that people wouldn't even suspect or notice (i.e. ANY pages that appear in each other's subset don't get counted, or whatever).
NOTE: The patent (according to image 3) says that the filtering is done only on an IP level. That could very well be the case, but I doubt it. The patent outlines the technology's foundation, not necessarily the current implementation of it.
| 3:31 pm on Feb 26, 2003 (gmt 0)|
> if a page has received a significant increase in visitors during a given period of time, like a week, then that page will rise to the top of the reshuffled pages.
So how are visitors counted? Surely the only thing Google can check is click-through from Google. I doubt they would be allowed to check your web-logs.
| 4:00 pm on Feb 26, 2003 (gmt 0)|
Counting visitors using your web logs is not an option. Just a guess: the Google toolbar with 'Advanced Features' (toolbar PageRank) enabled.
| This 42 message thread spans 2 pages |