| 2:45 pm on May 2, 2007 (gmt 0)|
|A system, comprising: means for identifying a document that appears as a search result document for a plurality of discordant search queries; means for determining a score for the document; means for negatively adjusting the score for the document; and means for ranking the document with regard to at least one other document based, at least in part, on the negatively-adjusted score. |
Okay, so that's what Google's patenting... let me ask you guys this:
Does this prevent other people from developing a system that does the exact same thing? Or just from developing a system that does the same thing as what they're doing in the same fashion that they're doing it?
In other words, can other companies use (and patent) their own proprietary algorithms to score documents in a similar manner?
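For what it's worth, the claim reads like it could boil down to something along these lines (a toy sketch of one possible reading; the threshold and penalty numbers are invented, not from the patent):

```python
def rerank_score(doc_score, discordant_query_count, threshold=5, penalty=0.5):
    """Hypothetical reading of the claim: a document that surfaces for
    many unrelated ('discordant') queries gets its score negatively
    adjusted before being ranked against other documents."""
    if discordant_query_count > threshold:
        return doc_score * penalty  # the 'negative adjustment'
    return doc_score
```

Which would suggest the patent covers the general pattern, not any one scoring formula -- so a competitor's different "means for determining a score" might well fall outside it. Ask a lawyer, obviously.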
| 2:51 pm on May 2, 2007 (gmt 0)|
I am waiting for the highly complicated patent that can fgrep -v the PHPSESSID out of URLs... so no one has duplicate incoming links. That's a hard one. Gonna need a few PhDs, minimum...
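To illustrate just how "hard" that problem is, here's the whole thing in a few lines (a toy canonicalizer, not anything any engine actually runs):

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def strip_phpsessid(url):
    """Drop the PHPSESSID query parameter so session-tagged URLs
    collapse to one canonical form, instead of counting as
    distinct link targets."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.upper() != "PHPSESSID"]
    return urlunparse(parts._replace(query=urlencode(query)))
```

So `page.php?PHPSESSID=abc123&id=7` and `page.php?PHPSESSID=xyz789&id=7` both come out as `page.php?id=7`.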
| 2:58 pm on May 2, 2007 (gmt 0)|
|In other words, can other companies use (and patent) their own proprietary algorithms to score documents in a similar manner? |
I love the one where they patented human input into search results... the next one will be patenting the use of your eyes on the search results...
| 6:24 pm on May 2, 2007 (gmt 0)|
|Does this prevent other people from developing a system that does the exact same thing |
I can't see why others can't use similar techniques. After all, Latent Semantic Analysis is patented (and not by Google, Yahoo, or MSN), and everyone seems to feel it's okay to use "variants". Plus, who would know? Secret sauce and all that.
Anyway, how can you patent "ranking a site according to how much traffic it gets"? Alexa has been doing that for years.
| 7:14 pm on May 2, 2007 (gmt 0)|
|"ranking a site according to how much traffic it gets." |
Did they mention the whole self-fulfilling prophecy thing in that patent? If it's on page 1, it tends to get more traffic, more links, more popularity. The whole art industry lives off someone telling others that green splatters in this particular variation are great...
Well, I guess they are clever enough to somehow normalise the traffic out.
| 7:27 pm on May 2, 2007 (gmt 0)|
If a URL ranks on page 1 and gets less traffic than the benchmark for that position, it will fall. It works in the opposite direction, too. I think a factor like this is in play when brand-new pages and even new domains get a honeymoon period of excellent rankings, only to fall quickly.
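The idea above can be sketched in a few lines (purely hypothetical; the benchmark CTRs per position are made-up numbers, not anything Google has published):

```python
def adjust_position(position, observed_ctr, benchmark_ctr, step=1):
    """Toy version of traffic-based re-ranking: a result that draws
    less click-through than the benchmark for its slot falls one
    position; one that draws more rises."""
    if observed_ctr < benchmark_ctr[position]:
        return position + step          # underperforms its slot: falls
    if observed_ctr > benchmark_ctr[position]:
        return max(1, position - step)  # outperforms its slot: rises
    return position

# Invented benchmarks: expected CTR for positions 1-3
benchmarks = {1: 0.30, 2: 0.15, 3: 0.10}
```

A honeymoon period would then just be a new page placed high before any click data exists, falling once its observed CTR comes in under the slot's benchmark.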
| 7:32 pm on May 2, 2007 (gmt 0)|
That's what I meant :)
Yet what an effort to maintain a norm (hopefully a mean with variance) for each search term...
It would probably be better to use return times, adjusted for server speed.
| 7:58 pm on May 2, 2007 (gmt 0)|
|what an effort to have a normal (hopefully a mean with variation) on each search term... |
I'm pretty sure Google's already been doing that work for quite a few years, and that they keep it fresh, too. Probably not on "every" search term as a separate data point, but on many high volume searches -- and most likely classified by some system they evolved from significant parallels they noticed in their initial data set.
This kind of traffic data would also be important for them to measure user satisfaction with the SERPs at any time. If the current data after an algo tweak fell off from that generated by earlier benchmarks, that would be essential feedback.
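That kind of benchmark feedback could be as simple as a z-score check per query (a sketch under my own assumptions, not any known Google internals):

```python
from statistics import mean, stdev

def ctr_anomaly(history, current, z_cut=2.0):
    """Flag a query whose current click-through rate deviates from its
    historical mean by more than z_cut standard deviations -- e.g. a
    drop in clicks right after an algo tweak."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return False
    return abs(current - mu) / sigma > z_cut
```

Run over a classified bucket of high-volume queries, a burst of flagged drops after a tweak would be exactly the "fell off from earlier benchmarks" signal described above.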
| 10:23 pm on May 2, 2007 (gmt 0)|
On that topic, what is an average return-visitor percentage as measured in Analytics? I have 30% in a relatively wide field, which I thought was OK. I had worse in the past, so the general trend was up: it was 25% at 85,000 visitors a day and 300,000 page impressions a day. I am not entirely sure how high that return rate has to be to be deemed a non-spam experience.
We are now at roughly 6 pages per visitor, and that's on trusted users who come in directly; it was 5.6 at high traffic. Not sure what Google wants... CTR was the only value that decreased. More return visitors means more ad blindness toward your ad spots. Of these user-related values, only CTR and eCPM (which are obviously related) went down. So if the 950 on my site was user-related, it would more likely be linked to AdSense than to normal user satisfaction.
Interestingly, CTR started falling 60% two weeks before the latest Google "freak event". I always wondered whether Google, having so many AdWords to sell and having continuing profit increases of over 50%, doesn't rank the SERPs according to demand. Summer and spring are the times when students get less interested, so I assume the ad buyers know that too.
Ads are still 100% targeted...
I mean, how can you not work according to supply and demand...?
| 8:18 pm on May 3, 2007 (gmt 0)|
When were these patents submitted? A lot of these ideas have been thrown around here at WebmasterWorld, for example in the "signs of crap" thread. There were lots of ideas proposed for evaluating site quality. I distinctly remember discussions of tracking queries and click rates against a standard, for example.
They can't patent methods that have been previously described in a public forum.
| 10:45 pm on May 3, 2007 (gmt 0)|
callivert, I think these new patents are not just about the factors being measured (many are old news, as you mentioned) but more about the specific techniques involved and the methods of combining factors.
(Yes, IMO the US needs patent law reform - copyright law reform as well.)
| 6:55 pm on May 4, 2007 (gmt 0)|
|2. Query Analysis: which search result gets the click |
Page titles and meta descriptions just got more important. Tip: write your title and description as if you were writing an AdWords ad.