Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 189 message thread spans 7 pages: < < 189 ( 1 2 [3] 4 5 6 7 > >     
New Google Patent Details Many Google Techniques
msgraph

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28814 posted 3:47 pm on Mar 31, 2005 (gmt 0)

Probably one of the best bits of information released by them in a patent.

Large number of inventors listed on here, even Matt Cutts, that guy who attends those SE conferences. It explains a bit about what is already known through experience, as well as comments made by search engine representatives.

Example:


[0039] Consider the example of a document with an inception date of yesterday that is referenced by 10 back links. This document may be scored higher by search engine 125 than a document with an inception date of 10 years ago that is referenced by 100 back links because the rate of link growth for the former is relatively higher than the latter. While a spiky rate of growth in the number of back links may be a factor used by search engine 125 to score documents, it may also signal an attempt to spam search engine 125. Accordingly, in this situation, search engine 125 may actually lower the score of a document(s) to reduce the effect of spamming.

USPTO version [appft1.uspto.gov]

< Note: the USPTO has at times either moved or removed this
patent. If that happens again, here's an online back-up copy:
Information retrieval based on historical data [webmasterwoman.com]>

[edited by: tedster at 3:04 am (utc) on April 10, 2008]
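The [0039] example above amounts to comparing link growth rates rather than raw link counts. A rough sketch of that idea (my own illustration, not Google's actual formula):

```python
# Sketch only: score a document's backlink profile by its rate of link
# growth, i.e. backlinks accumulated per day since the inception date.

def link_growth_rate(backlinks: int, age_days: int) -> float:
    """Backlinks per day since the document's inception date."""
    return backlinks / max(age_days, 1)

# The patent's example: 10 links in 1 day vs. 100 links over 10 years.
new_doc = link_growth_rate(backlinks=10, age_days=1)      # 10.0 links/day
old_doc = link_growth_rate(backlinks=100, age_days=3650)  # ~0.03 links/day

assert new_doc > old_doc  # the newer document wins on this signal
```

Note the paragraph's second half: the same spiky growth that boosts a document may instead be read as spam, so the sign of the adjustment is situational.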

 

nuevojefe

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28814 posted 12:05 am on Apr 1, 2005 (gmt 0)

16. The method of claim 15, wherein the scoring the document includes assigning a higher score to the document when the document is selected more often than other documents in the set of search results over a time period.


How can this be done?
Is Google going to implement 302 exit link counters?

But, are they tracking the clicks on SERPS already?

Yep. [webmasterworld.com...]

(3) the extent to which the advertisements generate user traffic to the documents to which they relate (e.g., their click-through rate). Search engine 125 may use these time-varying characteristics relating to advertising traffic to score the document.

I heard a few posts here relating this to AdSense CTR, but...

It seems they are referring to rating a page based on the amount of traffic it generates to its (affiliate) advertisers' links. That makes sense on some levels (spamming high-traffic keywords with off-topic aff ads) but not at all on others (forums and such will have low CTR no matter the relevance of the aff ads, because people don't want to leave via ads).

Excellent point. Another hole in the "can't be hurt by competitors" claim.

Lot of leaks to plug in that one...

Anything I missed?

Pretty comprehensive there, SOD ;-)

On second impression it sounds just like a long "wish list" of sorts.

Yeah, looks like the "you're dreaming" lists of button-pushing tools I give to my programmers.

[edited by: nuevojefe at 12:11 am (utc) on April 1, 2005]
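Claim 16 quoted above — scoring a result higher when it is clicked more often than its peers over a time period — could be sketched like this. All names and the blending weight here are my own assumptions, not anything from the patent:

```python
# Hypothetical sketch: boost each result's score in proportion to its
# share of observed clicks across the result set over some period.

def rescore_by_clicks(scores: dict, clicks: dict, weight: float = 0.2) -> dict:
    """Blend a base relevance score with observed click share."""
    total = sum(clicks.get(doc, 0) for doc in scores) or 1
    return {
        doc: base * (1 + weight * clicks.get(doc, 0) / total)
        for doc, base in scores.items()
    }

serp = {"a.example": 1.0, "b.example": 0.9}
observed = {"a.example": 20, "b.example": 80}  # b gets most of the clicks
rescored = rescore_by_clicks(serp, observed)   # b can overtake a
```

As for where the click data comes from: redirect-counting on SERP links or toolbar data would both suffice, which is what the posts below speculate about.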

boredguru

10+ Year Member



 
Msg#: 28814 posted 12:07 am on Apr 1, 2005 (gmt 0)

Using the time-varying behavior of links to (and/or from) a document, search engine 125 may score the document accordingly. For example, a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document's score. Conversely, an upward trend may signal a "fresh" document (e.g., a document whose content is fresh--recently created or updated) that might be considered more relevant, depending on the particular situation and implementation.

Link buyers beware. You can't go on increasing your links artificially forever.
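The stale/fresh trend in the passage above boils down to comparing the rate of new links in a recent window against an older window. A minimal sketch, with the clamp bounds being my own assumption:

```python
# Sketch of the stale/fresh signal: ratio of recent new links to older
# new links, clamped so one spike can't dominate the overall score.

def freshness_adjustment(recent_links: int, older_links: int) -> float:
    """Returns > 1.0 to boost (fresh), < 1.0 to demote (stale)."""
    ratio = recent_links / max(older_links, 1)
    return min(max(ratio, 0.5), 2.0)

assert freshness_adjustment(recent_links=40, older_links=10) == 2.0  # fresh
assert freshness_adjustment(recent_links=2, older_links=20) == 0.5   # stale
```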

techrealm

10+ Year Member



 
Msg#: 28814 posted 12:21 am on Apr 1, 2005 (gmt 0)

I am in the April 1st crowd on this one...

It doesn't seem all that chunky - but then again, they actually did release Gmail this time last year, didn't they?

boredguru

10+ Year Member



 
Msg#: 28814 posted 12:28 am on Apr 1, 2005 (gmt 0)

That's exactly what struck me when I looked at the date.

Seems like G must have a few PhDs in psychology too.

Lots of questions now.

dazzlindonna

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28814 posted 12:36 am on Apr 1, 2005 (gmt 0)

voltrader,

I have Domains By Proxy domains that have PageRank.

caveman

WebmasterWorld Senior Member caveman us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 28814 posted 12:45 am on Apr 1, 2005 (gmt 0)

> One interesting thing is that they seem to be looking more at the whole site and less at individual pages.

That has been true for longer than people give it credit for, IMO ... but it became far more apparent starting with Allegra. It applies to a surprising number of factors too, I think. ;-)

Rollo

10+ Year Member



 
Msg#: 28814 posted 12:49 am on Apr 1, 2005 (gmt 0)

Anyone with proxied Whois info with Pagerank >0?

No, this couldn't cause a PR 0. You are either too new or have been penalized I would guess.

Given the recent patent output by Google, I wouldn't say it has no effect either... Google *could* be using registry information for any number of things.

boredguru

10+ Year Member



 
Msg#: 28814 posted 12:53 am on Apr 1, 2005 (gmt 0)

One reason for such spikiness may be the addition of a large number of identical anchors from many documents. Another possibility may be the addition of deliberately different anchors from a lot of documents.

SEO is going to be walking the middle with tanks whizzing past on either side from now on.

I thought of playing stocks, but decided that I don't like gambling and there is no real merit in it (except for financial merit).
It's the same here now. You win today, I lose today. I win tomorrow, you lose tomorrow. Ultimately the money we make won't be because of our productivity, but because of our gamble and pure finesse playing the game here, if your idea is not novel.

I pity everyone. Especially the ones entering now with a pocket full of cash and a bucket full of dreams.

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 28814 posted 1:07 am on Apr 1, 2005 (gmt 0)

"One reason for such spikiness may be the addition of a large number of identical anchors from many documents. Another possibility may be the addition of deliberately different anchors from a lot of documents."

I'm glad to see that one spelled out. It may save some poor electrons from dying in those artificially-vary-your-anchor-text threads. Whether Google is any good at this or not, the point is the "spikiness" not either the identicalness or the non-identicalness.

Interesting document, the whole spectrum from sane to claptrap... heh, just like Google.
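The "spikiness" point can be sketched as a simple burst detector on new-anchor counts per period — it never needs to look at the anchor text itself, identical or varied. The threshold factor is an arbitrary assumption of mine:

```python
# Sketch: flag a link profile when the latest period's count of new
# anchors far exceeds the historical baseline, regardless of whether
# the anchor text is identical or deliberately varied.

def is_spiky(new_anchors_per_period: list[int], factor: float = 5.0) -> bool:
    if len(new_anchors_per_period) < 2:
        return False
    *history, latest = new_anchors_per_period
    baseline = sum(history) / len(history)
    return latest > factor * max(baseline, 1.0)

assert is_spiky([3, 4, 2, 3, 40])     # sudden burst of new anchors
assert not is_spiky([3, 4, 2, 3, 5])  # steady, organic-looking growth
```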

graywolf

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28814 posted 1:36 am on Apr 1, 2005 (gmt 0)

For the people who think this is an April Fools' Day joke, I'll throw out the following:

  • Google's April Fools' Day jokes are lame, i.e. PigeonRank and working on the moon.

  • The document comes from a subdomain of the US Patent and Trademark Office, not the kind of folks who typically participate in April Fools' Day jokes, or who would want to set a precedent of participating in one.

BigJay

10+ Year Member

Msg#: 28814 posted 1:53 am on Apr 1, 2005 (gmt 0)

Where did they get Search Engine 125?

Anyway, searchengine125.com is available for any takers.

Question: In what way would they disclose such information to the public?

Is Google Inc mentioned in the patent application? I did not see it in the header.

Vadim

10+ Year Member

Msg#: 28814 posted 1:58 am on Apr 1, 2005 (gmt 0)

It is certainly just a wish list compiled for legal protection. They simply gathered everything that has the slightest chance of being implemented in their real algo.

However, if they really are going to implement *all* of this, they are crazy and will go out of business very quickly.

It will happen not because the criteria are wrong but because there are too many of them. Optimization against too many criteria never works, because too many criteria means that most of them have little relevance.

So we still have to figure out what they really use from this list.

However, there are some invariants.

For Google the relevance of their results should be king or they go out of business. Since, for example, the periodicity of domain name registration has little correlation with the relevance of search results, I believe we may safely continue to register the domain once a year.

For webmasters content is still king and I believe will be king forever. If Google stops finding the good content independently of domain registration frequency, it will lose the competition with other search engines.

Vadim.

graywolf

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 2:16 am on Apr 1, 2005 (gmt 0)

Where did they get Search Engine 125?

It comes from the Exemplary Search Engine section, under subsection 0030 in figure 3. 125 is the designation given to the search engine in the diagram.

fischermx

WebmasterWorld Senior Member 5+ Year Member

Msg#: 28814 posted 2:37 am on Apr 1, 2005 (gmt 0)

Where did they get Search Engine 125?

They have a list of search engines, numbered from 1 to 500; it just happened that Google took number 125.

Anyway, searchengine125.com is available for any takers.

It is already taken by Google, and it is covered by "invisible whois proxy", a new type of domain whois protection that is activated only at the moment you attempt to register the domain. When you are just looking at the whois information it seems available, but it is not.

Question: In what way would they disclose such information to the public?

By typing it into HTML and then uploading it to the US patents web site.

Is Google Inc mentioned in the patent application? I did not see it in the header.

It is there, they just used a white font to hide it; ya know, Google knows all these tricks already.

Kirby

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 3:00 am on Apr 1, 2005 (gmt 0)

For Google the relevance of their result should be a king or they go out of business.

Google is way more than just a mere search engine. They are aggregators of information. The sheer depth and breadth of the info they possess is as valuable as their worth as a mere search engine.

The patent covers a lot of ground. If nothing else, it gives them cover for SK-type issues. With that laundry list, they can justify just about anything in their results.

I wonder how many new backlinks Amazon will get this week as webmasters chase willy-nilly after the holy grail? Will the spikiness hurt Amazon? ;)

steveb

WebmasterWorld Senior Member steveb us a WebmasterWorld Top Contributor of All Time 10+ Year Member

Msg#: 28814 posted 3:33 am on Apr 1, 2005 (gmt 0)

Reading it more closely, I'd say much more sane than claptrap, although as noted the idea of linking to Amazon seems a rather suspect signal of quality.

A few notes (hey GG, get a pencil)...

- I've had links to news pages on my main pages that link with a date as part of the link; my reading is that this is seen as bad (which of course makes no sense logically), but I've removed the dates from the links.

- I shouldn't ignore every garbage link spam request, as "get new links constantly, in ever increasing numbers" seems valued.

- My goodness, sending your links to Gmail users (presumably in a non-spam way) is actually something valued. Sorry Google, but that doesn't follow. (Same with hurling your URL into Google Groups via signatures, etc.)

- Going to run off and renew my domains for ten years now; the weirdo SEO tactic of 2005.

- Nowhere is there even a hint of "we understand that new documents will very often immediately get large numbers of links from related documents" (new pages on domains are often included in sitewide navigation). This omission seems weird.

- Get backlinks gradually, not necessarily naturally.

- If you use third-party contextual ads, use Google's competitors if the contextual ads generate links to areas of commercial spam; use AdSense if you get served ads for mainstream stuff (the revered Amazon...).

0093... I've seen pages that have been hijacked but have reappeared ranking significantly lower and never regained their prior-to-hijack rank; this could explain it, unfortunately. If you lose rank through no fault of your own, it will be harder to get rank back.

0114... heh, coming soon, the Google browser.

0133... anybody sure they know what this means?

<I ain't saying any of these make sense or are right, just the notes I made; I do think most of it made sense for Google to do, but in some cases they seem to be not doing well with some aspects of the implementation, most obviously the blanket nature of the sandbox>

Rugles

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 3:46 am on Apr 1, 2005 (gmt 0)

How is it they are detecting the bookmarks?

Any other ways than the toolbar?

I think I am missing something.

graywolf

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 4:03 am on Apr 1, 2005 (gmt 0)

How is it they are detecting the bookmarks?

Google Desktop Search could read through the folder and "phone home" with the data periodically.

bears5122

10+ Year Member

Msg#: 28814 posted 4:18 am on Apr 1, 2005 (gmt 0)

Wonder what else they are "phoning in" from our computers.

stuntdubl

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 4:34 am on Apr 1, 2005 (gmt 0)

detecting bookmarks

Just a small bit of info to "send home" too, by the same simple method webmasters can use... the number of requests for favicon.ico.
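stuntdubl's favicon idea can be approximated from a server access log. A sketch — the log format shown is assumed (Common Log Format), and favicon requests are only a crude proxy for bookmarking since browsers also fetch the icon for tabs and shortcuts:

```python
import re

# Count requests for /favicon.ico in access-log lines as a rough
# proxy for how often the site is being bookmarked.

def count_favicon_hits(log_lines: list[str]) -> int:
    pattern = re.compile(r'"GET /favicon\.ico')
    return sum(1 for line in log_lines if pattern.search(line))

log = [
    '1.2.3.4 - - [01/Apr/2005] "GET /index.html HTTP/1.1" 200 512',
    '1.2.3.4 - - [01/Apr/2005] "GET /favicon.ico HTTP/1.1" 200 318',
    '5.6.7.8 - - [01/Apr/2005] "GET /favicon.ico HTTP/1.1" 304 0',
]
assert count_favicon_hits(log) == 2
```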
Rugles

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 4:35 am on Apr 1, 2005 (gmt 0)

So we can expect to see a wave of viruses that will be inserting websites into bookmark/favorites folders.

beren

10+ Year Member

Msg#: 28814 posted 4:38 am on Apr 1, 2005 (gmt 0)

Aside from the news about the length of domain registration time and the value of bookmarks, I don't see anything much that's surprising here. It basically reinforces the conventional wisdom around WebmasterWorld: build good sites and you'll get good rankings.

The news that Google may punish sites that have a sudden upswing in links shouldn't be surprising. I would hope that they already do this. How else to discount sites that pay for links? They need to apply this discount more vigorously and more frequently, because sites with paid links are too often showing up on the first page.

shri

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 5:05 am on Apr 1, 2005 (gmt 0)

A couple of points.

-- This is an application. I am surprised at the broad nature of this application. Very surprised...

-- I think (IANAL) a lot of it might be subject to prior art, especially since these techniques are fairly generic and have been discussed ad nauseam in several public areas.

Frankly speaking, I am surprised they forgot to mention that spammy websites are likely not to have a customized favicon, and that common favicons across sites can reveal networks.

yankee

10+ Year Member

Msg#: 28814 posted 5:11 am on Apr 1, 2005 (gmt 0)

Just checked a few serps. Many sites in the top three listings are only registered for one year.

FourDegreez

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 5:51 am on Apr 1, 2005 (gmt 0)

Just imagine the computing power that would be needed to implement some of that stuff...

graywolf

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 5:53 am on Apr 1, 2005 (gmt 0)

Just checked a few serps. Many sites in the top three listings are only registered for one year.

I don't think any one thing like a short registration is going to penalize you. The problem would come in if you have a series of things that look "less than desirable" from the algo's point of view.

blogger

Msg#: 28814 posted 6:14 am on Apr 1, 2005 (gmt 0)

>The news that Google may punish sites that have a sudden upswing in links shouldn't be surprising. I would hope that they already do this.

Many bloggers get such upswings to particular documents, such as via an "Instalanche" that causes other bloggers to link to the recipient of the Instalanche. What Google apparently thinks is natural for "normal" websites is not so natural for bloggers.

digitalghost

WebmasterWorld Senior Member digitalghost us a WebmasterWorld Top Contributor of All Time 10+ Year Member

Msg#: 28814 posted 6:14 am on Apr 1, 2005 (gmt 0)

>> Just imagine the computing power that would be needed to implement some of that stuff.

If you can conceive it, it exists. ;)

graywolf

WebmasterWorld Senior Member 10+ Year Member

Msg#: 28814 posted 6:30 am on Apr 1, 2005 (gmt 0)

Many bloggers get such upswings to particular documents, such as via an "Instalanche" that causes other bloggers to link to the recipient of the Instalanche. What Google apparently thinks is natural for "normal" websites is not so natural for bloggers.

A link bloom accompanied by a search-query bloom for a keyword or news item would be an expected thing. A link bloom with a stagnant amount of search queries would look manipulative.

Yahoo keeps track of those things and publishes them weekly in the Yahoo Buzz report. It's a reasonable assumption that G is also tracking that information.
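graywolf's link-bloom-versus-query-bloom heuristic could be sketched as follows; the growth ratios and both thresholds are my own assumptions, chosen only to make the comparison concrete:

```python
# Sketch: a link bloom matched by a query bloom looks organic (a news
# event); a link bloom against flat query volume looks manipulative.

def looks_manipulative(link_growth: float, query_growth: float,
                       bloom: float = 3.0, flat: float = 1.2) -> bool:
    """Growth figures are ratios of recent volume to baseline volume."""
    return link_growth >= bloom and query_growth <= flat

assert looks_manipulative(link_growth=6.0, query_growth=1.0)      # suspect
assert not looks_manipulative(link_growth=6.0, query_growth=5.0)  # news event
```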

jdMorgan

WebmasterWorld Senior Member jdmorgan us a WebmasterWorld Top Contributor of All Time 10+ Year Member

Msg#: 28814 posted 6:30 am on Apr 1, 2005 (gmt 0)

> by patenting everything imaginable
> I am surprised with the broad nature

The key to a comprehensive patent is to attempt to do almost that -- patent everything.

As a designer, I was appalled by a patent application that I reviewed. In my view, it was so broad that the meaning of what was novel and useful was almost lost in the generalizations.

When I spoke with the head patent attorney on the project, he explained that it's better to "cast a wide net" at first, and then refine the patent and reduce the scope of the claims that are deemed "too broad" and are rejected by the patent office.

This is a patent application, not an issued patent. So don't be surprised if the claims are very wide and/or vague. The patents "game" is to patent what you invent, but also what you haven't yet invented -- by casting the widest net that the patent reviewers will allow. It's bad to have a patent that is too narrow, because it is easy for a competitor to work around it. And one that is too broad is too easy to challenge. I perceived that the patent reviewers took both into consideration; sometimes when they reject a claim as too broad, they are actually doing you a favor.

Such applications will generally go through two to six revisions, at which point everyone involved gets so tired of re-reading them that they will either converge on acceptable claims, or abandon the effort.

It should be interesting to see how this goes, and what the final accepted version looks like.

Jim

is300

10+ Year Member

Msg#: 28814 posted 6:48 am on Apr 1, 2005 (gmt 0)

Looks like we can finally replace Brett's article on how to have success with Google in 12 months.

[webmasterworld.com...]

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved