
Google SEO News and Discussion Forum

    
Google's patent article plus programming myth
kalthoff
Msg#: 32034 posted 6:52 pm on Nov 14, 2005 (gmt 0)

I know I'm a little late, but I have some thoughts about the Google patent article from earlier this year that have really been bugging me.

1. The patent article suggests that pages ranking at the top are regarded as more popular than lower-ranked pages because of their increased traffic. Wouldn't that be a reversed vicious circle (a positive feedback loop, so to speak), since highly ranked pages naturally get more traffic than others?

In this context, the possibility of Google analysing the logfiles of all or some pages is mentioned. Wouldn't that overwhelm Google's capacity if they started analysing the logfiles of millions of webpages?

Wouldn't that also be easy to manipulate? I'm sure webmasters won't have any trouble programming tools that generate tons of traffic.

2. Concerning the aging delay discussions: I have a four-week-old webpage that's on the first page for a keyword with 100,000 search results. Wouldn't that contradict the aging delay theory (new sites not getting good results for competitive keywords for the first 9-12 months)?

3. An SEO myth I keep hearing more and more: content in tables might not be so perfect for Google anymore, while sites built with style sheets supposedly come across as a bit more relevant (because you can still make the page look good without having to use tables). Is there anything to that?

Looking forward to some words of wisdom on these thoughts.

 

AcsCh
Msg#: 32034 posted 10:11 pm on Nov 14, 2005 (gmt 0)

1) No, it should be easy for Google to "know" how much traffic a #1 ranking should get compared to a #2 or #3 or... If a page is at #1 but gets little traffic (for a #1), it's probably a "bad" site for this keyword. (Logfiles? No need, they have the Toolbar to know EVERYTHING ;) )

2) Four weeks could just be the freshie bonus. Wait another two weeks. ;)

3) The more text in comparison to HTML tags, the better. CSS gives cleaner, more logically structured code than tables do.

Nikke
Msg#: 32034 posted 11:09 pm on Nov 14, 2005 (gmt 0)

And now they have Urchin, or rather Google Analytics.

I, for one, kind of hope that Google doesn't have the horsepower to do a collective analysis of all this data.

europeforvisitors
Msg#: 32034 posted 11:40 pm on Nov 14, 2005 (gmt 0)

An SEO myth I keep hearing more and more: content in tables might not be so perfect for Google anymore, while sites built with style sheets supposedly come across as a bit more relevant (because you can still make the page look good without having to use tables).

That does sound like a myth, if only because search engines index content, not code.

SFReader
Msg#: 32034 posted 6:00 pm on Nov 17, 2005 (gmt 0)

Nikke,

Matt Cutts said they are not using the data at all, and he seemed honest and sincere.

trimmer80
Msg#: 32034 posted 3:41 am on Nov 18, 2005 (gmt 0)

An SEO myth I keep hearing more and more: content in tables might not be so perfect for Google anymore, while sites built with style sheets supposedly come across as a bit more relevant (because you can still make the page look good without having to use tables).

Search engines strip out all the HTML tags. They give different weightings to different tags, but in the end it is all text, so in that respect this is a myth... BUT using CSS instead of tables gives better control over the order and structure of the content, which is an important benefit imho.

For example, a page that uses tables can lose meaning if it is structured incorrectly, e.g.:

<table>
  <tr><td>product 1 title</td><td>product 2 title</td></tr>
  <tr><td>product 1 description</td><td>product 2 description</td></tr>
</table>

After the HTML is removed, the search engine reads this as

"product 1 title product 2 title product 1 description product 2 description"

Thus the meanings of product 1 and product 2 get mixed together.

However, with CSS you have better control.

I know that you can get around these issues if you lay out the tables carefully, but there is a lot to be said for the simplicity of CSS (see the sketch below).
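As a rough illustration (the markup and class name here are just one example I made up, not the only way to do it), the same two products can be floated side by side with CSS while each title stays next to its own description in the source order:

<style>
  /* each product box floats beside the previous one */
  .product { float: left; width: 45%; }
</style>

<div class="product">
  <h2>product 1 title</h2>
  <p>product 1 description</p>
</div>

<div class="product">
  <h2>product 2 title</h2>
  <p>product 2 description</p>
</div>

With the tags stripped, that reads as "product 1 title product 1 description product 2 title product 2 description", so each product's title and description stay together.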

newkid2005
Msg#: 32034 posted 3:51 am on Nov 18, 2005 (gmt 0)

Are you designing your website for real users or for search engines?

This is the only question to care about!

Make a website worth visiting and the search engines will follow!

aeiouy
Msg#: 32034 posted 4:19 am on Nov 18, 2005 (gmt 0)

Yeah, as someone mentioned, it could all be relative. Plus they could discount SERPs traffic, for example, and just measure traffic from other sources, as well as the quality of that traffic.

I think such measurements would/will provide for much better ratings over time.

Marcia
Msg#: 32034 posted 5:43 am on Nov 18, 2005 (gmt 0)

>>Make a website worth visiting and the search engines will follow!

a.k.a "Just build it and they will come." ;)

Marcia
Msg#: 32034 posted 6:05 am on Nov 18, 2005 (gmt 0)

kalthoff, is this the patent you're referring to, the one linked to in this thread?

Google Patent: Using Usage Statistics in Search [webmasterworld.com]

If so, notice Import Export's comments at the end of that thread about tracking and A/B testing.

Added:

Concerning the aging delay discussions: I have a four-week-old webpage that's on the first page for a keyword with 100,000 search results. Wouldn't that contradict the aging delay theory (new sites not getting good results for competitive keywords for the first 9-12 months)?

Aside from the possibility that it's the new-site honeymoon, I've got a theory of my own that usage & traffic may have something to do with why some sites avoid the sandbox effect altogether.

Going by what I can gather from posts by members here, those sites seem to have been launched by webmasters with existing, older sites that are very heavily trafficked; if they link to the new ones from those, it could start the new sites out with heavy traffic from the beginning.

[edited by: Marcia at 6:13 am (utc) on Nov. 18, 2005]

ronburk
Msg#: 32034 posted 6:09 am on Nov 18, 2005 (gmt 0)

Wouldn't that be a reversed vicious circle

Only if Google engineers are really dense. If not, they could arrange to do things like only inspecting traffic for SERPs on page 2 or lower to see if those results merit a boost. That would have the potential effect of pushing results off page 1 that might be offering inferior answers to searchers.

I have multiple times seen a URL go to the "plausibly viewable range" (e.g., pages 3-10) and then directly pop from there to page 1. My belief is that these tended to be examples where my URL was a pearl of detailed, complete information swimming in a sea of not-too-helpful-to-users competing pages. I suspect that when Google sees a statistically significant number of searchers going all the way to, say, page 5 to find the answer they need, some ranking significance is attached to that.

kalthoff
Msg#: 32034 posted 8:23 am on Nov 23, 2005 (gmt 0)

Marcia, the statement you're referring to is pretty close to the ideas in the patent article I read. I was referring to the Google document itself [appft1.uspto.gov...]
