Usage Patterns from Toolbar Data
pros and cons
sean
10+ Year Member
Msg#: 706 posted 1:44 pm on Aug 30, 2004 (gmt 0)

What do people see as the limitations or challenges of mining toolbar data for new signals of quality?

 

sean
10+ Year Member
Msg#: 706 posted 2:14 pm on Aug 30, 2004 (gmt 0)

The first issue people raise is that measuring unique visitors would make it very difficult for new sites to rise through the ranks.

Not necessarily. Staleness is a major concern, as was observed with DirectHit. However, there would be many different ways to measure usage data, and most of these measurements would have one or more counter-balances.

For example, total unique visitors would be an obvious measure, but why not also the growth rate of unique visitors? Then it is only a matter of tweaking the knobs back and forth until you get an ideal mix of old and new.
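
To make the knob-tweaking concrete, here is a minimal sketch in Python of one possible blend of total unique visitors and their growth rate. The weights, the log scaling and the function name are my own assumptions for illustration - nothing any search engine has confirmed.

import math

def usage_score(unique_visitors, prior_unique_visitors,
                w_total=0.7, w_growth=0.3):
    """Blend absolute popularity with recent growth.

    unique_visitors       -- unique toolbar visitors in the current period
    prior_unique_visitors -- unique toolbar visitors in the previous period
    w_total, w_growth     -- the tunable "knobs"
    """
    # Log-scale the raw count so huge sites don't swamp everything else.
    total_component = math.log1p(unique_visitors)

    # Growth rewards new or newly popular sites; +1 avoids division by zero.
    growth_rate = unique_visitors / (prior_unique_visitors + 1)
    growth_component = math.log1p(growth_rate)

    return w_total * total_component + w_growth * growth_component

# Made-up numbers: an established-but-flat site vs. a small fast-growing one.
print(usage_score(1000000, 990000))  # big, no growth
print(usage_score(50000, 2000))      # small, growing fast

Turning w_growth up favours the newcomers; turning w_total up favours the established sites - that is the old/new mix being tweaked.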

more later...

keeper
10+ Year Member
Msg#: 706 posted 9:23 am on Aug 31, 2004 (gmt 0)

Here are some initial benefits and risks that spring to mind:

Benefits:
1) Difficult to fake actual traffic. Even click bots would struggle if search engines sourced traffic directly from a toolbar with a private key or something similar (security ain't my area - sorry; see the first sketch after this list).
2) Data is still available on a page-by-page basis.
3) I like your idea of growth rate too. A site that gets press or other types of public exposure automatically gets a boost in relevance for its on-page factors. Once the 'buzz' recedes, so does the boost.
4) Ability to treat this data as a separate 'knob', or to multiply it with link pop to add sanity to the link-pop score (see the second sketch after this list).

Risks:
1) Quality sites that do not get a lot of search traffic, do not advertise, and/or do not have a large existing user base are disadvantaged to the same degree that their "competition" benefits from its own traffic levels.
One could argue this may lead to popular sites perpetually dominating the SERPs.
2) Toolbar data can be argued to have a skewed sample. Is the toolbar user a typical user? How will the use of this data affect users who do not have toolbars installed? Are there large differences in what they expect from the SERPs?
3) This leads on to the familiar Alexa-type arguments, but in this case the data would be only one of a hundred algo factors, so by and large it should still be beneficial to relevance.
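
On benefit 1, a minimal sketch of the "private key" idea, under my own assumptions (no search engine has documented doing this): the toolbar signs each pageview report with a keyed hash, and the server discards reports that are stale or fail verification. Key distribution and reverse engineering of the client are the real problems and are ignored here.

import hashlib
import hmac
import time

SECRET_KEY = b"example-toolbar-key"  # hypothetical shared secret in the client

def sign_report(client_id, url, timestamp):
    # Toolbar side: sign a pageview report.
    message = ("%s|%s|%d" % (client_id, url, timestamp)).encode()
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify_report(client_id, url, timestamp, signature, max_age=300):
    # Server side: reject stale reports and bad signatures.
    if time.time() - timestamp > max_age:
        return False
    expected = sign_report(client_id, url, timestamp)
    return hmac.compare_digest(expected, signature)

ts = int(time.time())
sig = sign_report("client-42", "http://example.com/page", ts)
print(verify_report("client-42", "http://example.com/page", ts, sig))  # True

A bot that controls a valid client can still replay its own browsing, so this only raises the cost of faking traffic rather than eliminating it.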
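
And on benefit 4, the "separate knob or multiplier" idea could look something like this - again just a sketch with made-up weights, where link_pop stands in for whatever link-based score the engine already computes, and both signals are assumed normalised to 0..1.

def combined_score(link_pop, usage, mode="knob", usage_weight=0.2):
    if mode == "knob":
        # Usage is one more weighted factor alongside link popularity.
        return (1 - usage_weight) * link_pop + usage_weight * usage
    if mode == "multiplier":
        # Usage sanity-checks link pop: heavily linked pages that nobody
        # actually visits get damped instead of boosted.
        return link_pop * (0.5 + usage)
    raise ValueError("mode must be 'knob' or 'multiplier'")

print(combined_score(0.9, 0.1, mode="multiplier"))  # well linked, little traffic
print(combined_score(0.4, 0.8, mode="multiplier"))  # modest links, heavy use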

I would bet the farm that Yahoo and Google are already trying out stuff like this.

nalin
10+ Year Member
Msg#: 706 posted 2:31 pm on Sep 3, 2004 (gmt 0)

...this may lead to popular sites perpetually dominating the SERPs.

A lack of exactly this behavior post-Florida was what so riled webmasters as a community ("...and my quality, relevant site, which had topped the SERPs for years..."). Yes, toolbar data makes the market very difficult to break into, but it also ensures that if your site really is the best game in town - and you sit close enough to the top that visitors actually reach you - then statistically they will pick you more often, spend more time on your site, etc., and your rankings will rise accordingly.

Your risk #2 is an extremely valid point - the demographics of toolbar users are not representative of internet users in general (for instance, there is no version of the toolbar available to the 10% of us who prefer Linux or Mac and their associated browsers). Alexa's top 100 [alexa.com] makes it painfully obvious that their stats favor those who will install anything - among the top 20 are several suspect sites, including, for instance, Gator (aka Claria, a spyware advertiser) and an IP address that hosts no content. The last thing the internet needs is spyware at number one for, say, "desktop weather" or "wallet plugin" or anything remotely similar.

That said, the thing that could differentiate Google from Alexa is factoring in the quality of traffic rather than quantity alone. When the top Alexa site averages 16 page views per visitor and the top spyware averages 1.8 (both stats gathered from Alexa), you're able to quickly gauge which one is providing a service and which one a disservice.
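
That quality-vs-quantity gap is easy to surface: divide page views by visitors and set a floor. A toy sketch using only the two averages quoted above; the site labels and the threshold are made up for illustration.

# Average page views per visitor as a crude engagement signal.
sites = {
    "top_alexa_site": 16.0,  # page views per visitor, figure quoted above
    "top_spyware": 1.8,      # page views per visitor, figure quoted above
}

ENGAGEMENT_FLOOR = 3.0  # arbitrary cut-off for "looks like a real service"

for name, views_per_visitor in sites.items():
    label = "service" if views_per_visitor >= ENGAGEMENT_FLOOR else "likely disservice"
    print("%s: %.1f page views/visitor -> %s" % (name, views_per_visitor, label))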
