Forum Moderators: open
I am formatting this suggestion using your preferred method:
Fast and to the Point
Please consider this:
You have Google Toolbars collecting data constantly.
We know that you now know a lot about us. (WebmasterWorld users: see the Alexa.com search results and click on the logo for each site for "details about each site"; this is what Google ALREADY knows about your site!)
Please, we beg you: USE THIS INFORMATION NOW!
You KNOW how LONG people stay on a site, how many pages they view, and where they go next...
Please factor this "stickiness" into your algo!
Think about it:
IF users click on Site A and stay 5 minutes, but click on Site B and stay only 5 seconds, shouldn't that tell you that Site A is a better match?
Wouldn't it make sense to add PR to sites that users find useful?
Wouldn't that cut down on spamming that leads users to useless pages that are simply optimized well?
Wouldn't that encourage webmasters and designers to create content that interests users enough to stick around?
Wouldn't that give new but BETTER sites that have premiered recently at least a fighting chance to claw their way up the SERPs?
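To make the suggestion concrete, here is a minimal sketch of how a capped dwell-time signal could be blended into an existing relevance score. Everything here is invented for illustration (the function names, the cap, the weights); nothing reflects Google's actual algo.

```python
# Hypothetical sketch: blending a dwell-time signal into a relevance score.
# All names and weights are assumptions, not Google's real method.

def stickiness_boost(avg_dwell_seconds: float, cap: float = 300.0) -> float:
    """Map average dwell time onto a 0..1 boost, capped so that very long
    sessions (e.g. an abandoned open tab) can't dominate."""
    return min(avg_dwell_seconds, cap) / cap

def combined_score(base_relevance: float, avg_dwell_seconds: float,
                   stickiness_weight: float = 0.2) -> float:
    """Blend the existing relevance score with the dwell signal. The small
    weight keeps dwell time a minor factor, matching the caveat later in
    this thread that stickiness shouldn't be ALL there is."""
    return ((1 - stickiness_weight) * base_relevance
            + stickiness_weight * stickiness_boost(avg_dwell_seconds))

# Site A: users stay 5 minutes; Site B: users stay 5 seconds.
print(combined_score(0.8, 300))  # Site A gets the full boost (0.84)
print(combined_score(0.8, 5))    # Site B gets almost none (~0.64)
```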
Google-
It's time for the next step! Show the stickiest sites towards the top of the results pages!
You are, and have been, miles ahead of everyone else.
But, isn't it time now for your next quantum leap?
We'll be waiting......
The reverse argument: an efficient site serving information will only need a single page to give surfers exactly what they are looking for. Google loves these sites; Google itself serves only 6.8 pages per user. Perhaps "sticky" sites are actually hard to navigate and require multiple clicks to get where you're going.
No.
Joe searches for "salsa" wanting to see a site about Mexican salsas. Jane searches for "salsa" looking for a site about the dance. Joe enters a site that is about the dance and clicks his back button. Does that mean the site isn't useful for Jane?
I see certain keywords in your post, such as 'stickiness' and 'products', so I'll have to assume that perhaps you have a commercial site.
If not, my error.
At any rate, I don't think stickiness should be a part of ranking at all. One, stickiness can easily be falsified. Two, stickiness in no way speaks to the quality of content, only to the time the server spent serving pages.
Pendanticist.
>>An efficient site serving information will only need a single page to give surfers exactly what they are looking for. Google loves these sites
Agreed, though I would say it can run up to maybe two or three pages.
How many visitors enter through your index page?
For my site, on a typical day a third of my 2,500 pages are used as entry pages.
>>Google itself serves only 6.8 pages per user
Interesting, BGumble; can you tell me where this data comes from? Do you mean a typical Google visitor checks out 6.8 pages per user session?
Perfect example of why stickiness SHOULD play in to the equation.
The Google listing for the Salsa Hot Sauce site will not be clicked on by users looking for the dance; the site description and the snippet of the site content will tell them that this is the site about the condiment.
But if the site is a USELESS UGLY site about the dance that has NO CONTENT worth considering, and 70% of the users click the back button INSTANTLY when they see the site, doesn't that reveal that the site in question doesn't live up to what the user was expecting when he/she clicked?
ALL OF US have "dogs" in SERPs for our targeted search terms.
Some of these sites were created in the dinosaur age of the net.
Some of the sites we compete with look like they were done as part of a sixth-grade computer class, but they are optimized, well linked, and place so high that a new competitor feels overwhelmed!
These sites are taking up valuable space that might be replaced by a better user-friendly result.
RE this suggestion:
It's true that Google currently counts links as "votes" from websites.
If these "votes" are part of the algo, why not measure actual Google users' responses:
whether or not they find what they want when they get to the site?
The sites that meet expectations should begin to climb. The "dogs" should be put to sleep.
Google has mastered the art of delivering the best search results. It's what keeps us coming back. Google user reaction is something else to ADD to their algo.
I am NOT suggesting that stickiness is ALL there is -but it SHOULD BE part of the Google algo!
>>How many visitors enter through your index page?
Actually, only 20% come through my index page now. I have about 70,000 forum threads archived in G that have become inviting entry points for many different related topics, since they are decently SEO'd.
>>Do you mean a typical Google visitor checks out 6.8 pages per user session?
That is straight from Alexa's "more information" page:
[alexa.com...]
Not exactly. There are plenty of sites listed in the results that don't display enough information in the description to tell what the site is about.
And not everyone can design their site just so Google can pull enough information into the description for it to be on target.
>>>>place so high that a new competitor feels overwhelmed!
OK. Then what happens when there are 20 sites deemed "sticky" that have so much click-thru data on them they can't be budged from the top spots?
There are a lot of ugly sites out there that still offer great information. Just because ugly sites "in your eyes" might be doing better in the results doesn't mean they are less worthy than yours.
An option like what you are suggesting has a place in a search engine, but only if they give users the option to personalize their results to see it that way.
When comparing similar sites, the one where users stay longer and click more pages is LIKELY to be the more popular site for a reason. Not always true as we can see by many of the reverse arguments presented here.
Google could definitely be considering some Google Toolbar user statistics:
Published 9/5/02
"Methods and apparatus for employing usage statistics in document retrieval" [appft1.uspto.gov]
"Methods and apparatus consistent with the invention provide improved organization of documents responsive to a search query. In one embodiment, a search query is received and a list of responsive documents is identified. The responsive documents are organized based in whole or in part on usage statistics."
from this thread: [webmasterworld.com...]
although "time spent on a site" seems a bit way out.
I was searching for the altitude of the summit of Mount Washington. I spent a bunch of time on the official site looking for that information. If they had it, it was not at all obvious.
I went to the next page in the SERPs. The answer was in the heading. I was on the site less than a second before my eyes picked out the answer I needed.
Which site was more useful?
The official site was prettier too.
I never spend more than about a minute on CIA world factbook pages, but I consider them to be very important.
I understand your point; I spend a lot of time on sites that I find really interesting. I just don't think that it is necessarily a good criterion for judging the usefulness of a page.
Right. Doing this would reward an inefficient site over an efficient one.
Kind of reminds me of some other recent ideas that are credible but which would represent a major shift in direction for Google...
- parsing Javascript
- passing PR variably based on some kind of relevance criteria
- monitoring clickthroughs of search results
- porn-free SERPs
- minimizing/eliminating PR
- allowing "voting" for preferred websites
- incorporating gross traffic figures in the algorithm
- bonus points for valid HTML/XML
- etc. etc. etc.
My sessions usually average ~2.4 minutes per page. Does that make me sticky or not? I certainly am not generating page after page of click-throughs.
Using your method, wouldn't people who opened a page and then went to lunch radically skew the results? Or would my site be classified as a "lunch time" page view, and then punished because I have a 4,000-word biographical summary of xyz celebrity?
I think it would be too hard to track the intent of the user. What about a site that purposely forced you to click 6 links to display the target content? Would that generate stickiness for the algo?
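The lunch-break skew is easy to demonstrate with a toy example (all numbers made up): one abandoned tab wrecks a mean dwell time, while a median barely moves.

```python
# Sketch of the "went to lunch" problem: a single abandoned tab dominates
# the mean dwell time; the median stays near real behaviour.
import statistics

dwells = [40, 55, 35, 60, 45, 3600]  # five real reads + one lunch break
print(statistics.mean(dwells))    # ~639 s: the page looks incredibly "sticky"
print(statistics.median(dwells))  # 50 s: much closer to real behaviour
# A median or capped statistic would be one way to blunt this kind of skew.
```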
I remember that one of the now-defunct search engines/directories (can't remember the name; the one owned by NBC) had a click-through counter system for popularity.
Heh. "Hey kids... want to earn a dollar? Type these different words into this search engine for an hour or so, and every time you see www.penguinwidgetsales.com, click it, and only it."
Blam, instant top listings in 24 hours.
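For what it's worth, an engine could at least blunt the crudest version of that trick. Here is a hypothetical first-pass filter; the idea of keying on a client id and the threshold are pure assumptions, and nothing is known about how that engine actually policed clicks.

```python
# Sketch of why naive click counting is gameable, plus a first-pass filter:
# count each client's clicks on a URL at most a few times, so one kid
# hammering penguinwidgetsales.com for an hour registers as a handful
# of votes instead of hundreds. Thresholds are invented.
from collections import Counter

def filtered_click_count(clicks, max_per_client=3):
    """clicks: list of (client_id, clicked_url) pairs."""
    per_pair = Counter((client, url) for client, url in clicks)
    totals = Counter()
    for (client, url), n in per_pair.items():
        totals[url] += min(n, max_per_client)  # cap repeat votes per client
    return totals

# One client clicking 200 times contributes only 3 counted votes.
spam = [("kid1", "www.penguinwidgetsales.com")] * 200
print(filtered_click_count(spam))
```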
That's the kind of info the Toolbar is capturing that COULD BE USED in their ALGO!
In your example:
User enters "widgets"
Clicks to your site. Finds what he/she needs.
Done!
Continues to another site that is unrelated.
Or
Goes back to Google.
Enters search totally unrelated.
Google scores the site well.
It met the user's needs.
The user ended the query on that search term after visiting your site.
High score for you! You begin to climb...
For your competitor,
Enters "widgets".
Gets to your competitor's site.
Has the "yuck" reaction, clicks back button in 3 seconds.
Clicks on ANOTHER site from SAME SERP.
Toolbar reads this sequence to say that the search result was not what the user wanted, and your competition starts slipping down...
That makes sense, doesn't it? Best pages for the query are at the top of the results!
None of this is that complicated when compared with the algo now, is it?
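The scenario above boils down to a small session classifier. Here is a rough sketch; the event names, the 5-second threshold, and the scores are all invented for illustration, since the Toolbar's real event stream is unknown.

```python
# Hypothetical classifier for the two walkthroughs above: a quick bounce
# back to the same SERP followed by a click on a rival result is scored
# negative; ending the query on a site is scored positive.

def score_visit(dwell_seconds, returned_to_serp, clicked_another_result):
    """Positive: user never came back to the SERP (query satisfied here).
    Negative: fast bounce back to the same SERP and on to another result."""
    if returned_to_serp and clicked_another_result and dwell_seconds < 5:
        return -1.0   # the "yuck" reaction: result likely missed the mark
    if not returned_to_serp:
        return +1.0   # query ended here; strong satisfaction signal
    return 0.0        # ambiguous: could be efficiency, could be mild interest

# The widget searcher who found what they needed and moved on:
print(score_visit(90, returned_to_serp=False, clicked_another_result=False))  # +1.0
# The competitor's visitor who bounced in 3 seconds:
print(score_visit(3, returned_to_serp=True, clicked_another_result=True))     # -1.0
```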
SEO is great, and we use it, extensively. That's why we're here.
But shouldn't there be a "people's choice" factor in play? Aren't the best sellers always at the front of the bookstore?
Then they go to Site B, where they stay for 20 seconds because the "widget info" was plain and simple to find. It was the first thing they read, so they had no other reason to stay on the site.
According to the original poster's suggestion, Site A should now have a better ranking than Site B.
With this suggestion, the site whose information is hard to find would rank better than a site whose information is quick and simple to get to. One of many reasons that I am happy with whatever Google is doing now. No need for changes, at least not yet.
Stickiness: site content and organisation so fascinating that users stay longer than previously intended. Result: user buys more / other / more expensive things than intended. Site works well so far.
Slipperiness: site structure and content so badly organised that users stay longer than previously intended (BTW: better term available?). Result: user clicks around to find what he's looking for and gives up sooner or later.
How to tell the difference?
More concise: how to tell without asking users?
You can't!
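A tiny illustration of why: from the logged signal alone, the two sessions below are indistinguishable. The labels are ground truth that only the user could supply; the numbers are made up.

```python
# Two sessions with identical observable signals (pages viewed, dwell time)
# produced by opposite experiences. No algorithm seeing only these numbers
# can separate sticky from slippery.
sessions = [
    {"pages": 12, "dwell_seconds": 480, "truth": "fascinated (sticky)"},
    {"pages": 12, "dwell_seconds": 480, "truth": "lost in bad navigation (slippery)"},
]
for s in sessions:
    print(s["pages"], "pages,", s["dwell_seconds"], "s ->", s["truth"])
```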
The only people who even know what the Toolbar is are people in the industry.
So, if Google were to use the tool bar to gather info for the algo, then the algo would now be influenced by SEO's rather than normal searchers. Thus, Google would quickly lose its great appeal.
I think they are doing a great job on the algo, other than not being able to deal with the tricky spam issue.
Your example, and ALL of these scenarios above can be monitored and translated by the Toolbar.
The Toolbar watches behaviour, and just like we could "stand over someone's shoulder" and watch users of our websites and learn from our watching, the Toolbar can monitor the patterns of users and attribute scores to their activities.
Everything mentioned above will work beautifully when the brilliant mathematicians that have created Google's algo work their magic on this concept.
All of this is coming, without question. The earlier version of the Toolbar had a SMILE/FROWN icon so users could vote... but I'm betting they're way beyond that now, and they've been harvesting all kinds of useful navigation patterns for all of our sites. They know our users' patterns.
I place well on Google, too (thanks to WebmasterWorld and my lurking here for a while) but I sure would like to include Customer Input into the equation.
I'm simply asking that those of us who dare to face the next step in SEO ask Google to move forward NOW!
Until you can separate sites displaying remarkable information efficiency from those with worthless listings (both receiving 1 page view per user), you won't be able to use the SERP clicks in the way you imagine. You cannot penalize sites that have the greatest efficiency of information. Just because I went back to the Google results and kept browsing does not mean I was unhappy with the one page of information; it means I still want to read more opinions.
Does anyone have the stats for the penetration of the Google Toolbar? And of those people, what percent have the PR display enabled? IIRC, that is off by default. There is reason to suspect that people with the Toolbar and PR display enabled may not be a representative sample of Google users. However, I think the bigger problem is that this idea favors inefficient sites. If all you know is that a person spent a short time on a site, you can't determine from that whether they thought it sucked or got exactly what they wanted quickly.
Google can determine, as an example, that when a user clicks off a page and back to the SERP within 5 seconds of page load, and then clicks on another SERP result, there's an 85% possibility that the user was not satisfied with the page.
They weight the Negative Score accordingly.
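As a sketch of how that heuristic might translate into a penalty: the 85% figure and the 5-second window are from the example above; everything else, including the penalty scale, is an assumption.

```python
# Sketch of the bounce heuristic above. A sub-5-second visit followed by a
# click on another result from the same SERP contributes an expected
# penalty weighted by the assumed 85% probability of dissatisfaction.

BOUNCE_WINDOW_SECONDS = 5
P_DISSATISFIED_GIVEN_BOUNCE = 0.85  # the example's figure, not a known value

def negative_score(dwell_seconds, clicked_another_result, penalty=1.0):
    """Return the expected (negative) contribution of one visit."""
    if dwell_seconds <= BOUNCE_WINDOW_SECONDS and clicked_another_result:
        return -penalty * P_DISSATISFIED_GIVEN_BOUNCE
    return 0.0

print(negative_score(3, clicked_another_result=True))    # -0.85
print(negative_score(90, clicked_another_result=False))  # 0.0
```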
None of this is easy. But Google has the experts to do this!