Forum Moderators: open
Or am I missing your point here...?
An old, old thread on bail time [webmasterworld.com], but it still makes a few points.
Back OT, the toolbar would give an excellent measure of how much time visitors spend on a page. Of course we can all be trusted. ;)
Don't count the seconds spent AT the site; I agree it could be easily abused.
Exactly. I guess the Toolbar's best source of intelligence would be repeat-visit detection.
Now this could be _very_ handy when it comes to improving Google in the commercial search space.
For a website providing informational content there is only a small chance of you returning to a page - you've discovered the fact you were looking for and it is now committed to memory.
But for a website providing a product or service...
For a website providing informational content there is only a small chance of you returning to a page - you've discovered the fact you were looking for and it is now committed to memory.
It would work well if it counted the time NOT spent at a particular site - just reverse the calculation.
If the user returns in less than 5 minutes, check the elapsed time:
If clicked back after 10 seconds = no relevancy
If clicked back after 60 seconds = slight relevancy
If clicked back after 2 minutes = medium relevancy, some interest.
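The thresholds above could be sketched as a simple lookup. A minimal sketch, assuming the toolbar could report dwell time in seconds before the Back click; the function name and the exact cutoffs are illustrative, not anything Google actually does:

```python
def relevancy_from_dwell(seconds_before_back):
    """Map time-on-page before clicking Back to a relevancy label.

    Thresholds follow the ones suggested in the post above;
    purely illustrative, not a real ranking signal.
    """
    if seconds_before_back <= 10:
        return "no relevancy"
    if seconds_before_back <= 60:
        return "slight relevancy"
    if seconds_before_back <= 120:
        return "medium relevancy"
    # Stayed longer than 2 minutes (or never came back at all)
    return "interested"
```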
What about those of us that set clicked results to open in a new window?
What about my directory whose entire purpose is to land you on the right page and get you off to one of my client's sites?
For a website providing informational content there is only a small chance of you returning to a page - you've discovered the fact you were looking for and it is now committed to memory.
BigDave..... Do you spend 10 minutes on a page just to find out its relevance? I can usually tell within 30 seconds at most whether a page is relevant or not. If it's not, I'm straight back to Google's index.
As for people trying to spam, their site will most likely be rejected by the visitor within seconds, and if the same pattern is picked up by Google, i.e. 90% rejections in the first 30 seconds, then the site can be given a lower PR.
I also often return to google after I found some information that I am interested in, to search some more.
I will also do a search and open several of the links in separate tabs. I leave the ones that are the most interesting open and close the others.
I do most of my searches while I am involved in other things. I might be on a totally irrelevant page, when I am interrupted by real life for 10 minutes, before I can get back to what I was doing.
Oh yeah, turn on your network analyzer and check your traffic when you hit your back button to go back to your Google page. Do you see an additional request to fetch the page again? I didn't think so. Doing this would put a HUGE additional strain on their servers if they had to serve up the SERPs each time you hit back, and they would also have to run all the clicks through a redirect script.
I'm not saying that it would not turn up some possibly useful information. But there are a LOT of problems that you haven't addressed with this simple solution. If google is going to consider using information like this, they will consider these things, and I think they will come to the conclusion that there are much easier and cheaper ways to improve their results.
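To make the "redirect script" point concrete, here is a minimal sketch of how click tracking through a redirect works: result links point at a tracker URL that logs the click, then sends the browser on to the real destination. All names (`/click`, `tracked_link`, `handle_click`) are hypothetical; this is not Google's actual mechanism, just an illustration of the extra work per click.

```python
from urllib.parse import urlencode, parse_qs, urlparse

def tracked_link(result_url, position):
    """Build a SERP link that routes through a click-logging redirect."""
    return "/click?" + urlencode({"url": result_url, "pos": position})

def handle_click(request_path, log):
    """Record the click, then return the target URL the server
    would redirect (HTTP 302) the browser to."""
    params = parse_qs(urlparse(request_path).query)
    log.append((params["url"][0], int(params["pos"][0])))
    return params["url"][0]

log = []
link = tracked_link("http://example.com/page", 1)
target = handle_click(link, log)
# target is the original result URL; log now records (url, position)
```

Every click costs an extra round trip through the tracker, which is exactly the server strain BigDave is describing.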
when you hit your back button to go back to your google page. Do you see that additional request to get the page again?
It could detect it in the next link visited.
I'm sure some mad scientist can work out the average time spent on an irrelevant page.
When you're given the index, it could send info like:
was index no. 1 the last page visited, by URL?
if not, the time between the no. 1 link and the no. 2 link.
was index no. 2 the last page visited, by URL?
and so on.
A bit more public and democratic than "you can do this, but you can't do that because that guy does it wrong".
I also often return to google after I found some information that I am interested in, to search some more. I will also do a search and open several of the links in separate tabs. I leave the ones that are the most interesting open and close the others.
Too many 'I's, BigDave; it's about averages, not individuals.
Matt's answer: Google hasn't used clickthrough data, and there are no plans to use it in the future.
Now, this doesn't mean they don't collect and aggregate the Google Toolbar data for purposes such as figuring out how many people tend to click on a "position #1" link vs. a "position #2" link and that sort of thing. I believe Matt did say something to that effect.
However, as far as factoring it in for determining rankings and results: no, they don't.
...According to Mr. Cutts...
how many people tend to click on a "position #1" link vs. a "position #2"
The Ford Foundation will be irrelevant to most people searching for Fords, but there are a lot of searchers who are quite interested in the Ford Foundation.
There may be some merit to what you are claiming, but it will be very difficult and computationally intensive to implement. There are other ways to get good results that will be much easier.
The reason I use "I" a lot is quite simple: "I" am only able to speak for "me". I can tell you that I do get feedback from many other users, but I have no right to speak for them in this. I do know of many neophyte users who have learned to launch new windows when they do searches as the best way of dealing with them, but I was personally speaking for me and how your plan breaks down for me.
Google will know if you clicked on a link; they can also detect if you returned to Google to look for another.
Good for your site, no direct return to Google. But if the client saw your link and thought it was irrelevant, he/she would again return to Google, and I'm sure they could detect this.