However, I have noticed that my position in Google has always dropped after leaving the adwords program, and always come back after rejoining it, with about a one- to two-week lag.
I don't think google would do this on purpose, but could there be a bug, or are the search engine's dynamics so complicated that adwords gives an unintentional boost to the free listings on Google?
The last episode was when we dropped our adwords during our slow period from early December to mid January. We dropped out of Google about a week later, and now we are back, about a week after starting our adwords back up.
I wish I had documented all the dates in the past, and I will in the future, as Googleguy may be interested in this. I am now going to keep careful track of my free google listing and my adwords listing.
Ever wonder if rankings were tied to, say, click rate via the toolbar? Ever consider that when you buy adwords, your click rate is naturally going to go up?
I was not aware that take-up of the toolbar among Joe Public was that great yet.
Also, click-through rate is not a measure of user satisfaction. Whilst I don't doubt Google have considered this technology, I do doubt that it would improve already good results. Of course, with current results .....
Also, using click-through rates to modify SERPs is rather like positive feedback. Normally this is to be avoided since it reduces stability; with search engines, ironically, it would cause greater stability, since the results that already rank well get clicked most and so keep ranking well. Is that a good thing?
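Purely to illustrate the feedback point, here is a toy simulation of my own (nothing to do with how Google actually works; all numbers and the click model are made up): clicks concentrate on the top positions, each click feeds back into the score, and whichever site gets ahead early tends to stay ahead.

import random

def simulate(n_sites=5, n_searches=10000):
    scores = [1.0] * n_sites  # every site starts equally relevant
    for _ in range(n_searches):
        ranking = sorted(range(n_sites), key=lambda s: -scores[s])
        # crude position bias: chance of a click decays with rank
        for pos, site in enumerate(ranking):
            if random.random() < 0.5 / (pos + 1):
                scores[site] += 1.0  # the click feeds back into the score
                break
    return sorted(scores, reverse=True)

print(simulate())  # one site typically ends up with most of the score

Run it a few times: the ordering barely changes once a leader emerges, which is exactly the "stability" in question.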
Pollsters can get a pretty good idea of what the population of the US as a whole thinks by interviewing a sample of 2,000 people or so. The main concern for Google would be whether those with the toolbar are a representative sample of all Google users.
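As a sanity check on that 2,000 figure: for a genuinely random sample, the worst-case margin of error at 95% confidence works out to roughly 2%. The catch, as the post says, is that toolbar users are anything but a random sample.

import math

n = 2000
# worst-case (p = 0.5) margin of error at 95% confidence
moe = 1.96 * math.sqrt(0.25 / n)
print(f"{moe:.1%}")  # about 2.2%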
Number of visits wouldn't mean much if you 'traded' clicks, had lots of advertising $, etc., but the time spent on a site relative to the amount of traffic it receives... now that might work, no? (Rough sketch below.)
just my 2cents...
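To make that idea concrete, here is a rough sketch (entirely hypothetical data and site names) of dwell time per visit, which raw traffic volume alone can't inflate:

from typing import Dict, List

def avg_dwell_seconds(visits: Dict[str, List[float]]) -> Dict[str, float]:
    """visits maps a site to the dwell time (seconds) of each visit."""
    return {site: sum(times) / len(times) for site, times in visits.items()}

visits = {
    "widgets-a.example": [300, 240, 280],  # few visits, long dwell
    "widgets-b.example": [5, 8, 4, 6, 7],  # many visits, short dwell
}
print(avg_dwell_seconds(visits))  # widgets-a.example comes out far ahead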
As I've said before, analysing user behaviour to measure site relevancy is close to impossible, certainly with current technology. However, I bet there are loads of people out there who reckon they can write software to do it, but they'll want big bucks.
Twenty years ago, reading handwriting by computer was an impossible dream; now such technology is well advanced (though far from perfect). Analysing user behaviour is just about possible, but do we really want that kind of software spying on us? I don't think so.
If a user clicks through the first few results, spending little time on each, and then on the 5th site views 20-40 pages, I think it is safe to say the user may have found the 5th website most useful. Again, if this is true, it comes down to content and ease of navigation.
Or it may mean they're poking all around the site trying to figure out why their keywords aren't there and why the site appeared on their search when they can't find anything relevant. I have frequently spent more time on unrelated sites for this reason, and I doubt I'm alone.
On the other hand, on a well-designed site I should be able to find the piece of information I seek almost instantaneously. Then I can move on.
A more useful metric would be to track which site is the last one they visit for a given search, without coming back to the search page. That should mean they found what they were seeking.
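As a sketch of how that "last site, no return" rule could be read out of a session log (the log format here is my own invention, not anything Google exposes):

from typing import List, Optional

def satisfied_result(events: List[str]) -> Optional[str]:
    """events is an ordered session log; 'SERP' marks a return to the
    results page, anything else is a clicked site."""
    last = None
    for event in events:
        last = None if event == "SERP" else event
    return last  # the last site visited with no return to the SERP after

session = ["SERP", "site-a.example", "SERP", "site-b.example",
           "SERP", "site-c.example"]
print(satisfied_result(session))  # site-c.example: the user stopped there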
Yes, I agree that a badly structured site could make user visits longer, but wouldn't you rather visit a competing site that had a better structure? Hence most users searching through a number of sites would favour one over the other.
Or how about repeat visits to a site? If user1 found a site via google serps and then visited the same site over and over again, would that not mean the site was at least popular with him/her? Now if 10,000 visitors all visit the same site over and over, spending 5 mins a day and not visiting any other similar site for longer than 30 seconds, can we not assume to a high degree that this site is more popular than its competitors?
Just speculation, I know... but I'm not trying to convince; my intention is to speculate...
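For the sake of the speculation, counting repeat visitors might look something like this (hypothetical toolbar-style log, made-up users and sites):

from collections import defaultdict
from typing import Dict, List, Tuple

def loyal_visitors(log: List[Tuple[str, str, int]]) -> Dict[str, int]:
    """log holds (user, site, day) tuples; a 'loyal' visitor is one who
    comes back to the same site on more than one day."""
    days_seen = defaultdict(set)  # (user, site) -> set of days visited
    for user, site, day in log:
        days_seen[(user, site)].add(day)
    counts = defaultdict(int)
    for (user, site), days in days_seen.items():
        if len(days) > 1:
            counts[site] += 1
    return dict(counts)

log = [("u1", "a.example", 1), ("u1", "a.example", 2),
       ("u2", "a.example", 1), ("u2", "b.example", 1)]
print(loyal_visitors(log))  # {'a.example': 1}: only u1 kept coming back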
Or how about repeat visits to a site?
This may be something they are tracking, but it is a dangerous metric. It would kill the focused niche sites that usually offer the best information and benefit the bloated Silicon Valley behemoths.
If that happened, the serps would look like . . . hmmm, about what they look like now?