Matt goes on to mention factors like "how long do people stay"
What about sites that DON'T have GA installed?
And couldn't webmasters hire a couple of kids or bots to browse their sites for long stretches, thereby increasing time on page and pages per visit and reducing the bounce rate?
GA is your site's worst enemy. Bounce rate can get rankings dropped for some pages. I've proven this myself.
Meaning broken links, invalid HTML/XML, etc.? Or do you mean something else by "technical missteps"?
>>> If you'd like to explain your theories about how my technique
>>> interacts with the Google Toolbar, I'd be happy to criticize them.
Alright. The toolbar can monitor user behaviour. It will detect that your visitors sometimes leave your site for another one. How, when the bot didn't see any links there? Quite a contradiction, isn't it? Something must be wrong with your site, so let's send in the next available quality rater. S/he arrives with the source code the bot has crawled in hand.
I didn't realize Google was the Internet Police, but it's obvious they are!
They are not looking to whittle away hours on my site. They get what they need and bail. Does this mean we should rank lower? We are the most up-to-date site out there for this stuff, and we keep ads to one per page, which techs love. Bottom line: we make the user happy.
Is this why we are seeing spammy, almost blatantly made-for-ads sites with little history taking over our positions? Because they make the user hop through 10 pages to get the script they are looking for? And let's not forget that the user must look at page after page filled to the brim with ads and internally scraped garbage.
I see. I have a client whose highest search traffic page has an 85% bounce rate, but they still rank #1.
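For anyone unsure how a figure like that 85% is arrived at: bounce rate is conventionally the share of sessions that viewed only a single page, divided by total sessions. A minimal sketch in Python (the visit data below is made up purely for illustration):

```python
def bounce_rate(sessions):
    """Fraction of single-page sessions.

    sessions: list of page-view counts, one entry per visit.
    """
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if pages == 1)
    return bounces / len(sessions)

# Hypothetical data: 17 of 20 visits viewed only one page.
visits = [1] * 17 + [3, 2, 5]
print(f"{bounce_rate(visits):.0%}")  # 85%
```

So a page can have a high bounce rate simply because it answers the query on the first page view, which is exactly the scenario described above.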