Forum Moderators: Robert Charlton & goodroi
When asked if an iteration of Panda was implemented this week, a Google spokesperson told us, “yes.” She also provided the following statement:
“We’re continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users. This most recent update is one of the roughly 500 changes we make to our ranking algorithms each year.”
If you’ve followed the Google Panda update saga throughout the year, you may recall Dani Horowitz’s story. She runs an IT discussion community called Daniweb, which was hit hard by the Panda update, but she made a lot of changes and gradually started to build back some Google cred.
Regarding bounce rate: mine is mostly 90–100% on many pages. That's because visitors see exactly what they want, look at it, or download it right away.
I wrote a while back in some other forum that the less I care about Google, the more Google rewards me with free traffic.
I started with nocache, then blocked images, and by now I've blocked about 50% of my site from Google. I treat Google more and more as if it doesn't exist. I take the visitors I get from Google and "googlize" them, meaning I try to make them stay and leave their email address or social footprint, so they can come back directly without Google.
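For what it's worth, blocking sections of a site from Google's crawler while leaving them open to visitors is usually done in robots.txt. A minimal sketch of what the poster describes (the paths here are hypothetical, not the poster's actual site; the "nocache" part would be a separate `noarchive` robots meta tag on the pages themselves):

```
# Keep Googlebot out of selected sections (hypothetical paths)
User-agent: Googlebot
Disallow: /images/
Disallow: /downloads/

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```

Note that robots.txt only affects compliant crawlers; human visitors arriving from search results or direct links are unaffected.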
While my traffic has grown steadily over the last 5 years, the share from Google has fallen from 80% down to 40%, so I am less dependent on G. Maybe that is one of the new factors from Google: it rewards sites that would work just fine without Google...?
which means at least two pageviews per visit
After all, such social networking sites generally are designed for people with short attention spans...
The healthy sign for entry pages coming from organic search results is a "long click".
Bounce rate on its own is not relevant. Time on page/ site on its own is not relevant. Both are only relevant in the context of whether people return to Google to "try again" by clicking another search result or perform a similar search.
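The distinction being drawn here can be sketched in code: an illustrative model, not anything Google has published, that labels a search click by whether the visitor quickly returns to the results page ("pogo-sticking") rather than by bounce rate alone. The 30-second threshold and the event format are my own assumptions.

```python
from dataclasses import dataclass
from typing import Optional

# Arbitrary assumed threshold; Google has never published such a number.
LONG_CLICK_SECONDS = 30

@dataclass
class SearchClick:
    query: str
    url: str
    clicked_at: float                    # epoch seconds when the result was clicked
    returned_at: Optional[float] = None  # when the user came back to the SERP, or None

def classify(click: SearchClick) -> str:
    """Label a click by dwell time before returning to the results page."""
    if click.returned_at is None:
        return "long click"  # never came back to "try again"
    dwell = click.returned_at - click.clicked_at
    return "long click" if dwell >= LONG_CLICK_SECONDS else "pogo-stick"

clicks = [
    SearchClick("buy widgets", "example.com/widgets", 0.0, 8.0),    # bounced back fast
    SearchClick("buy widgets", "example.org/widgets", 10.0, None),  # stayed, or left satisfied
]
for c in clicks:
    print(c.url, "->", classify(c))
```

The point of the sketch: a single-page visit with no return to the SERP still counts as a "long click" here, which matches the posters' observation that a high bounce rate on its own proves nothing.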
Update on 5/30/10: Matt Cutts from Google has posted a YouTube video about the change. In it, he says "it's an algorithmic change that changes how we assess which sites are the best match for long tail queries." He recommends that an impacted site owner evaluate the quality of the site; consider, if the site really is the most relevant match for the impacted queries, what "great content" could be added; determine whether the site is considered an "authority"; and ensure that the page does more than simply match the keywords in the query, and is relevant and useful for that query. [searchengineland.com...]
It's possible that bounce rate and time on site is not used but correlates with whatever Google uses.
Is this a fact or just a hypothesis expressed with great confidence?
Yes, I agree, but if I am shopping for widgets and click on a site only to find out the widget costs $$$, am I not going to shop around to find the cheapest? I am immediately going to check 2, 3, or perhaps 5 other sites to see which is cheapest, which will likely be seen as a high bounce rate.