|Visiting own site for long periods with proxy to reduce bounce rate|
I'm doing an experiment: visiting my own site through proxies, so the IP changes each time, to reduce my bounce rate.
It seems to be improving traffic and rankings...
Is that possible or just my imagination?
How do you think Google knows how long you've spent on a site or page?
There has been quite a bit of speculation on this type of topic, but for the most part the speculated data sources for that information aren't scalable, so I highly doubt your visits are impacting anything.
The only somewhat-reliable way Google could get that data in a web-wide-applicable way is if your visits are made through Chrome. Other than *possibly* that [I don't think they use Chrome data to that extent personally, but they may], the algo doesn't have access to the info it would need to affect rankings, so my guess is it's a coincidence [or, in other words, correlation != causation].
My Google Analytics data is strongly influenced by my own visits, because I'm often working on my blog: installing/removing plugins, optimizing internal anchors and so on. I spend a lot of time on my own website, and I suppose a proxy could be useful for masking those visits, but I'm not sure it would be practical for my daily work.
In the end it depends on which proxy lists you use, I suppose.
Google's John Mueller has stated time and time again that Analytics data is not used in search. You're fretting over nothing and making your life unnecessarily complicated. If you saw a positive impact on rankings after this, I feel it was just a coincidence.
|Google's John Mueller has stated time and time again that Analytics data is not used in search. |
Google has shown many times that one hand does not know what the other is doing. If analytics data is not used in search, then how does (or will) Google use it? I doubt that Google gives this service away for free out of the kindness of their hearts. And given their recent skirmishes over privacy issues, I doubt Google will refrain from sharing this data internally for a variety of purposes, both now and into the future.
Google stipulates the following terms and conditions in regards to Analytics (http://www.google.com/analytics/terms/us.html):
|We use the information we collect from all of our services to provide, maintain, protect and improve them, to develop new ones, and to protect Google and our users. We also use this information to offer you tailored content – like giving you more relevant search results and ads. |
Google is well within their rights to use all of the data they collect, from our use of their services, in any way they see fit internally.
Oimachi2, can you isolate this as the only possible factor that could have affected your traffic and rankings? If you can, then correlation IS causation, but it's very difficult to conduct SEO experiments in a perfect vacuum, so it's almost impossible to prove it's not just coincidence.
Nobody knows what information Google has access to or uses in the algo, but it's reasonable to assume that if they were able to determine anything of value from user metrics, and had a reliable, representative data sample for it to be meaningful, then they would use it.
How quickly did you see improvements after trying this?
|Google has shown many times that one hand does not know what the other is doing. If analytics data is not used in search, then how does (or will) Google use it? |
If Google's left hand doesn't know what its right hand is doing, then why assume that the left hand is capable of monitoring the right hand's visits, time on site, etc.?
And if Google were that obsessive about tracking the site owner's browsing behavior, wouldn't it be capable of identifying and filtering out the site owner's visits to his or her own site when gathering user statistics for search purposes?
And let's not lose our sense of perspective: If a site has a decent amount of traffic, the owner's own visits are a drop in the statistical bucket.
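To put rough numbers on that "drop in the statistical bucket" point (the traffic figures below are invented purely for illustration, not taken from any real site):

```python
# Illustrative only: how much can a site owner's own visits move the
# site's measured bounce rate? All figures here are made up.

def bounce_rate(bounced_visits, total_visits):
    """Bounce rate as a fraction of total visits."""
    return bounced_visits / total_visits

daily_visits = 1000   # hypothetical organic visits per day
daily_bounces = 700   # 70% of them bounce
owner_visits = 10     # owner's non-bouncing proxy visits per day

before = bounce_rate(daily_bounces, daily_visits)
after = bounce_rate(daily_bounces, daily_visits + owner_visits)

print(f"before: {before:.1%}, after: {after:.1%}")
# before: 70.0%, after: 69.3%
```

On a site with any real traffic, the owner's visits shift the rate by well under one percentage point; only on a near-zero-traffic site could they "dramatically" change it.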
I'm with n00b1: "You're fretting over nothing and making your life unnecessarily complicated."
|If Google's left hand doesn't know what its right hand is doing, then why assume than the left hand is capable of monitoring the right hand's visits, time on site, etc.? |
If Google says that they use the information they collect from all of their services, then I'll take them at their word.
|If Google says that they use the information they collect from all of their services, then I'll take them at their word. |
What companies can do with data and actually do with data may be two different things, but let's assume (just for the sake of discussion) that Google is tracking every individual's activity compulsively. That brings us to what I said earlier:
|if Google were that obsessive about tracking the site owner's browsing behavior, wouldn't it be capable of identifying and filtering out the site owner's visits to his or her own site when gathering user statistics for search purposes? |
Also, "bounce rate" is a noisy signal. One could just as easily argue that, if Google were letting a site owner's activity influence search results so easily, a site owner could spam the search results merely by spending a lot of time (and looking at a lot of pages) on his own site.
There are ways to remove the noise from bounce rate.
If a site ranks for one term that generates 10 visits a day and you can generate artificial user activity that google cant distinguish from genuine, and your rankings improve afterwards.....well, it could hint that google is employing user metrics in the algo.
Bing have admitted to using user metrics in their algo, collected from Internet explorer so it doesn't seem a big stretch to imagine google might do the same.
|...collected from Internet explorer... |
That's a browser, not an analytics program -- a browser has a chance of gathering information from every site on the Internet, while an analytics program doesn't have a chance of being installed on every site. What would Google do for stats on sites like Apple, Amazon, etc. that don't use GA -- guess? GA use doesn't scale Internet-wide, and Google doesn't do things that don't scale.
Chrome data? Sure, to some extent, but not GA stats.
|Also, "bounce rate" is a noisy signal. One could just as easily argue that, if Google were letting a site owner's activity influence search results so easily, a site owner could spam the search results merely by spending a lot of time (and looking at a lot of pages) on his own site. |
Noisy or not, there are a number of black hat services designed exclusively for improving bounce rates from Google organic traffic. Although it may be easy for Google to filter an individual proxy, as in the OP's case, these black hat networks exist to manipulate a metric that Google does evaluate.
Regardless of what effect improving a bounce rate may or may not have on organic search positions, I stand by my previous statement that no company will invest millions of dollars in the development, maintenance, hardware and bandwidth to collect data that they do not use themselves. No business executive in their right mind would approve such a cost without some sort of return on the investment. Google's benefit, in this case, is having full access to traffic/visitor data from millions of websites. In fact, Google has even more detailed data than what they display to those who are using their analytics product (i.e. specific data regarding IP addresses).
Google has some of the most brilliant minds working for them. When their algorithm encounters sites that do not use analytics, it would be quite easy for them to program it to evaluate data from their front-end search product instead of detailed analytics data when the latter is not available. A simple switch could be coded: if GA is installed, use dataset 1; if not, dataset 2. With over 10 million sites using Google Analytics, and mainly big or competing brands that do not, I'd say that Google has a large enough sample pool of data to evaluate.
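The "dataset 1 / dataset 2" switch described above is pure speculation about Google's internals, but the idea itself is trivial to express. A minimal sketch -- every name here (`engagement_data`, `ga_metrics`, `search_metrics`) is hypothetical, invented for illustration, and corresponds to nothing Google has documented:

```python
# Hypothetical sketch of the "dataset 1 / dataset 2" switch described above.
# Nothing here reflects any documented Google system.

def engagement_data(site, ga_metrics, search_metrics):
    """Prefer detailed analytics data when a site runs GA;
    otherwise fall back to coarser front-end search metrics."""
    if site in ga_metrics:            # "dataset 1": Google Analytics
        return ga_metrics[site]
    return search_metrics.get(site)   # "dataset 2": front-end search data

ga = {"example.com": {"bounce_rate": 0.42, "source": "analytics"}}
serp = {"example.org": {"pogo_stick_rate": 0.30, "source": "search"}}

print(engagement_data("example.com", ga, serp)["source"])  # analytics
print(engagement_data("example.org", ga, serp)["source"])  # search
```

Whether anything like this exists inside Google is exactly what this thread is debating; the sketch only shows that the fallback itself would be trivial to build.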
|Google has some of the most brilliant minds working for them. |
In that case, it shouldn't be too hard for Google to filter a site owner's visits from any visitor information that it may be gathering if such visits are likely to influence Google Search rankings.
Now, if we were talking about Alexa...
|I'm doing an experiment by visiting my own site through proxies so the IP is to reduce bounce rate. |
Oh, come on. Unless you have the world's smallest site-- which I can assure you you don't-- or your "proxies" represent an army of carefully programmed robots, you can't possibly do enough single-handed to have any effect on a search engine's data.
|In that case, it shouldn't be too hard for Google to filter a site owner's visits from any visitor information that it may be gathering if such visits are likely to influence Google Search rankings. |
Brilliance is probably not required for this, but merely common sense. Many proxies reside on servers that are located in datacenters. Any visitor traffic originating from a datacenter would be suspect. For some clients I block visits from many host IP ranges because of scraping and spam. Google, with their great ability to data mine, probably has very detailed information regarding the past usage of specific IP addresses.
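Discounting visits from datacenter ranges is straightforward in principle. A minimal sketch using Python's `ipaddress` module -- the CIDR blocks here are reserved documentation ranges standing in for a real datacenter/hosting list, which you would source elsewhere:

```python
import ipaddress

# Placeholder ranges standing in for a real list of datacenter CIDRs.
# (These are IETF documentation networks, used here only as stand-ins.)
DATACENTER_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3
    ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2
]

def is_datacenter_ip(ip):
    """True if the visitor IP falls inside a known datacenter range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in DATACENTER_RANGES)

# Keep only visits that don't originate from a datacenter range.
visits = ["203.0.113.45", "192.0.2.10", "198.51.100.7"]
organic = [ip for ip in visits if not is_datacenter_ip(ip)]
print(organic)  # ['192.0.2.10']
```

The hard part isn't the check -- it's maintaining an accurate list of datacenter and proxy ranges, which is presumably where Google's data-mining advantage would come in.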
Because many proxies are heavily abused for spamming, it would be relatively easy for Google to map out the history of proxy IP addresses and toss their data value out if they were used for spamming. For example, Google owns reCaptcha which is a common safeguard on forums, blogs, social bookmark sites, etc. to prevent spam. A proxy used to spam these types of websites would reveal its IP address when sending spam through Google's reCaptcha service, be logged and discounted for any type of data used for positive rank analysis.
And for those who want to know, reCaptcha is another "free" service owned by Google. And like Google Analytics, I'm sure Google did not buy this company to give everyone a free spam prevention measure out of the kindness of their hearts. It's another point of contact where Google can obtain data.
Simply put... This is no long term strategy for success.
Well this is not about Google analytics...
The bounce rate IS reduced in StatCounter and Webmaster Tools, I jumped from #5 to #2 immediately, and some of my long-tail keywords are back.
Google DOES state that bounce rate is an algo parameter.
So I conclude that it does work, getting better rankings everyday actually;)
The proxy changes on every visit, including from the Tor browser, so Google has no clue I'm doing this.
While this might not be a full fix for every site, it seems to help a little; a daily visit of over 30 minutes does it. It changes the bounce rate dramatically.
And no, this is not a spam site, it's a moving company in Canada.
The only other parameter that decreases bounce rate is having visitors fill out the form to get an estimate.
This is not a website that can benefit from "social" signals etc...
Clients just want a rate and then they leave, period.
So very hard to make them "stay" on the site ;)
If they don't need to stay, why do you want to pretend they do? Even google must realize that some sites fill their visitors' needs on the first page. It doesn't automatically mean it's a bad site. It could mean it's a very good site.
|Google DOES state that bounce rate is an algo parameter. |
|...Google has no clue I'm doing this. |
That's absolutely true. Neither does the algo.
[They don't even care.]
|Even google must realize that some sites fill their visitors' needs on the first page. It doesn't automatically mean it's a bad site. It could mean it's a very good site. |
+1 great point! And, interestingly, as I've posted previously, I've had more than one page with a 90%+ bounce rate in the top 3 for multiple queries for years.
It's also easy to see they were/are obviously the pages the visitor was looking for, based on the queries, so the "manipulating bounce rate to rank higher" argument really doesn't hold much water with me -- fortunately for me, Google is smart enough to figure out that bounce rate can indicate a number of things and is unreliable as an indicator of quality or non-quality.
Most definitely yes.
|Is that possible or just my imagination? |
Of course it is...
I think you might be on to something!
[edited by: aakk9999 at 7:15 am (utc) on Sep 23, 2013]
Google collects the visitor data, crunches the numbers and displays the bounce rate to users for a reason. That reason is to allow webmasters to make improvements to their sites that improve visitor retention. While it can be an unreliable signal, a site's bounce rate may be used by Google in certain conditions where their algorithm deems it necessary. I believe this is how Google uses bounce rate data - selectively. Any public comment on bounce rate usage by Google employees, as it relates to search positions, is likely going to be hard to find if it is part of Google's "secret sauce," as Matt Cutts puts it.
As for the black hat services selling these types of bounce rate improvements, I've seen enough testimonials from their buyers to indicate they may impact search positions. But without knowing what else those buyers may be doing, and the types of sites they have, it is not conclusive evidence on which to form a concrete opinion. Regardless, these services are far more than one guy and a proxy. They are built on networks where each participant "helps" another. Since there are no links, cloaking or other flagrant violations of Google's guidelines, these participants feel safe in what they are doing and most report some level of "success."
If you drill down into your own analytics, you may find that some of the keywords you are ranking better for tend to have a good bounce rate. Whether this is just a coincidence or evidence of a link between search positions and bounce rates is in the eye of the beholder. When it comes to Google's secret sauce, I don't think they will talk much about it and would be compelled to offer the public either vague information or misinformation. I think Google learned their lesson from discussing links/anchor text too much publicly and do not want to repeat the same mistake.
|When it comes to Google's secret sauce, I don't think they will talk much about it and would be compelled to offer the public either vague information or misinformation. I think Google learned their lesson from discussing links/anchor text too much publicly and do not want to repeat the same mistake. |
So the following statement I quoted when asking for the source is false, correct?
|Google DOES state that bounce rate is an algo parameter. |