Forum Moderators: open
When I see a downturn or upturn I go looking for the reason why, hand checking on the relevant engine.
I think the risks of automatic position checking are somewhat overstated. However, it is against Google's TOS, and that alone is sufficient reason to avoid or minimise auto-checking.
The site you are checking will only suffer if you do massive checking with enough different keywords to identify the site. Since most searches return thousands of results, identifying which site is being checked is very difficult.
Older versions of WPG could check multiple sites for multiple keywords, and newer versions probably still can. This generates many repeated requests in a very short time and is very easy to identify as a "non-human" visit. A check on one domain or keyword per session is much more discreet.
As with most things, discretion and moderation gets it done!
The last thing we want to do is violate our good friend Google's TOS. We will continue to check by hand. Using the stat logs for referrals and keywords is a great idea that we have been using as well. Some clients just want to know their position for certain keywords. Time to up the price for them and do it by hand... :)
Happy Thanksgiving All!
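The stat-log approach mentioned above can be sketched in a few lines. This is a minimal Python example, assuming Apache combined-format logs and the classic google.com/search?q= referrer URLs; the sample log lines are made up for illustration:

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical sample lines in Apache "combined" log format; the referrer
# is the second-to-last quoted field on each line.
SAMPLE_LOG = [
    '1.2.3.4 - - [27/Nov/2003:10:00:00 +0000] "GET /widgets.html HTTP/1.1" 200 512 '
    '"http://www.google.com/search?q=blue+widgets&start=10" "Mozilla/4.0"',
    '5.6.7.8 - - [27/Nov/2003:10:01:00 +0000] "GET /widgets.html HTTP/1.1" 200 512 '
    '"http://www.google.com/search?q=widgets" "Mozilla/4.0"',
]

# Referrer is the second-to-last quoted field in combined log format.
REFERRER_RE = re.compile(r'"([^"]*)" "[^"]*"$')

def referral_keywords(lines):
    """Count the search phrases found in Google referrer URLs."""
    counts = Counter()
    for line in lines:
        m = REFERRER_RE.search(line)
        if not m:
            continue
        url = urlparse(m.group(1))
        if "google." not in url.netloc:
            continue
        query = parse_qs(url.query).get("q")
        if query:
            counts[query[0]] += 1
    return counts

print(referral_keywords(SAMPLE_LOG))
```

This tells you which phrases are actually sending visitors, without making a single automated query against Google; a page that gets referrals for a phrase is, by definition, ranking well enough for that phrase to be found.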
1. It distorts the counts of actual searches performed by users versus those performed by automated position reporting programs. If an SE is trying to determine how many people really searched for "widgets", these programs skew the results. You then end up with WordTracker and Overture saying many more people searched for "widgets" than actually did. That is bad for AdWords advertising, when Overture and WordTracker report 100 times as many searches for a term as the click-throughs AdWords actually delivers. Of course, checking positions by hand has a distorting effect also, but most people aren't going to check positions by hand in the same quantity as they would with an automated reporter.
2. Automated position reports burn the resources of Google, and Google doesn't see why it should be paying for that service!
3. We all like to know how many people visit our web sites. I suspect Google likes to know how many searches its users perform, too. These programs distort the "true" usage.
As Google has specifically requested that people not use automated programs to check positions, a good net citizen would probably comply and respect that request.
If Google sent its bot to scan my sites every 10 seconds I would get pretty upset at the excessive usage of my bandwidth;) They don't do that to me, so I don't do it to them.
However, has anyone actually had a web site penalized for using these automated programs? I personally don't see how anyone could. How does Google know which web site you are trying to check positions for? It could track your IP address, but how does it marry that to a web site? It could block an IP address from using Google, but I don't see how it could penalize a web site.
If you do a Google search for a string that is unique to these position reporting programs, you will actually find lots of companies publishing these automated reports on the web (don't ask me why they do that). But a little investigation will also reveal that the associated web sites are not being penalized for it.
Bottom line: don't do it, because Google asks you not to; but I don't think anything bad will happen to your site if you choose not to respect that request. You might need a new ISP if your address gets banned from accessing Google.com, of course ;)