Forum Moderators: open

Automated Rankings Checker Dangerous with Google?

Can a rank-checking program hurt my site if I telecommute?

sethwhite

4:38 am on May 25, 2004 (gmt 0)

10+ Year Member



Hi Smart Folks:

I have been running [snip] 24/7 to check my company's web pages' Google rankings for many thousands of keyword combinations. Only today did I find out that Google frowns on this practice.

I telecommute. I live in Wisconsin, and do web site promotion for a company whose servers are all in Indiana. We obviously use different ISPs, and my IP address is dynamically assigned to me.

Is there any way that I can get my company's web sites, or ISP, penalized by Google by abusing this automated rankings checker software from another state?

thanks in advance!

ps:
wonderful forum. I have been responsible for Web site promotion for 3 months, so I was very excited to stumble upon this resource.

[edited by: pageoneresults at 4:49 am (utc) on May 25, 2004]
[edit reason] Removed Specifics - Please Refer to TOS [/edit]

blaze

4:48 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Again, I do not think Google would ever pursue anyone unless they were doing a DoS attack against their system.

However, they will ban your IP address, if that is what you want.

Pimpernel

5:10 pm on May 27, 2004 (gmt 0)

10+ Year Member



That is absolutely fair enough and is the proper thing for them to do. If you overstay your welcome, you get chucked out; that seems perfectly reasonable.

BigDave

5:12 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Pimpernel,

I have heard a lot about how Google does not have a right to do what they do. But I have yet to see any case law about it.

If you all are so bothered by it, and you are convinced that Google would fail a challenge, I would recommend suing them.

As it is, I think you may be surprised at which areas Google could get support from the courts in. I'm not saying that they *would* find this way, but it is far from a sure thing that they would find in your favor. (I'm not even going to get into the copyright issues, where you might also be surprised at how much can fall in Google's favor.)

Courts are actually very big on supporting voluntary controls set by a community. In this case, robots.txt is a TOS for robots that any automated system can understand. You might have trouble convincing a judge that you are taking the high moral ground where there is an easy community standard to follow. Courts really hate it when you waste their time when there is already a perfectly viable solution available.

And I suspect that as soon as your copyright case made it to court, even before discovery, the legislative branches of a lot of countries would be jumping through hoops to give legal standing to robots.txt. Because if your country does not allow search engines to do their job, they will simply stop spidering IP addresses allotted to your country, and the businesses in your country will suffer.

But who knows, you may win. And in the process, you will become more vilified than SCO.

Pimpernel

5:20 pm on May 27, 2004 (gmt 0)

10+ Year Member



What on earth are you talking about? I am not suing Google! We are discussing ethics here, and I am saying that Google's activity in spidering, capturing, and selling pages that have copyright protection is unethical (and probably actionable). I am saying that this constantly repeated issue about the ethics of sending Google automated queries is absolute nonsense. We are doing what they are doing. I am saying that in any event Google does not care a whole lot about automated queries unless it goes truly over the top, in which case it is a simple fact of life that they will at the very least ban your IP address.

But it is not unethical!

BigDave

6:00 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Oh? Is it not? Google follows the community standard for robots. The automated ranking checkers do not.

From the OED definition of ethic:

c. The rules of conduct recognized in certain associations or departments of human life.

Google follows the rules of conduct for robots. The ranking checkers do not.

Therefore, with this definition at least, Google is indeed more ethical than the ranking checkers.

hutcheson

7:05 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>I am talking about what google does without permission.

No, you're not.

Permission is granted or withheld by means of the robots.txt file and META tags. It has already been pointed out that Google carefully abides by the permissions specified there.

And so, as has been pointed out, if you change your mind about what permissions to give Google, you change those, and Google starts abiding by your new restrictions.
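
For anyone unsure how that permission mechanism works in practice: a compliant crawler reads robots.txt and checks it before fetching anything. A minimal Python sketch, using the standard library's `urllib.robotparser`; the bot name and site are made up for illustration, and the `Disallow: /search` rule is the kind of rule a search engine uses to keep robots out of its own results pages:

```python
from urllib.robotparser import RobotFileParser

# An illustrative robots.txt: all robots are barred from /search,
# everything else is permitted.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A well-behaved bot asks before fetching each URL.
print(rp.can_fetch("MyRankChecker/1.0",
                   "http://www.example.com/search?q=widgets"))   # False
print(rp.can_fetch("MyRankChecker/1.0",
                   "http://www.example.com/about.html"))          # True
```

A rank checker that hammers the results pages is, by definition, fetching URLs this check would refuse; a spider like Googlebot runs exactly this kind of check and walks away.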

>We are doing what they are doing.

No, you're not.

Googlebot is abiding by the TOS of the website, as expressed in the standard generally used by the community. You are deliberately breaking the TOS of their website.

If you have difficulty grasping these concepts, you need to hire a geek.

By your definition, a bank robber and a customer both "do the same thing" -- withdraw money. Of course, ethics seems somewhat beyond your grasp also.

ogletree

7:13 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I don't think the original poster cares too much about the ethics of this. Their question was whether they could get their site penalized. The answer is no. Many people run those things every day, with hundreds or thousands of searches, without a problem. Google has no idea what site you're looking at.

sethwhite

7:39 pm on May 27, 2004 (gmt 0)

10+ Year Member



Actually, I stopped using the tool because it violates Google's TOS. I would not want people using my company's web survey service in violation of our TOS just because we could not catch them. Also, my employer insists that we adhere to the highest ethical standards. (He is a university professor before he is a capitalist.)

I found the raw data this tool produced extremely valuable for a novice. In the 3 months I have been doing this, I have put 60 pages online to attract searches. About 30 of them are in the top 10 for my targeted searches, many of which have over 4,000,000 results.

As I said previously, I used this tool to find 2-word nonsense phrases that my pages did well on. Then I used those pages as templates to create new pages, putting real keywords where the nonsense words were. After iterating the process several times, I produced 5 different templates that Google seems to like.

I also used this tool to readjust the text in the title tags when I found that a page was doing unexpectedly well for a keyword. When I added a new page, I could measure its effects on the other pages due to its keyword content and links.

I was a complete novice 3 months ago (master's degree in CS, but no web site experience). Today, I can show my boss that we are in the top 10 for searches that he asked me to promote (most of which have over 1,000,000 result pages). I credit the raw data from this tool with helping me figure out a successful strategy, and I am reluctant to stop using it, but the fact that it violates Google's TOS is sufficient.

It could also be argued that I could have made more progress creating new content and lurking in this forum than studying raw data for 4-8 hours a day.

billygg

8:14 pm on May 27, 2004 (gmt 0)

10+ Year Member



I cannot believe there is this much heat in here about using reporting tools. I can understand both points of view, but some companies rely on that information. You can use log files as well, with a tool such as WebTrends. The company I work for does reporting, but not to check exact positioning; we use the general information so that when clients call in, our sales department can give them an idea of what's going on. With so many web servers and such big files, it takes a friggin' two hours to pull a log file, as opposed to running a 30-second report. I guess I see it as: use them if you must, but there are ways around it.
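
For what it's worth, pulling search referrers out of a raw access log doesn't need a heavy reporting package: the referrer field already carries the visitor's query and which results page they clicked through from. A rough Python sketch; it assumes a common Apache combined-format log line, and the sample line, field layout, and function name are illustrative:

```python
import re
from urllib.parse import urlparse, parse_qs

# A sample combined-format log line (hypothetical, for illustration).
LOG_LINE = ('1.2.3.4 - - [27/May/2004:20:14:00 +0000] '
            '"GET /widgets.html HTTP/1.0" 200 5120 '
            '"http://www.google.com/search?q=blue+widgets&start=10" "Mozilla/4.0"')

# The last two quoted fields in combined format are referrer and user agent.
REFERRER_RE = re.compile(r'"([^"]*)" "[^"]*"$')

def google_query(line):
    """Return (query, results_page) if the referrer is a Google search, else None."""
    m = REFERRER_RE.search(line)
    if not m:
        return None
    ref = urlparse(m.group(1))
    if "google." not in ref.netloc:
        return None
    qs = parse_qs(ref.query)
    if "q" not in qs:
        return None
    start = int(qs.get("start", ["0"])[0])
    return qs["q"][0], start // 10 + 1   # 10 results per page at the time

print(google_query(LOG_LINE))   # ('blue widgets', 2)
```

The `start=10` parameter means the visitor came from the second results page, so you even get a rough position band without sending Google a single automated query.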

paybacksa

8:25 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Don't forget that the rest of the world doesn't know about, care about, or even consider important technological rules such as robots.txt.

It is often argued that a webmaster who put up content without a robots.txt should have known better, or should have excluded harvesting through use of a robots.txt, etc. And it is usually strongly argued back that it doesn't matter if they didn't know how to stop undesired access; the access was still wrong.

The old lock-and-key thing: just because I don't have a lock on my door doesn't mean you are free to trespass.

If anyone here wants to become the defining ruling on trespass to chattels with respect to web content, I look forward to reading all about it. Until then, I expect a well-formed and prominent TOS will hold weight in court. The most obvious challenge I see is the value of the informed consent and the technicalities of the language of the TOS -- not the validity or applicability of the law.

rfgdxm1

10:29 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>It is often argued that a webmaster who put up content without a robots.txt should have known better, or should have excluded harvesting through use of a robots.txt, etc. And it is usually strongly argued back that it doesn't matter if they didn't know how to stop undesired access; the access was still wrong.

>The old lock-and-key thing: just because I don't have a lock on my door doesn't mean you are free to trespass.

Bad analogy. A website is more like an open door, with a big sign over it saying "Come in, everyone welcome". Looks like you just don't grok this newfangled WWW thingy. The assumed default with a website is that it is accessible to all on the Internet. Now, if everyone isn't welcome, then it is up to *you* to put up a sign saying that. Robots.txt is the sign used on the Internet for that.

pageoneresults

10:40 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>Use the Google API, you have permission to do 1000 automated queries per day.

The Google API is for personal use, not for commercial use. Also, check the results returned from the API: they are different from those returned at the public level. The API queries a different database.
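
Whatever quota applies (the 1000/day figure quoted above, or a self-imposed limit), a script can enforce a daily budget on its own side before any query goes out. A sketch of one way to do it; the class name and numbers are illustrative, and this says nothing about what Google itself permits:

```python
import time
from collections import deque

class DailyBudget:
    """Client-side throttle: allow at most `limit` calls per rolling window."""

    def __init__(self, limit=1000, window=86400.0):
        self.limit = limit
        self.window = window      # window length in seconds (default: 24 h)
        self.calls = deque()      # timestamps of calls still inside the window

    def allow(self, now=None):
        """Record and permit a call if the budget allows it, else refuse."""
        now = time.time() if now is None else now
        # Drop timestamps that have aged out of the rolling window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.limit:
            self.calls.append(now)
            return True
        return False

# Tiny demo with a budget of 3 calls per day.
budget = DailyBudget(limit=3)
print([budget.allow(now=t) for t in (0, 1, 2, 3)])   # [True, True, True, False]
print(budget.allow(now=86401))                       # True (oldest call expired)
```

Wrap your query function so it simply skips (or sleeps) whenever `allow()` returns False, and the tool can never run away with itself the way a 24/7 checker does.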

BigDave

11:27 pm on May 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>A website is more like an open door, with a big sign over it saying "Come in, everyone welcome".

I would suggest that it is more like a store in a mall. It is still private property, and the proprietor can still set the rules. But it is public access, and the proprietor must make visitors aware of any rules that are other than the normal rules to be expected for a public access space. And there are legal limits on what sort of rules they can make and still have it be public access.

There are still all sorts of copyright arguments that can be made in relation to Google, but they are pretty well in the clear for accessing your content while respecting the community standard "keep out" sign. The only thing that seems to really expose them is their cached pages, and even there I would not want to bet on the outcome either way.

Pimpernel

9:03 am on May 28, 2004 (gmt 0)

10+ Year Member



I think we are all getting a bit confused here. Let's just stand back for a second.

Google's guidelines say: make web sites for visitors, not for search engines. Don't employ any methods to artificially boost your rankings.

In other words, don't optimise.

Therefore, by the arguments put forward here it is unethical to optimise your web site.

Therefore, what exactly are those posters who believe it is unethical doing on this forum? Finding out how to optimise, that is what.

That's the first point. The second point is Google's site is a publicly accessible site. I can go there and carry out as many searches as I want to. If I choose to automate that process of searching to save me time, what business is that of Google's? If instead of employing 100 people to carry out all those searches every day that I need carried out I instead automate it, that is a sensible thing to do. It is not unethical in any way whatsoever.

Now one very important point: the bar against automated queries, AFAIK, is not part of Google's TOS. In fact, I cannot see any TOS for Google at all. It is part of Google's quality guidelines, which IMO is a very different thing. As an optimiser I will take Google's quality guidelines with a very big pinch of salt, and I believe that anyone who considers sending automated queries to Google to be unethical is failing to understand the relationship between optimiser and Google.

victor

9:27 am on May 28, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google terms of service:
[google.com...]

"You may not send automated queries of any sort to Google's system without express permission in advance from Google."

Pimpernel

9:42 am on May 28, 2004 (gmt 0)

10+ Year Member



Well done - I just found it myself, with great difficulty I might add. There is no link to it on the home page.

curryrivel

1:09 pm on May 28, 2004 (gmt 0)

10+ Year Member



Surely, you cannot get your IP address (and so your-domain.com) banned by Google for using a rankings or other checking tool.

What if I continually hammered an auto-checker service, which is very easy to do, to analyse my-competitors-domain.com?

Google would be wide open for bad publicity if it banned innocent domains that were being undermined in this manner by unethical competitors.

sem4u

1:13 pm on May 28, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No, you can only get the IP address that you connect to the internet with banned. If your IP gets banned, you cannot connect to Google.