Forum Moderators: open
I have been running [snip] 24/7 to check my company's web pages' Google rankings for many thousands of keyword combinations. Only today did I find out that Google frowns on this practice.
I telecommute. I live in Wisconsin, and do web site promotion for a company whose servers are all in Indiana. We obviously use different ISPs, and my IP address is dynamically assigned to me.
Is there any way that I can get my company's web sites, or ISP, penalized by Google by abusing this automated rankings checker software from another state?
thanks in advance!
ps:
wonderful forum. I have been responsible for Web site promotion for 3 months, so I was very excited to stumble upon this resource.
[edited by: pageoneresults at 4:49 am (utc) on May 25, 2004]
[edit reason] Removed Specifics - Please Refer to TOS [/edit]
Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our terms of service. Google does not recommend the use of products such as blahblah blah that send automatic or programmatic queries to Google.
This is G's stance on this but the question is whether or not they can do anything about it. I suspect not, but these programs have only a very limited value and why would you want to check the results 24/7? Do you know a method that you can use to modify your sites and have them re-ranked 24/7?
For SEOs who manage more than a handful of websites, such tools are not only helpful but necessary.
Here's an example:
You monitor and manage the SEO for well over 1000 websites. A large set of web pages suddenly vanishes from the rankings in Yahoo. You revisit, tell the owners about the possible problems that got the site banned from Yahoo, clean it up for them, and submit to SiteMatch.
Back on, all within a much shorter cycle because you were monitoring on a proactive rather than reactive basis.
Is there anyone who feels that using these tools is significantly unethical, since it violates Google's TOS?
"The way you can tell that you have matured as an SEO engineer is that you no longer need this type of software. "
I agree, but I am a novice. I make reports that show every 2, 3 and 4 word combination that appear on each page, even if the search is gibberish. The tool lets me compare each day's report to previous reports.
In this way I can follow a page's rise and fall, and learn which techniques actually matter to Google. When I find a page does well for a nonsense search, I use that page as a template to make new pages with real keywords where the nonsense keywords were placed.
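As a rough illustration of that kind of report, here is a minimal Python sketch (the function names and the report shape are hypothetical, not from any particular tool): it extracts every 2-, 3- and 4-word combination from a page's text, and diffs one day's rank report against the previous day's.

```python
import re

def ngrams(text, sizes=(2, 3, 4)):
    """Return every 2-, 3- and 4-word combination (n-gram) in the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    found = []
    for n in sizes:
        for i in range(len(words) - n + 1):
            found.append(" ".join(words[i:i + n]))
    return found

def changed_rankings(today, yesterday):
    """Compare two {phrase: rank} reports; return phrases whose rank moved,
    mapped to an (old_rank, new_rank) pair."""
    return {p: (yesterday.get(p), r)
            for p, r in today.items()
            if yesterday.get(p) != r}
```

Tracking the output of `changed_rankings` over time is what lets you correlate a page edit with a rise or fall, gibberish phrases included.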
so far I have 'learned' 3 things:
1) PageRank does not directly affect a page's ranking.
Some of my pages got PageRank last Saturday, but this did not impact their rankings AT ALL. Also, the pages that got PageRank did not particularly rank higher than other pages; they just happened to be in the same spider path.
2) Google does not mind html errors like a missing head tag!
I experiment with pages that have similar, but not identical, content. Pages without a <head> tag do just as well as the properly formed pages.
3) My site resolves to both 'site.com' and 'www.site.com', and Google thought some pages were duplicate content on different servers. When I cleaned up the incoming links, this problem began to affect fewer files, and their rankings began to improve dramatically.
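That link cleanup can be sketched in a few lines of Python. This is only an illustration, assuming 'www.site.com' is the preferred host (the hostnames come from the post; the `canonicalize` helper is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.site.com"          # assumed preferred host
ALIASES = {"site.com", "www.site.com"}   # hosts that serve identical content

def canonicalize(url):
    """Rewrite an internal link so it always uses the canonical hostname,
    leaving links to other domains untouched."""
    parts = urlsplit(url)
    if parts.hostname in ALIASES:
        parts = parts._replace(netloc=CANONICAL_HOST)
    return urlunsplit(parts)
```

Running every incoming and internal link through something like this (or, better, a server-side redirect from one host to the other) removes the duplicate-content ambiguity described above.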
"It is an addictive drug that is hard to quit. When you do it is like having a large stone removed from around your neck."
Yes, you got that right. I spend 4-8 hours a day examining this raw data, and it is not billable labor! However, I am learning a LOT, and using this raw data to confirm or reject my crazy theories.
You should not be concerned with rankings. Rankings alone do not get traffic to your site. Have you checked your logs to see if the words you think are being used actually are being used?
Give me logs any day so I can show my client an actual increase in traffic and ROI, and they will be happy. You have to educate your clients, and yourself.
As far as whether it's ethical: no, it's not. It's simple. Google has said that they don't want you to use their servers in that manner. It's their servers, and their bandwidth. If they ask you not to, and you still do, it's unethical. That's not the only reason not to use the software, though. The biggest reason is, it does you no good at all.
1. billing you for the automated output of a program (not worth a large fee; a small fee perhaps, but not a large fee)
2. probably not paying much attention to your site unless there is already a problem
3. probably engaging in similar automated SEO techniques to get the ranking, which can be very risky for the domain name and not worth a high fee
4. probably finding those things that you rank highly for and taking credit for getting you that ranking, as opposed to gaining you top ranking for the keywords you need. Ever notice how they don't provide reports that say you rank 5000th for keyword X? They emphasize how you rank 1st for keyword X's misspeling [sic].
"I agree, but I am a novice. I make reports that show every 2, 3 and 4 word combination that appear on each page, even if the search is gibberish. The tool lets me compare each day's report to previous reports."
Gibberish is telling you nothing. When I started a couple of years ago I used to spend hours each day looking at the data from one of these packages. If you have four hours idle time each day you would be better spending it in this forum. Take the advice of someone who has been through this process.
Listen to Ogletree and use your loaf, whoops, I mean your logs!
Greetings,
Herenvardö
Probably not. But they might take a closer look at the site because of the annoyance. Is your site squeaky clean?
Is it unethical? Absolutely. You are violating their TOS, as well as violating their robots.txt. They are willing to honor your robots.txt, so why are you unwilling to honor theirs?
Personally, I think that when they receive queries from IPs that appear to be abusing their system, they should return bad results to make the software useless for SEO.
As the others said, those ranking numbers are pretty useless. It is your traffic and conversions that count, not where you rank on every possible 4 word term, or even what you consider your main keyword.
Just check your top 100 keyphrases once a month (the same for your PR) to see if any of them have changed, or changed positions. Other than that, just look at your traffic.
Ogletree, writing software to parse HTTP logs and determine whether or not you have lost ranking on search engines from over 1000 different websites is no mean feat.
However, if you can suggest a programmatically deterministic way to do that, I am certainly willing to learn.
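For what it's worth, a rough sketch of the log-based approach in Python. It assumes Apache "combined" format logs (the referrer is the second quoted field) and extracts the `q` parameter from Google referrer URLs; the function names and the 50% drop threshold are hypothetical:

```python
import re
from collections import Counter
from urllib.parse import urlsplit, parse_qs

# Matches: "REQUEST" status bytes "REFERRER" in a combined-format log line.
REFERRER_RE = re.compile(r'"[^"]*" \d+ \S+ "([^"]*)"')

def search_terms(log_lines):
    """Count the Google query terms found in referrer URLs."""
    terms = Counter()
    for line in log_lines:
        m = REFERRER_RE.search(line)
        if not m:
            continue
        ref = urlsplit(m.group(1))
        if "google." in ref.netloc:
            q = parse_qs(ref.query).get("q")
            if q:
                terms[q[0].lower()] += 1
    return terms

def dropped_terms(this_week, last_week, threshold=0.5):
    """Flag terms whose referral count fell by more than `threshold`."""
    return [t for t, old in last_week.items()
            if this_week.get(t, 0) < old * (1 - threshold)]
```

A sharp drop in referrals for a term is an indirect but TOS-friendly signal that the ranking behind it has slipped, which is the detection problem being discussed.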
Say you are getting zero traffic for the xyz keyword (your logs tell you this). What will you do?
Will you still optimize the pages and get more links? Oops, your page might be #1 but still not getting traffic for some other reason, maybe a bad title. So there is no way you can avoid checking positions.
This is like marketing research: you need SERP position as an attribute. If you get it automated, no harm, as it helps in deciding the next strategies.
Thanks
Aji
"This is like marketing research: you need SERP position as an attribute. If you get it automated, no harm, as it helps in deciding the next strategies."
"no harm" to who? It certainly harms Google, because they have to pay for the servers and bandwidth.
As long as you do it with something like the API, then it is fine. Otherwise, there *is* harm, as you are stealing bandwidth from the search engines.
If you really must check a couple hundred keyphrases, I would suggest that you hire a college intern and have it be part of their job to check the position of each keyphrase once a week. Pass the minimum wage bill off to your client.
Just make sure that you brand your title, set your preferences to 100 results, run your search on [google.com...] and do a ctrl-F search on your brand name.
It will take them only a few seconds a search, and you don't violate any TOS.
There's a site called Uptimebot which doesn't check keywords, but will give general stats about your site: how many backlinks, PageRank, indexed pages, and other stats for major search engines.
Does this site violate Google's TOS? I didn't see anything in there about overall stats. I don't know a lot about how it works, but we've started tracking our overall positions with their tools.
GG, any advice about that one? Sticky me if you need the URL.
-- G
Screen scrapers don't let Google know who you are. They just type in a term and read all pages. Google has no idea who is doing it.
This is correct, but I don't know if they could actually find out who was responsible for the offending searches.
I know that the most popular rank checking package does not pass the user's URL to Google during a KW check or when verifying URLs.
When doing a URL verification it queries for a specific KW phrase then scans the results for the user's domain name offline. This means that Google cannot currently determine where the inquiry came from.
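That offline scan could look something like this minimal Python sketch. The `find_rank` helper is hypothetical, and it assumes each organic result appears as an `<a href="...">` link, which is a simplification of real result-page markup:

```python
import re

def find_rank(serp_html, domain):
    """Scan saved result-page HTML for the first result linking to `domain`.
    Returns the 1-based position, or None if the domain is absent."""
    links = re.findall(r'<a\s+href="(https?://[^"]+)"', serp_html)
    for pos, url in enumerate(links, start=1):
        host = re.match(r'https?://([^/]+)', url).group(1)
        if host == domain or host.endswith("." + domain):
            return pos
    return None
```

The point being made above is that the domain matching happens entirely on the user's machine: Google only ever sees a generic keyword query, never the domain being checked.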
This does not make it right and ethically, if you use Google, you should not use these packages. If Google starts to behave unethically, well ...
The large majority of web sites on the Internet have TOS which state for private use only, not to be used for commercial reasons. They also have copyright notices everywhere.
Google spiders the site, ignores TOS and copyright issues, pulls all the content back to its servers and sells that content to third parties like Yahoo, AOL, Earthlink etc.
Probably actionable in a court of law, most certainly unethical.
So what is the difference between what Google is doing to all these sites and what automated rank checkers are doing to Google? The automated rank checkers are going in for a client and reporting that client's position in google back to them. IMHO, the automated rank checkers are more ethical than Google because they are just reporting back on the presence of sites that Google nicked in the first place!
So, stop bleating on about poor old Google and how we shouldn't breach its TOS. You fight fire with fire. Google is up to its neck in it and should just get on with its job of serving searchers in whatever format they come. If it gets hit with additional bandwidth requirements, well folks that is a small price for it to pay for nicking 4 billion pages of copyrighted content.
Also, if you don't want Google to visit, then set your robots.txt file
I am not talking about whether you want Google to visit or not; I am talking about what Google does without permission. We are talking ethics here, not some nebulous standard set by the search engines themselves.
But in any event, dealing with your point, say I launch a great web site with 100,000 pages. Google spiders every page and causes me to get a bill for excess download. Do I have cause for complaint? Yes. Is a defence for google that I should have set robots.txt? No. That is a standard among themselves. It is not a standard that I agreed to. In fact, and I'm being very difficult now, it is not a standard that I will ever agree to! No search engine can make me put up a robots.txt file. It is quite simple - I have TOS that say don't use for commercial reasons, so no one should use for commercial reasons.
Now relating this to the reverse situation, Google have TOS that say don't send automated queries. But automated rank checkers (just like spiders) don't read TOS and think they are perfectly entitled to query in any way they see fit that is legal.
So what is the difference between what Google is doing to all these sites and what automated rank checkers are doing to Google?
Greetings,
Herenvardö
Let me give you an example - there was a time way back when, when we only achieved top ranking in msn for our clients, so we reported on that search engine. Clients were quite happy and considered msn an important search engine. Then we cracked google and started reporting on it (and of course traffic increased considerably) and suddenly google was all that clients were interested in. That was our little contribution to google's brand, and add everyone else in the same boat and you have quite a significant impact.
If google tomorrow took effective action against rank tracking, then the outcome would be that SEO companies would attach less importance to google, and that would have a diluting effect on google's brand.
So I think you should differentiate between google's very broad TOS (which by the way also say don't optimise, which in turn means that we are all unethical) and the practicalities, which are that if you are servicing clients you have to report to them on the outcome, and that by necessity means rank tracking, whether manually or using a tool.
Once you reach that point, how can doing it in an automated way be unethical?