

Can creating white-hat ranking software be <> total self annihilation?




12:01 am on Feb 8, 2006 (gmt 0)

5+ Year Member

Hi... First post here...

I am creating a shareware (and therefore "commercial") program that can, in general, analyze a website. So far, so good; none of it involves Google.

However... To be "overall" competitive I need at least some sort of position checking. Again, my program may (or may not) outshine all competitors out there with other features... But if I have zero position checking... Well, you get the idea.

I would be more than happy with e.g. either just parsing the first page of Google results (e.g. a page with 50 results) or using the WebAPI.
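For the "just parse the first results page" route, the core job is finding where a given domain first appears among the result links. A minimal sketch, assuming a hypothetical results-page markup where each organic result is a plain `<a href="...">` link (real engines' markup differs and changes over time, so the pattern here is only illustrative):

```python
import re
from urllib.parse import urlparse

def find_position(html: str, domain: str) -> int:
    """Return the 1-based rank of `domain` among result links, or 0 if absent."""
    # Collect absolute http(s) links in document order; a real parser would
    # also need to skip ads, navigation, and engine-internal links.
    links = re.findall(r'<a\s+[^>]*href="(https?://[^"]+)"', html)
    for rank, url in enumerate(links, start=1):
        if urlparse(url).netloc.endswith(domain):
            return rank
    return 0

# Hypothetical snippet standing in for a fetched results page:
sample = (
    '<a href="http://www.example.org/">Example</a>'
    '<a href="http://www.widgets.com/page">Widgets</a>'
)
print(find_position(sample, "widgets.com"))  # → 2
```

The `endswith` check lets `www.widgets.com` match a query for `widgets.com`; a stricter tool might compare registrable domains instead.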

I would also be happy to include other sorts of limits (e.g. no concurrent requests, enforced small "sleep/idle" periods, etc.).

I am trying to create an all-around program for those small IT-professional shops that do all sorts of different work for their (local) clients. I am not out to "deep-dig" Google. I am also not in any way helping/promoting black-hat stuff in the software (or anywhere else).

All I would like to do (without getting de-indexed, sued, or annihilated from the face of the earth *G*) is some simple position checking within the first results page.

Also, how are "automated queries" defined? Would it be OK to have the user enter/select a few phrases, retrieve Google's output (first results page for each phrase), and analyze the HTML for website positions?

Does anyone know / have experience with whether Google accepts non-abusive usage? (As I would prefer not to have my entire website de-indexed, hehe.)

I have no idea if I am being an idiot asking this... As I have no search engine contacts that can tell me :-)

[edited by: engine at 3:00 pm (utc) on Feb. 8, 2006]
[edit reason] formatting [/edit]


5:48 am on Feb 9, 2006 (gmt 0)

10+ Year Member

Unless I'm misunderstanding what you're saying, I thought there were a number of products that check Google rankings. I have used several.

Are you talking about a web-based application? Is that your concern? The ones I am thinking of are PC-based.


6:27 am on Feb 9, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I think that you may need to contact Google about using the Google APIs, which allow for 1,000 queries per day from any one IP.


7:29 am on Feb 9, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Remember WebPosition Gold? Sites using that stood a chance of getting nuked.


7:43 am on Feb 9, 2006 (gmt 0)

In my experience, guesswork gets you further than using web positioning software.


8:00 am on Feb 9, 2006 (gmt 0)

10+ Year Member

Use the Google API in your software, then have each user enter his/her own personal API key. That way every user will be able to run 1000 queries/day (and *you* would not be at risk).
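The per-user-key idea boils down to the program building each request around a key the user supplies, rather than a key shipped with the software. A sketch of what one doGoogleSearch request to the SOAP-based Google Web APIs service might look like; the endpoint, method, and parameter names follow the WSDL of the time as best I recall, so treat them as assumptions, not an authoritative reference:

```python
from xml.sax.saxutils import escape

# Historical SOAP endpoint for the Google Web APIs service (illustrative):
ENDPOINT = "http://api.google.com/search/beta2"

def build_search_envelope(api_key: str, query: str, start: int = 0,
                          max_results: int = 10) -> str:
    """Build the SOAP request body for one doGoogleSearch call.

    `api_key` is the *user's own* key, read from their settings, so every
    installation draws on its own 1,000-query/day quota.
    """
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope
    xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <SOAP-ENV:Body>
    <ns1:doGoogleSearch xmlns:ns1="urn:GoogleSearch">
      <key xsi:type="xsd:string">{escape(api_key)}</key>
      <q xsi:type="xsd:string">{escape(query)}</q>
      <start xsi:type="xsd:int">{start}</start>
      <maxResults xsi:type="xsd:int">{max_results}</maxResults>
    </ns1:doGoogleSearch>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""

# The key never ships with the program; it comes from the user's own config:
envelope = build_search_envelope("USERS-OWN-KEY", "blue widgets")
print("USERS-OWN-KEY" in envelope)  # → True
```

Since the key is interpolated at request time, the vendor never touches Google with their own credentials, which is exactly the risk-shifting the post describes.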


8:02 am on Feb 9, 2006 (gmt 0)

10+ Year Member

Like Eltiti said, incorporate the Google API in your application, and when you're done engineering your new application, give us webmasters here a freebie!


12:49 pm on Feb 9, 2006 (gmt 0)

5+ Year Member

If you read the license, Google forces you to ask for permission to make "commercial" software that INTERACTS with the Google API... In fact, as I read it, everyone who does any kind of website/SEO work professionally is also breaching the license. Quote:

Personal and legitimate uses only
The Google Web APIs service is made available to you for your personal, non-commercial use only (at home or
If you are interested in doing anything different than the foregoing, you must first obtain Google's written consent. If you fail to do so, Google reserves the right to take legal action.

Oh... And yes, I HAVE tried asking. However, they just completely ignored me... (The autoresponse email says they have closed for all new commercial applications.)

Anyways, I have researched the matter... I believe it is possible to make a /generic/ scraper which users themselves can configure.

I believe I can enforce "fair use" by:
* Enforcing minimum "idle pauses" between queries and page requests.
* Enforcing maximum page/results limit. For normal position checking, top 100 or 200 should be enough. (1-2 pages * 100 items.)
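Those two "fair use" rules could be enforced in a small throttling helper. A minimal sketch: one request at a time, a mandatory idle pause between queries, and a hard cap on how many results pages may be requested; the specific numbers are illustrative assumptions, not limits sanctioned by any engine:

```python
import time

class PoliteFetcher:
    """Serialize requests, enforce an idle pause, and cap total pages."""

    def __init__(self, min_pause: float = 10.0, max_results: int = 200,
                 page_size: int = 100):
        self.min_pause = min_pause                  # seconds idle between requests
        self.max_pages = max_results // page_size   # e.g. 200 / 100 = 2 pages
        self._last_request = 0.0

    def allowed_pages(self) -> int:
        """How many results pages the configuration permits per query."""
        return self.max_pages

    def wait_turn(self) -> None:
        """Block until min_pause seconds have passed since the last request."""
        elapsed = time.monotonic() - self._last_request
        if elapsed < self.min_pause:
            time.sleep(self.min_pause - elapsed)
        self._last_request = time.monotonic()

fetcher = PoliteFetcher(min_pause=0.2, max_results=200)
fetcher.wait_turn()   # first call returns immediately
fetcher.wait_turn()   # second call idles for ~0.2 s first
print(fetcher.allowed_pages())  # → 2
```

Because every page request must pass through `wait_turn()` on a single shared instance, concurrent hammering is ruled out by construction rather than by trusting the user.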

I can also (in my generic configuration files) allow for more restrictions.

In that way I will "side-step" most troubles. But I hate the situation. Dozens upon dozens of "high-profile," well-known commercial web and software tools are allowed to use the WebAPI / query Google in other (old) ways. It just totally annoys me. I can see myself making great tools (heh), but I seem to be tied down in ways others are not.

So my conclusion is one of two things. Either Google is supporting some "elite" hardcore SEO companies (thereby skewing the market and making it impossible for the small fish to enter)...

Or they are accepting "fair use" (it is the webmasters' websites that provide the content in the first place - only fair if they are allowed to interact in some way).

As Google likes its "Web 2.0" and "don't be evil" slogans, it only seems natural to believe it is the latter option.

*yawn* I have spent much time thinking and working myself around this. *sob*


12:43 pm on Feb 10, 2006 (gmt 0)

5+ Year Member

Just to be sure...

Is it your impression that the wording
"The Google Web APIs service is" ...

is only meant for those getting a Google API key, and not for software interacting with the Google Web API service (using a user-supplied API key)?

In that case I've read it wrong all along.

Well, I suppose I will offer both technologies then... My generic old-school position checker (where I limit e.g. query parameters to return at most e.g. the top 200 results when reading the configuration files) and the WebAPI for Google.

The generic old-school position checker will still be useful for other/local engines and whatever.

