Forum Moderators: coopster
I'm creating a web app, and hope to automatically extract information from Google's search results. For each search on my site, anywhere from 10 to 100 pages of Google results may be fetched. I would obviously credit Google.
Last time I heard, automated Google searching was frowned upon, but it was still possible as long as you didn't perform too many automated searches.
What I want to know is:
a: Is this still possible, and if so where is the link? It seems one can use AJAX with this page [code.google.com], but I just want to use PHP to gather the info, not AJAX.
b: What's the maximum number of queries (or megabytes of results) I can automatically extract per day without annoying Google? Last I heard it was 1,000 queries a day.
c: Can you pay for further automated querying?
d: For speed's sake, how many connections can I open at once to the Google servers? Obviously I'd love to open say 10 connections at once as this would speed up the search page downloads.
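For reference, here's roughly what I'd like to do server-side. This is just a sketch: the endpoint and the v=1.0 parameter come from the AJAX Search API page, but the query string and the referer value are placeholders for my own site's details:

```php
<?php
// Sketch: query the Google AJAX Search API from PHP via cURL instead of
// client-side AJAX. Endpoint and "v=1.0" are from the API docs; the
// query and referer below are placeholders.
$query = urlencode('example search terms'); // placeholder query
$url   = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&q=' . $query;

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);          // return body as string
curl_setopt($ch, CURLOPT_REFERER, 'http://www.example.com/'); // docs ask for a valid referer
$body = curl_exec($ch);
curl_close($ch);

// The API returns JSON; responseStatus 200 means success.
$json = json_decode($body);
if ($json && $json->responseStatus == 200) {
    foreach ($json->responseData->results as $result) {
        echo $result->url, "\n";
    }
}
?>
```

Whether doing that at volume is permitted is exactly what I'm trying to find out.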
For your other questions, you are probably going to have to refer to the Terms of Use and the FAQ [code.google.com] for additional information.
Okay never mind the language for now - after studying the FAQ [code.google.com], I couldn't even find out if it was possible to scrape results, let alone the limit on queries per day.
However, after studying the group for this API, it turns out that it's not possible [groups.google.com]. That is, automated querying to scrape search results is not allowed. This is unlike their old deprecated SOAP API, which allowed up to 1,000 queries per day.
I am frustrated with the state of affairs because my ambitious project (which does not rival Google in any way) has ground to a halt before it even started. :(
I might make a small fuss on their forum, but I doubt I'll have any hope of changing things.