So, in my example with the plumbers in Detroit: I want to get a list of the first 200 plumbers that show up in Google Maps for Detroit, with their names, addresses, and phone numbers.
If I have to, I can manually type them all in, but I'd prefer to automate it.
I suggest you talk to a lawyer.
I'm trying to pre-populate a directory of business services.
Is it really a violation?
But consider this scenario:
You finish your site. It took about 20 hours to copy down names and numbers, and another 60 hours to call each business, confirm its number, and find out its hours of operation (to add value to the site). Within minutes of posting the site, a robot scrapes it, and the next day you discover a brand-new local directory site online that contains the same data you posted.
How do we label the action of scraping your site and reposting the data?
An act of "saving time" or an act of theft?
I think that expression "imitation is the sincerest form of flattery" only goes so far -- you may disagree. ::shrugs:: Good luck with your project.
A business's name, address, and phone number is readily available from a number of sources (yellow pages, online, etc.). Do you think that Yellow Pages, or in this case Google Maps, owns that information? For example, if I didn't automate acquiring the information and instead wrote it all down by hand and manually entered it, would that be theft?
If I were starting an online competitor to YellowPages.com (which is not what I'm doing), how would you suggest that I go about entering businesses? It seems silly to think that, with all of the advances in technology, I would have to manually key in everything.
As far as the appropriateness/morality of it, that's something you must decide for yourself. The truth is that someone had to do the work of collecting and organizing those facts for the sources you've mentioned. They should get to decide how their work is used. If I want the same facts, then I can collect them the same way they had to collect them.
I've built a local directory site before and, although I have the technical skill to do so, I didn't scrape Google Maps, Citysearch, or any other site. It would have been easy but it wouldn't have been right.
I used a combination of public records (most businesses are licensed by the state and that data is freely available) and manually/physically checking (driving around with a laptop). Was it harder? Yes. In the end did I have better data than the people I could have scraped from? Yes.
use any robot, spider, site search/retrieval application, or other device to retrieve or index any portion of Google services or collect information about users for any unauthorized purpose;
BUT the Google Maps API offers exactly what you want. Although you cannot collect the data to populate your own database, you can display the current data straight from Google, which should serve your purpose.
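As a rough sketch of that display-only approach, here is what it might look like with the current Maps JavaScript API and its Places library (an assumption on my part; the thread's "Google Maps API" may refer to an older version). YOUR_API_KEY is a placeholder, and the Detroit coordinates are illustrative:

```html
<!-- Sketch: display plumbers near Detroit live from Google, without
     storing anything in your own database. YOUR_API_KEY must be
     replaced with your own key; usage is subject to Google's terms. -->
<!DOCTYPE html>
<html>
  <body>
    <div id="map" style="height: 400px"></div>
    <ul id="results"></ul>
    <script>
      function initMap() {
        const detroit = { lat: 42.3314, lng: -83.0458 };
        const map = new google.maps.Map(document.getElementById("map"), {
          center: detroit,
          zoom: 12,
        });
        // PlacesService queries Google's data and renders it on the fly.
        const service = new google.maps.places.PlacesService(map);
        service.nearbySearch(
          { location: detroit, radius: 10000, keyword: "plumber" },
          (results, status) => {
            if (status !== google.maps.places.PlacesServiceStatus.OK) return;
            for (const place of results) {
              // Drop a marker and list the name/address for each result.
              new google.maps.Marker({ map, position: place.geometry.location });
              const li = document.createElement("li");
              li.textContent = place.name + ", " + (place.vicinity || "");
              document.getElementById("results").appendChild(li);
            }
          }
        );
      }
    </script>
    <script async
      src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&libraries=places&callback=initMap">
    </script>
  </body>
</html>
```

Note that the Places library paginates results (roughly 20 per page), so you would not get all 200 listings in a single call; the point is that the data stays on Google's side and is only displayed.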