Catalyst - 5:45 pm on Apr 15, 2012 (gmt 0) [edited by: anallawalla at 7:52 pm (utc) on Apr 15, 2012]
Most of my pet peeves, as anallawalla mentioned, involve the data scraping mechanism.
DUPES and MERGES are two of the biggest problems, and both can be attributed to the scraping algo, which is sometimes wrong and sometimes overly aggressive. PLUS dupes and merges are inter-related: if G turns the dial one way you may get fewer dupes but more merges; turn it the other way and it clears up some merge problems but creates more dupes.
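Just to illustrate that dial - and to be clear, this is NOT Google's actual algo, just a toy sketch with a made-up similarity score, made-up weights and made-up thresholds - here's roughly why tightening one side loosens the other:

```python
# Toy sketch only - a hypothetical record matcher, NOT Google's actual algo.
# The "dial" is the similarity threshold: set it tight and true dupes survive
# as separate listings; set it loose and dupes get swept up, but distinct
# businesses that look similar start getting merged too.
from difflib import SequenceMatcher

def similarity(rec_a, rec_b):
    """Crude 0.0-1.0 score combining name and address similarity (weights are made up)."""
    name_sim = SequenceMatcher(None, rec_a["name"].lower(), rec_b["name"].lower()).ratio()
    addr_sim = SequenceMatcher(None, rec_a["address"].lower(), rec_b["address"].lower()).ratio()
    return 0.6 * name_sim + 0.4 * addr_sim

def should_merge(rec_a, rec_b, threshold):
    return similarity(rec_a, rec_b) >= threshold

# Two listings for the SAME business (a true dupe) and one for a different
# business just down the street (which should stay separate).
dupe_a   = {"name": "Joe's Pizza",    "address": "123 Main St"}
dupe_b   = {"name": "Joes Pizza",     "address": "123 Main Street"}
neighbor = {"name": "Joe's Pizzeria", "address": "125 Main St"}

for threshold in (0.95, 0.90, 0.85):
    print(threshold,
          "| true dupe merged:", should_merge(dupe_a, dupe_b, threshold),
          "| neighbor merged:", should_merge(dupe_a, neighbor, threshold))
```

Run it and you can see the squeeze: nudge the threshold down to catch more of the real dupes and you start pulling in distinct-but-similar listings; nudge it up to protect those and the real dupes survive.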
OVER-WRITING USER DATA - often done by the algo, and again often due to scraping and to putting MORE TRUST in 3rd-party data than in what the owner himself added to the dashboard.
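And here's the over-writing problem in a nutshell. Again, this is a hypothetical sketch, NOT Google's actual conflation pipeline, with made-up source names and trust weights: if the step that reconciles conflicting field values trusts a scraped feed more than the owner's dashboard entry, the owner loses every time the two disagree.

```python
# Hypothetical sketch of source conflation - not Google's real pipeline.
# Each source gets a trust weight; when sources disagree on a field, the
# highest-trust value wins. If scraped feeds outrank the owner dashboard,
# the owner's carefully entered data gets over-written.

# Made-up trust weights for illustration only.
TRUST = {
    "owner_dashboard": 0.5,
    "scraped_directory": 0.7,   # 3rd-party feed trusted MORE than the owner
    "user_edit": 0.4,
}

def conflate(field_values):
    """Pick the value for one field from {source_name: value} by trust weight."""
    best_source = max(field_values, key=lambda src: TRUST.get(src, 0.0))
    return field_values[best_source], best_source

phone_candidates = {
    "owner_dashboard": "(555) 010-1234",    # what the owner entered
    "scraped_directory": "(555) 010-9999",  # stale number from an old listing
}

value, source = conflate(phone_candidates)
print(f"phone shown on the Place page: {value} (from {source})")
# With these weights the stale scraped number wins, so the owner's
# dashboard entry is effectively over-written.
```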
I wish there were an option for proactively managed Place pages to lock down their data so the algo, map editors and users could not keep making changes and messing it up.
BUT in Google's defense, I also realize the other side. Here's a really good post that illustrates WHY the algo has to step in and try to fix stuff: most business owners don't get their data right, don't keep it up to date, or don't read the guidelines and break the rules. https://groups.google.com/a/googleproductforums.com/forum/#!msg/business/z__VFWjEp7k/6FnQIevvXeoJ (Note to mods, I think it's OK to link to the G forum; if not, sorry and please remove.)
I could go on and maybe will add more when I have time.
But for now let me just say: even though I see TONS of bugs and problems, the more I learn about how the back end really works and the more I work with upper management at Google Places on policy/procedure-related issues, the more I come to realize that trying to organize millions and millions of business records and billions of reviews, while thwarting TONS of spammers, scammers and hijackers, is just an incredibly tough job!
Lots of things are in the works right now to try to make it all work better, but there is still a long way to go.