...Instead of doing the research all over the web, wouldn’t it be great to see all the information about one place in...one place?
Starting today, you can do that on Place Pages for Google Maps. A Place Page is a webpage for every place in the world, organizing all the relevant information about it. By every place, we really mean *every* place — there are Place Pages for businesses, points of interest, transit stations, neighborhoods, landmarks and cities all over the world.
The Place Pages are far more informative and attractive than the previous tabbed arrangement that Google Maps has been displaying when you clicked "more info".
A "webpage for every place in the world" is also an incredibly ambitious undertaking. Because Place Pages data is aggregated from elsewhere on the web -- algorithmically chosen images, articles, reviews, and other information organized into well-structured documents -- the Place Pages for more popular places may justifiably be seen as a threat by many types of sites.
they are saying nothing new. they are a bit like re-runs on the TV. they just take stuff that we've said already and re-use it. i don't know what's so innovative about that. what is innovative about these new place pages? it's just billions and billions of pages of rehash. the pages aren't even pretty. they are no different to all these directory sites that you have been talking about. the only difference is this: instead of the businesses signing up and entering their own details themselves, google just goes ahead and scrapes the lot from other sites to save time. then includes a load of reviews alongside from people who might be completely wrong. and that is supposed to be the definitive one-stop page for "everywhere in the world".
the further google goes into this 'providing content' business, the more people will realise where the content is actually coming from -- us. provided free of charge.
all it would take to bring the company to its knees would be for webmasters to go "on strike" for a month, and block their bots with robots.txt. with no data to collate, they would have no content to print.
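For anyone curious what that "strike" would actually involve, blocking Google's crawler takes only a couple of lines. A minimal sketch of a robots.txt file, served from the site root (the directives below target Google's main crawler specifically; compliant bots honor the longest-matching user-agent group):

```
# robots.txt at https://example.com/robots.txt
# Block Google's main web crawler from the entire site
User-agent: Googlebot
Disallow: /

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```

Note that robots.txt is advisory: well-behaved crawlers obey it, but it is not an access control mechanism, and already-indexed pages can linger in the index for some time after the block takes effect.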
your argument that we shouldn't "base our sites and income on automated techniques that the big guys can do better" is all well and good, but you are conveniently forgetting that in order to find our sites in the first place, they've got to go through google. so all google has to do between the search and the visit is stick a load of their own content in front of their faces. that is what this is all about -- hijacking the visit. people used to go to google, search and leave within ten seconds. but now google would like them all to come, search and stay. and the only way they can do that is to provide their own content. but they don't have their own content writers. so they nick it all from us free of charge.
this is like microsoft all over again. people needed programs on their PCs, and they needed an operating system to run them. if the people who made the operating system provided their own programs free of charge, then the program makers would never get the chance to sell their own stuff. so the courts stepped in over the bundling.
likewise... people need info from websites, and they need search engines to find it. if the people who make the search engines provide their own info free of charge, then the website owners will never get the chance to show their stuff. only this time it's worse, because at least microsoft wrote their OWN programs. google isn't even doing that -- they are nicking all our info, watering it down, mixing it up with other stuff and then decorating the page with their own ads.
eventually the courts will have to step in and make google split the search engine from the content, i reckon.
Really like the idea of a webmaster strike that you describe. Too bad it does not work, because the required unity will never be achieved within the webmaster community.
you are conveniently forgetting that in order to find our sites in the first place, they've got to go through google. so all google has to do between the search and the visit is stick a load of their own content in front of their faces. that is what this is all about -- hijacking the visit.
But they aren't "hijacking the visit," any more than displaying Google Maps, Google News, YouTube, or Image Search results on a SERP is "hijacking the visit." Users who don't want those results filter them out and move down the SERP to where the non-Google search results are.
Granted, if you've got a site that's a YouTube clone, you might be hurt by YouTube results on a Google SERP. If you've got an image-directory site built around image thumbnails, maybe Google Image Search is a threat. If you're a news aggregator, having Google News at the top of Google's SERPS isn't good news for you. But if you're providing the kind of content that doesn't lend itself to automated aggregation, scraping, or user-review gathering, a service or product like Google Places is likely to be what CommanderW described: "a gussied-up results page which actually provides a new and more targeted venue for my site to be indexed and listed in."
And now an aside: What's the point of complaining every time Google, Yahoo, or MSN introduces a new product or service? Voicing your resentment won't make it go away. It would be more productive to figure out whether the new product or service is likely to help or hurt you (if it affects you at all), and--if it does hurt you--how you should respond. I've offered my practical suggestion: Don't build a business around automated techniques that the Googles, Yahoos, MSNs, Facebooks, YouTubes, etc. of the world can do better (and scale better) than you can.
Users who don't want those results filter them out and move down the SERP to where the non-Google search results are
And 16x9 widescreen means there is even more G and even less serps above the fold ..
What's the point of complaining every time Google, Yahoo, or MSN introduces a new product or service? Voicing your resentment won't make it go away.
diversify from day one is obvious as regards depending on adsense etc ..
but G scraping content and putting it above the fold (and sometimes their stuff is all there is above the fold) is something else .. if they keep it up they'll take over all of page 1 and not ID the stuff as their own. how many average people actually know that adwords are ads? not many .. they think it's all the best, most accurate and most deserving, not paid for, and that all serps are organic .. how many even know what organic serps are? they think the web *is* goog ..

The day real serps begin on page 2 you will be as screwed as anyone else .. already they begin on page 1 at around the old number 4 or 5 slot, and sometimes 6 items on a page one of 10 are links to goog properties .. built on scraped content, or maps, or street view ..
I've offered my practical suggestion: Don't build a business around automated techniques that the Googles, Yahoos, MSNs, Facebooks, YouTubes, etc. of the world can do better (and scale better) than you can
alright... what if you've got a site that sells tickets to music gigs (for example). google can't automate that, because they don't sell tickets.
but what if the user types in "music gigs at blah blah stadium". google can still intercede with some automated content -- by listing all the gigs that are taking place at that venue. the user will naturally look at it, because it is placed right at the top of the serps, and if google sticks an adwords ad promoting tickets to that gig next to it, then you are screwed. they have steered the visitor along a path that has made them money.
it is not in google's interest to send a visitor through a site in the serps... they want to send them through an adwords ad. and by sticking a load of their own content at the top of the page (which they've scraped from us) to lure the user away from the serps, they can easily do that -- just place the ads on that second page.
if you think that is unlikely to happen, then try searching for cinema listings in a big city -- google is already doing something that is just a short step away from exactly that. they provide their own listings at the top of the page, which lures the user onto a second page without ever seeing the serps.
[edited by: londrum at 7:00 pm (utc) on Sep. 26, 2009]
What's the point of complaining every time Google, Yahoo, or MSN introduces a new product or service?
First, I'd bet that if Google introduced a content service with scraped information for trips to Europe or Elbonia, you'd suddenly start to complain as well.
Second, I heard quite a few complaints when Microsoft launched Bing. Recently, I see a lot of criticism of Google.
Third, Google is so powerful in financial terms that they can disrupt almost ANY business. They just throw resources at it and make the competition go away. The rise of YouTube killed dozens of competitors in the video space. But is it a successful business? No. It loses money. Google subsidizes YouTube, and this subsidy makes it impossible for most competitors to stay alive.
The day real serps begin on page 2 you will be as screwed as anyone else ...
I don't think the Google Search team are foolish enough to destroy their core product by replacing every listing on page 1 of Google Search with in-house Google results.
Google isn't going to abandon Place Pages, Maps, News, Image Search, or its other products just because a few people complain about them on Webmaster World. Hoping that Google will revert to unadorned 1998-style Web text search is neither a survival strategy nor constructive advice.
I don't think the Google Search team are foolish enough to
things I've read on this very forum in the last 6-7 years:
1. I don't think Google are foolish enough to start putting ads on the SERPS page
2. I don't think Google are foolish enough to start artificially tampering with the SERPS to get their own properties at the top (youtube...and wikipedia too, even though it's not theirs)
3. I don't think Google are foolish enough to accept scraper sites into AdSense (they did, then became one themselves)
never underestimate the power of money to create foolishness.
Ultimately, whether a site owner feels threatened by Google Place Pages is likely to depend on whether the site owner believes Google can do a better job than he or she can. Some site owners will (like CommanderW) regard Google Place Pages as a new source of referrals; others will regard Google Place Pages as a direct competitor.
This certainly has the potential to take away some traffic - and any conversion steps that the website might offer are not seen by the searcher. If the place's actual website also generates ad revenue, there's another potential loss. Then there's control of the information and its accuracy - that's kind of hijacked. For some "places" this may not be a concern, but for others it certainly might be.
At the same time, there may well be positive opportunities in Place Pages. For instance, there are the user-created maps, such as the California Dessert map that the blog announcement shows off.
I have no crystal ball that can judge the NET effect, but I do understand the concern. As a search user, I think I'll appreciate the new feature. But working with business websites for a local search presence? The jury is still out.
I must say 2 things.
First, most of the links on the example pages I checked do go to other websites in the same manner as they do from a normal search page.
The business profiled does have its own website link on the 'places pages', in at least 2 places.
The link to the profiled business website is right under the address and phone number at the top of the page, and also at the very bottom of the page.
On the example google maps places' page for the Tartine Bakery [maps.google.com], the "menu" link goes to a Zagat menu, and the photo thumbs go to the website where the original photo resides.
Second - drilling down and poking around, I have in fact found what I would consider 'scraperage'.
On the example, Tartine Bakery, the "more details" link does lead to some, I'm afraid. tartine more details page [maps.google.com]. Here we find descriptions of and lists of the seating arrangements, hours, directions and much else, all lifted right off other websites (though they are credited and linked to elsewhere).
the example link to Zurich Hauptbahnhof [maps.google.com] actually has some of the train schedules posted. That's far beyond search results, I guess.
Another example, the places page for san francisco [google.com] looks about as good as many city guides. all the top corny boring tourist attractions are listed. All w/ an image (I only clicked one. It did lead to a business website) and a snippet (all from wikipedia:).
Further beyond search results, there is a "write a review" link that leads to a login page for a google maps account. Exploring this, I find that google maps pages all have 'user content' links which lead to reviews written by people with google maps accounts, as well as invitations to write a review.
However, all of this is not in the regular serps, but on google maps, and is buried in 2, 3, and 4 level deep links to one cumbersome code-bloated page after another, that each takes a long time to load - I'm not worried yet:)
In summary - give the pages and links a closer look. Some posters seem overly alarmed and pessimistic. On the other hand, there is some of Google's own 'user generated content', and bona fide scraping going on deep in the 'places pages'. But many of the sites 'scraped' look like they've gotten their 'content' from somewhere else anyhow, and so far it is all coming from a handful of sites that show up over and over (though that's certainly not a good thing for the rest of us, either).
From my perspective, it can still be a plus. While it doesn't do much good to have my site listed in their 'sites' list of other places that have content about the destination the user has already found on google maps, I really, really would like to have a linked image from my site appear in their 'photos' listings. I guess I'm going to have to fill all those empty 'alt' and 'title' attributes after all:)
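For what it's worth, filling those attributes is a one-line change per image. A hedged sketch (the filename and wording here are made up for illustration; descriptive alt text is what image search generally keys on):

```html
<!-- Before: empty attributes give image search nothing to index -->
<img src="/photos/img_0412.jpg" alt="" title="">

<!-- After: descriptive alt text associates the photo with the place -->
<img src="/photos/img_0412.jpg"
     alt="Pastry counter at a San Francisco bakery"
     title="Pastry counter, San Francisco bakery">
```

The alt attribute doubles as the accessible description for screen readers, so writing it for humans rather than stuffing keywords serves both purposes.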