|Google Makes "Smarter Best Guesses" On Image Search|
| 7:16 pm on Jul 2, 2012 (gmt 0)|
Google Makes "Smarter Best Guesses" On Image Search [insidesearch.blogspot.co.uk]
Smarter best guesses
When you search with an image, we use computer vision to try and figure out what the image represents, and then show you a “Best guess for this image.”
|Knowledge Graph results |
With the recent launch of the Knowledge Graph, Google is starting to understand the world the way people do. Instead of treating webpages as strings of letters like “dog” or “kitten,” we can understand the concepts behind these words. Search by Image now uses the Knowledge Graph: if you search with an image that we’re able to recognize, you may see an extra panel of information along with your normal search results so you can learn more.
|More comprehensive search results |
Finding more information about an image is the most common use of Search by Image. Very often this information is found on websites that contain either your image or images that look like it. We’ve made recent improvements to our freshness, so when photos of major news stories start appearing on the Internet, you can often find the news stories associated with those photos within minutes of the stories being posted. We’ve also expanded our index so you can find more sites that contain your image and information related to it.
| 7:45 pm on Jul 2, 2012 (gmt 0)|
Indeed, it works very well. If you do a search by image and upload a photo of a dog breed, Google understands that it is a dog, finds/guesses its breed, and shows photos of dogs of that breed.
| 8:19 pm on Jul 2, 2012 (gmt 0)|
This indicates that Google has accumulated and analyzed close to a critical mass of image data along with associated metadata, and that the combination of image and metadata is further refining Google's "Search by Image" capabilities.
Image "edge" point data and pattern matching are now organized and extensive enough that Google can take a good stab at the subject. And once subject/location/context is identified, image searching becomes significantly more efficient and nuanced. It becomes much more like a site: search (pun intended here) than like, say, a global search of organic results.
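To make the "edge point data" idea concrete, here's a toy sketch in the spirit of difference hashing: encode each image's horizontal brightness gradients as bits, then compare fingerprints by Hamming distance. This is purely illustrative — the tiny grayscale grids, the function names, and the hashing scheme are my own assumptions, not anything Google has disclosed; production systems use far richer feature descriptors on normalized, rescaled images.

```python
def dhash_bits(pixels):
    """Encode horizontal brightness gradients as a list of bits.

    Each bit records whether a pixel is brighter than its right-hand
    neighbour -- a crude stand-in for 'edge point data'.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits -- a small distance means similar edge structure."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical 3x3 grayscale "images": the original photo, a lightly
# altered copy (as if a watermark were painted out), and an unrelated shot.
photo     = [[10, 200, 30], [10, 200, 30], [10, 200, 30]]
altered   = [[10, 200, 30], [10, 200, 30], [10, 15, 30]]
unrelated = [[200, 10, 200], [5, 250, 5], [200, 10, 200]]

print(hamming(dhash_bits(photo), dhash_bits(altered)))    # small distance
print(hamming(dhash_bits(photo), dhash_bits(unrelated)))  # large distance
```

The point of a fingerprint like this is that it's cheap to compare: once the coarse bits put two images in the same neighborhood, the expensive fine-grained matching only has to run on that small candidate set.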
This is also clearly intended to lead into mobile search via phone cameras. On geo-relevant, real-time subjects, phones will send location, so the data will eventually all mesh and further enhance the system.
In the very early demos of the Knowledge Graph I saw a feature that's no longer appearing... several images of the Taj Mahal (the building) would knit together and let you move to, from, and around the subject. It was a very graphic illustration of how image data can be triangulated, and used to triangulate with other sets of data (as with the Knowledge Graph... or, by extension, even with user behavior). My guess is that Google is waiting for a more opportune moment to announce the feature, possibly with a larger data set.
With regard to images on web pages, btw, image scraping without detection is now likely to be much harder. It's more computationally efficient to make extremely fine distinctions (and matches) among a group of images when they're limited by subject. Note the "Pages that include matching images" results on the search-by-image SERPs. It's been a while since I last did a search by image, but, when I did, I didn't notice this kind of grouping. The matching images (in some cases with watermarks cropped off) appear to be precise matches.
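The efficiency argument above — fine matching is cheap once candidates are limited by subject — can be sketched as a two-stage lookup: bucket the index by a coarse subject label, then run the expensive comparison only inside the query's bucket. Everything here (the dict layout, the `fine_match` stand-in, the example URLs) is hypothetical, just to show the shape of the idea.

```python
from collections import defaultdict

def fine_match(a, b):
    """Stand-in for an expensive pixel-level comparison."""
    return a["pixels"] == b["pixels"]

def find_matches(query, index):
    """Compare the query only against images sharing its subject bucket."""
    buckets = defaultdict(list)
    for img in index:
        buckets[img["subject"]].append(img)
    # e.g. only the 'taj mahal' bucket is scanned, not the whole index
    candidates = buckets[query["subject"]]
    return [img["url"] for img in candidates if fine_match(query, img)]

index = [
    {"url": "a.example/1.jpg", "subject": "dog",       "pixels": "abc"},
    {"url": "b.example/2.jpg", "subject": "taj mahal", "pixels": "xyz"},
    {"url": "c.example/3.jpg", "subject": "taj mahal", "pixels": "qqq"},
]
query = {"subject": "taj mahal", "pixels": "xyz"}
print(find_matches(query, index))
```

With the subject identified first, the per-query cost scales with the bucket size rather than the full index — which is why subject recognition and precise duplicate detection reinforce each other.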