There are multiple sources for the above press release; by describing Eyefluence as merely an eye-tracking company, they curiously downplay the eye-interaction capabilities of Eyefluence's technology.
This acquisition should enable a big leap forward in Google's virtual reality efforts, incorporating the technology into VR HMDs (head-mounted displays) and possibly supplanting the traditional mouse and keyboard, as well as more recent hand-gesture controls.
I'm gravitating toward TechCrunch and the Eyefluence site itself for the best descriptions of what Eyefluence is doing. Eyefluence is so eloquent about its goals and accomplishments that I'm going to cite it first...
Eyefluence - Eye-Interaction Technology http://eyefluence.com/what-we-do/ [eyefluence.com]
An Eyefluence video describes our current limited state of the art in next-gen devices....
The way we interact with our devices has evolved, yet next generation devices, which use limited methods of control, like head, hands, and voice, are incomplete. These interactions must evolve as well....
Eyefluence technology works with human biology to enable wearers of head-mounted displays to communicate with their device and the world through eye interaction, transforming intent into action simply through your eyes. It's the critical next step that will drive HMDs towards mass adoption.
One comment that jumped out: "the human eyes are the fastest-moving body part"... consistent with Google's penchant for speed. And perhaps, in addition to other benefits, the technology is a good fit with some aspects of Google Glass.
TechCrunch has apparently been following Eyefluence for a while. Here's its article on the acquisition...
Google buys Eyefluence eye-tracking startup - Oct 24, 2016 - by Lucas Matney [techcrunch.com...]
...Eyefluence enables users wearing head-mounted virtual reality or augmented reality glasses to use their eyes as a mouse, making selections only with their eye movements. Eye-tracking has other, more technical use cases, like foveated rendering, which allows high-density displays to selectively render areas of the screen at lower resolution based on where your focus actually is on the display.
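To make the foveated-rendering idea concrete, here's a minimal sketch of how a renderer might assign a resolution tier to each screen tile based on its distance from the tracked gaze point. The function names, tile sizes, and thresholds are illustrative assumptions on my part, not Eyefluence's or Google's actual pipeline.

    # A minimal sketch (not Eyefluence's actual pipeline) of foveated
    # rendering: pick a render-resolution tier per screen tile based on
    # how far the tile is from the tracked gaze point. All names and
    # thresholds here are illustrative assumptions.

    import math

    def resolution_scale(tile_center, gaze_point,
                         fovea_radius=100, periphery_radius=400):
        """Return a render-scale factor for a tile: 1.0 = full resolution."""
        dx = tile_center[0] - gaze_point[0]
        dy = tile_center[1] - gaze_point[1]
        dist = math.hypot(dx, dy)          # pixels from the gaze point
        if dist <= fovea_radius:           # foveal region: full detail
            return 1.0
        if dist <= periphery_radius:       # mid-periphery: half resolution
            return 0.5
        return 0.25                        # far periphery: quarter resolution

    # Example: a 1080p eye buffer divided into 120x120-pixel tiles,
    # with the user's gaze resting near screen center.
    gaze = (960, 540)
    for ty in range(0, 1080, 120):
        row = [resolution_scale((tx + 60, ty + 60), gaze)
               for tx in range(0, 1920, 120)]
        print(row)

The payoff is that only the tiles the fovea can actually resolve get rendered at full cost; the same gaze signal that drives this optimization is what the eyes-as-mouse interaction builds on.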
The article includes a 21-minute video of Matney interviewing Leap Motion's Michael Buckwald and Eyefluence's Jim Marggraff at Disrupt SF in September 2016, on topics including "the future of human-computer interaction".