MILPITAS, Calif., Feb. 11, 2016 /PRNewswire/ -- Eyefluence, the creator and leader of eye-interaction technology for augmented reality (AR), virtual reality (VR) and mixed reality (MR) devices, has begun demoing head-mounted display (HMD) devices controlled solely by eye movements to a small group of users. A handful of thought leaders in the HMD space have experienced a series of applications that illustrate the future of eye-driven interaction among people, computers and information. For HMDs to realize their game-changing potential, they require a truly intuitive user interface with low-power, robust eye-tracking that enables users to interact with their eyes quickly, easily, and naturally - something that has not existed until now.
Eyefluence's vision-driven iUi™ interaction model is the first to go beyond traditional eye-tracking, harnessing natural eye movement and intent to let users do anything with their eyes on smart glasses that they can do with a finger on a smartphone or tablet, but faster and easier.
As the next evolution in human-computer interaction, Eyefluence's technology will transform the way we work, play, think and connect while driving adoption of AR and VR headsets for enterprise, industrial, and consumer applications.
Following a two-minute tutorial, users instantly navigate and interact with consumer and enterprise applications designed for AR, VR and MR environments using only their eyes. With interaction times measured in tens of milliseconds, the power of eye-interaction is apparent, and users comment that the system feels like it's reading their minds. The experiences, demonstrated on ODG's R6 Smartglasses and Oculus's DK2 headset, both retrofitted with Eyefluence's eye-interaction technology, include:
- Texting and SMS messaging with your eyes
- Accessing patient healthcare data, including electronic medical records and X-rays, secured by continuous biometric identification and authentication
- Perusing weather and travel information
- Browsing, zooming and sharing photos
- Purchasing anything you see in the real world in real time with real orders placed
- Searching and spinning a 3D globe
- Panning and zooming to find Waldo
- Whacking moles first with your head, then with your eyes, to internalize the speed of eye-interaction
- Exploring and interacting with a virtual room full of information screens as fast as you can think and look
"In more than two decades of developing interaction models for platforms adopted by hundreds of millions of people, including the LeapFrog LeapPad and Livescribe Smartpen, Eyefluence's iUi™ is the most natural, intuitive, and easily-learned method of human-computer interaction I've seen," said Eyefluence Founder, CEO and serial inventor Jim Marggraff. "Users simply think and look, without waiting or winking, jabbing or poking, pointing or clicking, to perform actions and communicate more quickly than researchers thought was humanly possible. This is a pivotal moment for our company, and for the HMD industry as a whole, that will accelerate the adoption of AR and VR hardware and experiences, with our technology deployed in forthcoming headsets. All HMDs are fundamentally incomplete without eye-interaction, and all will be enabled with eye-interaction technology in the future."
Eyefluence is currently partnering with product development teams within leading consumer electronics companies and HMD manufacturers to provide an entire eye-interaction solution for next-generation VR, AR and MR devices, including hardware design, licensing of a proprietary suite of robust eye-tracking algorithms, and a groundbreaking vision-driven iUi™ interaction model with an SDK and APIs.
As the lead strategic investor in Eyefluence's Series B funding round, Motorola Solutions sees potential for eye-interaction to drive a seismic shift in the way people interact with technology. Motorola Solutions is working with Eyefluence on integrating eye-interaction into its innovative "smart public safety" applications to create safer cities around the world.
"Imagine if police officers could get information on an unfolding crime scene without visibly moving a muscle," said Paul Steinberg, chief technology officer, Motorola Solutions. "Instead, they would use only the motion of their eyes behind glasses, leaving their heads up and their hands free to manage the scene and take quick action."
Eyefluence is currently engaged in development with leading Fortune 500 companies as well as emerging companies developing AR, VR, and MR hardware and experiences.
For more information, please visit www.eyefluence.com.
Eyefluence transforms intent into action through your eyes. The company is led by successful serial entrepreneurs Jim Marggraff and David Stiehr and includes a cross-disciplinary team of experts in UX design, computer vision, image processing, optics, physics, math, artificial intelligence and machine learning, as well as electrical engineering, mechanical engineering, industrial design, and computer science. Eyefluence's technology is based on an IP portfolio of more than 30 patents granted or pending. The company is engaged in development with leading Fortune 500 companies, emerging companies, and device manufacturers working to accelerate the wide adoption of smart glasses, including VR, AR and MR headsets for enterprise, industrial, government, education, and consumer applications. For more information, please visit www.eyefluence.com and follow us at @Eyefluence.