New gaze-tracking tool lets you control your smartphone with your eyes
Researchers at Carnegie Mellon University's Human-Computer Interaction Institute (HCII) are developing a new tool that allows users to control their smartphones by combining gaze control and motion gestures.
Dubbed EyeMU, the new gaze-tracking tool enables mobile device users to interact with their screens without lifting a finger.
Andy Kong, a senior majoring in computer science and lead author of the paper describing the findings, found commercial eye-tracking technologies costly. So he wrote a program that used a laptop's built-in camera to track the user's eyes and move a cursor around the screen, an important early step toward EyeMU.
Kong and Karan Ahuja, a doctoral student in human-computer interaction, advanced that early prototype by using Google's Face Mesh tool to study the gaze patterns of users. In the next step, the researchers developed a gaze predictor that uses the smartphone's front-facing camera to lock in what the viewer is looking at and register it as the target.
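The article doesn't detail the predictor's internals, but a gaze predictor of this kind can be sketched as a calibrated mapping from eye-landmark coordinates (such as those Face Mesh produces) to a screen position. The feature choice and the per-axis linear calibration below are illustrative assumptions, not EyeMU's actual model:

```python
# Minimal sketch: map 2-D eye-landmark features to a screen point via a
# per-axis linear fit learned from a few calibration samples.
# The features and calibration scheme are illustrative assumptions,
# not the model described in the EyeMU paper.

def fit_axis(features, targets):
    """Least-squares fit of targets = a * features + b for one screen axis."""
    n = len(features)
    mean_x = sum(features) / n
    mean_y = sum(targets) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(features, targets))
    var = sum((x - mean_x) ** 2 for x in features)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

class GazePredictor:
    def __init__(self, samples):
        # samples: list of ((eye_x, eye_y), (screen_x, screen_y)) pairs
        # collected during a short calibration phase.
        ex = [s[0][0] for s in samples]
        ey = [s[0][1] for s in samples]
        sx = [s[1][0] for s in samples]
        sy = [s[1][1] for s in samples]
        self.x_fit = fit_axis(ex, sx)
        self.y_fit = fit_axis(ey, sy)

    def predict(self, eye_x, eye_y):
        """Return the estimated on-screen gaze point for one landmark frame."""
        a, b = self.x_fit
        c, d = self.y_fit
        return (a * eye_x + b, c * eye_y + d)
```

In practice the calibration samples would come from asking the user to look at known on-screen dots while the front-facing camera records landmark positions; the sketch just shows the shape of the mapping.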
By combining the gaze predictor with the smartphone's built-in motion sensors, the team made the tool more capable. For instance, a user could look at a notification long enough to lock it in as a target, then flick the phone to the left to dismiss it or to the right to respond.
"Current phones only respond when we ask them for things, whether by speech, taps or button clicks," Kong said. "If the phone is widely used now, imagine how much more useful it would be if we could predict what the user wanted by analyzing gaze or other biometrics."
The team's findings were presented by Kong, Ahuja, Chris Harrison, an associate professor in the HCII and director of the Future Interfaces Group at CMU, and Assistant Professor of HCII Mayank Goel at last year's International Conference on Multimodal Interaction.