It's perfectly paired with small motion gestures
Although a handful of small phones still exist, it's no secret that the entire industry has moved into a "bigger is always better" mindset. A large screen can be handy when editing documents on the go or watching a downloaded movie on an airplane, but reaching every part of that screen usually takes two hands. Researchers at Carnegie Mellon University's Future Interfaces Group have created a potential solution to this ever-growing problem.
Unlike most accessibility projects, this particular experiment started with the goal of making smartphone control easier for everyone. According to a press release from CMU's Computer Science department (via TechCrunch), the team began by asking if there was "a more natural mechanism to use to interact with the phone." Obviously, your finger makes the most sense: you interact directly with icons and controls on the screen. Every extra input device we've seen over the years, such as styluses, has worked the same way.
Instead, the Future Interfaces Group team decided to work on eye tracking. It's not a new experiment, but judging by the hands-on video released last fall, it's one of the most successful to date. With EyeMU, the user can select and open notifications, return to previous apps and even select specific images. These gaze movements are paired with motions of the phone itself: flicking it left or right, or moving it closer to or farther from your face. It's better seen in action than described, so check out the clip below for a complete demo.
The biggest difference between this approach and older examples of the same technology comes down to restraint. The team knew that you can't have an action paired with every single glance; otherwise, how would you ever get anything done? This is where the motion sensors come in, acting as confirmation prompts whenever you want to select, move or dismiss something.
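To make that division of labor concrete, here's a minimal sketch in Kotlin of how a gaze-plus-gesture controller could behave. The event types, gesture names and resulting actions are all illustrative assumptions; the article doesn't describe EyeMU's actual code.

```kotlin
// Minimal sketch: gaze nominates a target, a phone gesture commits the action.
// Event types, gesture names and actions are illustrative assumptions,
// not EyeMU's actual API.

sealed class Event
data class Gaze(val target: String) : Event()      // what the eyes are resting on
data class Motion(val gesture: Gesture) : Event()  // reported by the motion sensors

enum class Gesture { FLICK_LEFT, FLICK_RIGHT, PULL_CLOSER, NONE }

class GazePlusGestureController {
    private var candidate: String? = null

    fun handle(event: Event) {
        when (event) {
            is Gaze -> candidate = event.target    // looking only *selects* a candidate
            is Motion -> confirm(event.gesture)    // moving the phone *commits* an action
        }
    }

    private fun confirm(gesture: Gesture) {
        val target = candidate ?: return           // nothing under the gaze, nothing to do
        when (gesture) {
            Gesture.FLICK_LEFT  -> println("dismiss $target")
            Gesture.FLICK_RIGHT -> println("open $target")
            Gesture.PULL_CLOSER -> println("zoom into $target")
            Gesture.NONE        -> Unit            // a plain glance never triggers anything
        }
    }
}

fun main() {
    val ui = GazePlusGestureController()
    ui.handle(Gaze(target = "notification"))       // user looks at a notification
    ui.handle(Motion(Gesture.FLICK_RIGHT))         // flicks the phone right to open it
}
```

The key design choice this models is that a glance only ever nominates a candidate; nothing actually happens until the motion sensors report a deliberate gesture.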
Clearly, this experiment is still very much a demo of what might come in the future. Who knows; maybe the Pixel 11 will be powered by similar technology.