Gestures Are the Next Big Thing in UI

For the most part, user interfaces today require users to do all the work, explicitly telling their devices what to do at every step. But as technology evolves to let people control devices with gestures alone, that’s changing, and working with technology is becoming more natural and more effective. For example, in the near future it may be possible to refuse a call simply by looking at a ringing phone and shaking your head. If you’ve stopped paying attention or left the room, an app might pause to await your return.

More Than Finger Swipes

Since the first iPhone appeared in 2007, gestures have played an increasing role in user interface design. Finger swipes on a screen or touchpad, for example, are common elements in many mobile UIs. Then there are remote gestures, where users don’t physically touch the input device; these tend to be body movements rather than just a finger or hand motion. The Wii game console, introduced in 2006, was a remote-gesture pioneer, followed by Microsoft’s Kinect in 2010. The Wii is well known for its balance board accessory, which allows users to control games and fitness applications through body motions and weight shifts.

“Gestures have a couple advantages and one big disadvantage,” says Rob Enderle, principal analyst at the Enderle Group in San Jose, Calif. “The advantages have to do with doing things more quickly. You can with a gesture tell a computer to do any variety of things that might otherwise require a complex series of commands to execute.”

What It Means for Your Job

Software and hardware engineers should consider how this fledgling shift to gestures might influence their career paths. While most workers won’t be developing new hardware and software from scratch, as controller maker Leap Motion has done, gesture technology will require creative thinking to make users’ lives easier. “The power of gesture control comes from [the] incredible intuitiveness of using real-life gestures to communicate to a device. Nothing beats that,” says Elia Kanaki, owner of Rossul Design, a Toronto-based UI/UX design firm.

However, as promising as the technology may be, it may not be adopted across the entire tech world, at least not in the short term. “Gestures are interesting in PCs since [they add] a new dimension to the UI, but I don't see them as a mainstream UI feature for traditional PCs,” says Tim Bajarin, CEO of San Jose-based analysis firm Creative Strategies. “I believe its biggest impact will be on the TV user experience. We need better ways to deal with smart TV navigation from an eight- to 10-foot reach. If someone can nail precise gesturing for TV navigation, it could have a big impact on the future of smart TVs. And, of course, it has great potential for gaming.”

Enderle notes that gestures could help anyone control a large display without requiring a touchscreen. “On a large monitor, gestures are actually better, because you tend to sit too far from the monitor to make touch convenient,” he observes.

‘Devices Learn Us’

“Remote and motion gestures are coming to mobile devices, and it's going to be big,” Kanaki predicts. “Gesture-based controls allow devices to be smarter and react to our behavior, stopping a movie when we look away from the screen, for instance. There is no learning curve for that. Devices learn us.” Kanaki imagines that a gesture such as looking at the phone and slightly shaking your head could refuse a call. “We're only at the beginning of what can happen with gesture UI,” he says.

On the other hand, Bajarin sees limited opportunity for gesture UIs in the mobile world. “It’s cross-platform for games and TV/entertainment but doesn’t work well with mobile and probably never will,” he believes. “With mobile it can enable some apps, but it’s not good for the majority of apps out there.”

And for all their promise, gesture-based UIs might confuse some users, especially those who are used to touchscreens. “The one big problem is existing computer users have to learn them, and much like learning another language, they may find this tedious and not do it,” Enderle suggests. “So they work best for folks that are just learning and initially learn them when they start with a device, and don't translate well with people who already think themselves proficient.”
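
As a rough sketch of the pattern Kanaki describes, the Kotlin example below maps a few hypothetical gesture events (a glance away from the screen, a head shake at a ringing phone) to application actions. The GestureEvent types and the VideoPlayer and CallController interfaces are invented for illustration only and do not correspond to any real gesture SDK.

```kotlin
// Hypothetical sketch: routing remote-gesture events to app actions.
// None of these types come from a real gesture SDK; they only illustrate the idea.

sealed class GestureEvent {
    object GazeAway : GestureEvent()   // user looked away from the screen
    object GazeReturn : GestureEvent() // user looked back
    object HeadShake : GestureEvent()  // user shook their head at the device
}

interface VideoPlayer {
    fun pause()
    fun resume()
}

interface CallController {
    fun declineIncomingCall()
}

class GestureHandler(
    private val player: VideoPlayer,
    private val calls: CallController,
) {
    // Each recognized gesture maps directly to one app action, with no extra commands.
    fun onGesture(event: GestureEvent) = when (event) {
        GestureEvent.GazeAway -> player.pause()               // stop the movie when the viewer looks away
        GestureEvent.GazeReturn -> player.resume()            // resume when they look back
        GestureEvent.HeadShake -> calls.declineIncomingCall() // refuse a ringing call
    }
}

fun main() {
    val handler = GestureHandler(
        player = object : VideoPlayer {
            override fun pause() = println("Video paused")
            override fun resume() = println("Video resumed")
        },
        calls = object : CallController {
            override fun declineIncomingCall() = println("Call declined")
        },
    )
    handler.onGesture(GestureEvent.GazeAway)  // prints "Video paused"
    handler.onGesture(GestureEvent.HeadShake) // prints "Call declined"
}
```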

Jobs in Gesture

While user interfaces are hot right now, user experience design is starting to overtake them, Kanaki says. “It's simply not enough anymore to just arrange UI elements or come up with new gestures. Advanced hardware allows us to design rich experiences with optimized workflows, which is so much more than just UI.” In today's environment, he says, designing requires knowledge of psychology, human behavior, visual information processing, cognitive processes and related fields. “As the public continues to acquire the necessary skills to interact with their devices and is willing to try new ways of interaction, another ‘usability’ wave like the one we had with the Web six to eight years ago will show up. This will create a need for standardization, and it's only a couple of years away.”

Image: U.S. Navy