Computers Will Marry Human Senses with Big Data: IBM

IBM has offered a prediction for how computers will evolve over the next five years: they’re going to become more like you. They’ll increasingly adopt facsimiles of human senses—including taste and hearing—and use that data in conjunction with enormous databases, the better to provide more sophisticated insights into the world around us.

Does that sound like far-fetched science fiction? Sure. But ultra-sophisticated systems such as IBM’s Watson already represent a turning point in how people interact with machines. “While Watson can understand all manner of things and learns from its interactions with data and humans,” Bernard Meyerson, IBM’s chief innovation officer, wrote in a Dec. 17 posting, “it is just a first step into a new era of computing that’s going to produce machines that are as distinct from today’s computers as those computers are from the mechanical tabulating devices that preceded them.”

Emerging technologies will help transform how people interact with systems such as Watson. “One of the most intriguing aspects of this shift is our ability to give machines some of the capabilities of the right side of the human brain,” Meyerson added. “New technologies make it possible for machines to mimic and augment the senses. Today, we see the beginnings of sensing machines in self-parking cars and biometric security, and the future is wide open.”

To that end, IBM’s researchers believe that computers, over the next five years, will increasingly mimic touch, sight, hearing, taste, and smell:

Touch: Screens on smartphones and other mobile devices could imitate texture, allowing people to, say, “reach through” the display and “touch” clothing or other materials. Indeed, technology is already making steps in this direction: at last year’s CEATEC conference in Tokyo, at least one vendor demonstrated technology that could alter touch-screen feedback to simulate different textures.

Sight: Computer systems will become more adept at recognizing visual patterns, with applications in biometric security and healthcare IT. A properly configured system could scan an MRI, for example, and compare any microscopic abnormalities against a database of patient history, better alerting a physician to early signs of a disease.
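The comparison IBM describes is, at its core, a similarity search: reduce an abnormality to a set of features and find the closest match in historical records. A minimal sketch of that idea, with entirely hypothetical feature vectors and labels (none of this comes from IBM’s actual system):

```python
import math

# Hypothetical historical records: each known abnormality reduced to
# simple numeric features (size in mm, density, irregularity score).
history = {
    "benign_cyst":  [4.0, 0.2, 0.1],
    "early_lesion": [3.5, 0.7, 0.8],
}

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def closest_match(features):
    """Return the historical label whose features are nearest."""
    return min(history, key=lambda label: euclidean(history[label], features))

# A new scan's features land closest to the "early_lesion" record.
print(closest_match([3.6, 0.65, 0.75]))
```

A production system would use far richer features and learned similarity measures, but the nearest-neighbor lookup is the same basic shape.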

Hearing: More sophisticated sensors will allow machines to hear the world with deeper comprehension and clarity, which could significantly advance voice recognition and other technology subfields.

Taste: This one doesn’t mean dipping your iPhone into a bowl of dip and expecting to get the recipe; instead, IBM’s researchers believe that systems will be able to use algorithms to perfect nutrition regimens and recipes.
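“Using algorithms to perfect recipes” can be read as an optimization problem: pick ingredients that maximize a nutrition score under some constraint. A toy sketch, with made-up ingredient data and a brute-force search (real systems would model flavor chemistry, not just calories):

```python
from itertools import combinations

# Hypothetical ingredient data: (name, calories, nutrition score).
ingredients = [
    ("spinach", 25, 9), ("salmon", 200, 8),
    ("rice", 180, 4), ("butter", 150, 1),
]

def best_combo(max_calories):
    """Brute-force the ingredient subset with the highest total
    nutrition score that stays within the calorie budget."""
    best, best_score = (), -1
    for r in range(1, len(ingredients) + 1):
        for combo in combinations(ingredients, r):
            calories = sum(c for _, c, _ in combo)
            score = sum(s for _, _, s in combo)
            if calories <= max_calories and score > best_score:
                best, best_score = combo, score
    return [name for name, _, _ in best]

print(best_combo(300))  # spinach + salmon fits the budget with the top score
```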

Smell: This falls into the “more sophisticated sensors” category: detectors capable of reading chemicals or molecules in the air, combined with a database, could give workers the ability to pinpoint and solve problems related to sanitation systems or even security.
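The sanitation and security scenarios boil down to anomaly detection: compare sensor readings against a baseline database and flag outliers. A minimal sketch, with invented chemicals, baseline values, and threshold (the 3x factor is purely illustrative):

```python
# Hypothetical baseline concentrations (parts per million) for an
# air-quality sensor; flag any chemical reading over 3x its baseline.
baseline_ppm = {"ammonia": 0.5, "methane": 2.0, "h2s": 0.1}

def flag_anomalies(reading_ppm, factor=3.0):
    """Return the chemicals whose reading exceeds factor * baseline."""
    return sorted(
        chem for chem, value in reading_ppm.items()
        if value > factor * baseline_ppm.get(chem, float("inf"))
    )

# Methane at 9.5 ppm exceeds 3 * 2.0 = 6.0 ppm and gets flagged.
print(flag_anomalies({"ammonia": 0.6, "methane": 9.5, "h2s": 0.05}))
```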

Other researchers and executives have predicted that our data systems will become much more sophisticated at interacting with the larger world. In his predictions for 2013, for example, Information Builders CEO Gerry D. Cohen suggested that voice recognition will play a bigger role in business intelligence applications over the next several quarters.

However, as a number of pundits have pointed out, even systems as sophisticated as Watson are only as good as their human trainers, who are necessary to translate, encode, schematize, and plot out the information that eventually finds its way into any system. No matter how sophisticated our computers become, they’ll still need humans—at least in the short term.


Image: Ociacia/Shutterstock.com
