IBM believes technology’s future lies in cognitive computing, which essentially means making computers think more like humans do. To IBM, that includes giving computers sensors that enable them to touch, see, hear, taste and smell, treating sensory input as one more piece of the puzzle in solving problems.
IBM’s Five in Five
IBM’s progress toward cognitive computing is seen in the company’s annual end-of-year predictions. Rather than its usual practice of prognosticating on where five technologies will be in five years, this year’s Five in Five focuses on innovations that make it possible for computers to experience each of the five senses.
The projections mark the very beginning of what will be a long journey toward cognitive computing. The first step in building machines able to behave, think and interact like humans is to give them the same sensory abilities. That way computers can understand their environment, learn from it and act on it. For example, if a robot could hear a train’s whistle and feel the vibration in the tracks, it might figure out that a locomotive is coming and get out of the way.
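To make that example concrete, here is a minimal sketch, in Python, of how such a sensor-fusion decision might look. The function name, thresholds and units are hypothetical illustrations, not anything IBM has published.

```python
# A minimal sketch of the train example: fuse two sensor readings to decide
# whether a locomotive is approaching. All names and thresholds here are
# invented for illustration.

def train_approaching(whistle_loudness_db: float, rail_vibration_g: float) -> bool:
    """Combine hearing (whistle) and touch (track vibration) into one decision."""
    heard_whistle = whistle_loudness_db > 70.0   # assumed loudness threshold
    felt_vibration = rail_vibration_g > 0.05     # assumed vibration threshold
    # Requiring both cues makes the inference more robust than either alone.
    return heard_whistle and felt_vibration

print(train_approaching(82.0, 0.09))  # True: both senses agree, move off the tracks
```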
The Five Senses
Touch: IBM expects smartphones and tablets to communicate using haptics, nonverbal communication that enables people to experience how an object feels. Haptic feedback is already used for many things, including providing tactile sensation when typing on a glass keyboard, but that’s only the beginning. Eventually, a device browsing an ecommerce site could vibrate to simulate the feel of a fabric’s weave, for example.
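As a rough illustration of how a weave could become a buzz, the sketch below maps a fabric’s thread density to a vibration pattern. The mapping is invented for illustration; the on/off millisecond lists are loosely modeled on the pattern format real haptics APIs (such as Android’s Vibrator) accept.

```python
# Hedged sketch of the fabric-feel idea: translate a fabric's weave into a
# vibration pattern a phone could play back. The mapping from thread density
# to pulse timing is a made-up heuristic, not a published IBM technique.

def weave_to_pattern(threads_per_cm: float, pulses: int = 8) -> list[int]:
    """Denser weaves -> shorter, faster pulses (a 'smoother' buzz)."""
    pulse_ms = max(5, int(200 / threads_per_cm))  # assumed mapping
    gap_ms = pulse_ms // 2
    return [pulse_ms, gap_ms] * pulses            # on/off durations in ms

silk = weave_to_pattern(threads_per_cm=40)   # short, rapid pulses
burlap = weave_to_pattern(threads_per_cm=4)  # long, coarse pulses
print(silk[:4], burlap[:4])                  # [5, 2, 5, 2] [50, 25, 50, 25]
```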
Sight: Vision will get an upgrade, too. IBM believes computers will be able to identify images and understand what they mean without the use of tags. This will lead to systems that can help doctors analyze X-rays, magnetic resonance imaging (MRI) scans, ultrasound images or computerized tomography (CT) scans.
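One way to identify images without tags is to compare an image’s extracted features against labeled references and pick the nearest match. The toy sketch below does exactly that, with made-up three-number feature vectors standing in for what a real vision model would produce.

```python
# Sketch of tag-free image recognition via nearest-neighbor matching.
# The reference vectors and labels are synthetic placeholders.

import math

REFERENCES = {
    "healthy tissue": (0.9, 0.1, 0.2),
    "anomaly":        (0.2, 0.8, 0.7),
}

def classify(features: tuple[float, float, float]) -> str:
    """Return the label whose reference vector is closest to the input."""
    return min(REFERENCES,
               key=lambda label: math.dist(features, REFERENCES[label]))

print(classify((0.3, 0.7, 0.6)))  # -> "anomaly": closest reference wins
```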
Hearing: There will also be improvements in computers’ ability to hear and understand sound. Greater sensitivity to sound pressure, vibrations and waves could lead to more-accurate landslide warnings, for example.
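A crude version of that idea is anomaly detection on a vibration signal: compute the signal’s energy and alert when it climbs well above a quiet baseline. The samples and the sensitivity factor below are synthetic placeholders.

```python
# Sketch of the landslide-warning idea: flag sustained ground vibration
# whose RMS energy far exceeds a quiet baseline. All numbers are invented.

def rms(samples: list[float]) -> float:
    """Root-mean-square amplitude of a signal window."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

baseline = [0.01, -0.02, 0.015, -0.01]   # quiet hillside
rumble   = [0.4, -0.5, 0.45, -0.6]       # sustained low-frequency shaking

ALERT_FACTOR = 10.0                       # assumed sensitivity setting
if rms(rumble) > ALERT_FACTOR * rms(baseline):
    print("Landslide warning: ground vibration well above baseline")
```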
Taste: Computers with virtual taste buds will be able to calculate flavor, according to IBM, helping chefs improve recipes or create new ones. The systems will break down ingredients into their constituent chemicals and calculate how those chemicals interact with neural sensors in a person’s tongue and nose.
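The underlying heuristic can be sketched simply: represent each ingredient as a set of flavor compounds and score a pairing by how many compounds the two ingredients share. The compound lists here are illustrative, not real chemistry data.

```python
# Sketch of flavor-by-chemistry: score an ingredient pairing by shared
# flavor compounds. The compound sets below are illustrative stand-ins.

COMPOUNDS = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "basil":      {"linalool", "eugenol", "furaneol"},
    "beef":       {"pyrazine", "furan", "hexanal"},
}

def pairing_score(a: str, b: str) -> int:
    """More shared compounds -> (by this heuristic) a more harmonious pair."""
    return len(COMPOUNDS[a] & COMPOUNDS[b])

print(pairing_score("strawberry", "basil"))  # 2 shared: linalool, furaneol
print(pairing_score("strawberry", "beef"))   # 1 shared: hexanal
```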
Smell: And, finally, according to IBM, computers will have an acute sense of smell, able to diagnose an oncoming cold, liver and kidney disorders, diabetes or tuberculosis from a person’s breath. Similar to how a Breathalyzer detects alcohol, the computer will check for molecular biomarkers that point to disease.
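Conceptually, that is a thresholding problem: measure the concentration of known biomarkers in a breath sample and flag any that exceed a screening level. Everything in the sketch below, from the molecule choices to the numbers, is a simplified assumption.

```python
# Sketch of the breath-analysis idea: compare measured biomarker
# concentrations (parts per billion) against screening thresholds, much as
# a Breathalyzer checks for alcohol. All values are invented.

THRESHOLDS_PPB = {
    "acetone": 900.0,   # elevated acetone is associated with diabetes
    "ammonia": 750.0,   # elevated ammonia with kidney disorders
}

def screen(breath_sample_ppb: dict[str, float]) -> list[str]:
    """Return the biomarkers that exceed their screening threshold."""
    return [m for m, level in breath_sample_ppb.items()
            if level > THRESHOLDS_PPB.get(m, float("inf"))]

flags = screen({"acetone": 1500.0, "ammonia": 120.0})
print(flags)  # ['acetone'] -> a prompt for follow-up testing, not a diagnosis
```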
IBM believes it can blend these sensory innovations with mobile devices, cloud computing and social media to create “an unbounded set of possibilities in terms of what we can do,” Kerrie Holley, an IBM research fellow, told ReadWrite.
How We’ll Use Cognitive Computing
In time, cognitive computing will be able to do what people don’t do well, such as understand the interactions of changing elements in huge systems like the global economy or weather patterns. With the help of a thinking, sensory-aware machine, we’ll be able to cut through the complexity of these systems, make more-accurate predictions and anticipate the consequences of particular actions.
In addition, cognitive systems can help us separate our personal prejudices and egos from the facts when we try to solve a problem.
“The machines will be more rational and analytic,” Bernard Meyerson, chief innovation officer at IBM, said in a blog post. “We’ll provide the judgment, empathy, moral compass and creativity.”
Think of it this way: The human-digital relationship will mirror the extraordinarily effective partnership enjoyed by Captain Kirk and Mr. Spock on Star Trek.