
IBM: In the next 5 years computers will learn, mimic the human senses

IBM researchers say cognitive systems will be capable of learning from their interactions with data and humans

By , Network World
December 17, 2012 12:14 PM ET

Network World - IBM today issued its seventh annual look at what Big Blue researchers think will be the five biggest technologies of the next five years. In past editions of the prediction package, known as "IBM 5 in 5," the company has had some success predicting the future of password protection, telemedicine and nanotechnology.

The IBM 5 in 5 research is based on collective trends as well as emerging technologies from IBM's R&D labs around the world. This year's research points to the development of what IBM calls a new generation of machines that will learn, adapt, sense and begin to experience the world as humans do through hearing, sight, smell, touch and taste.


"Just as the human brain relies on interacting with the world using multiple senses, by bringing combinations of these breakthroughs together, cognitive systems will bring even greater value and insights, helping us solve some of the most complicated challenges," writes Bernie Meyerson, IBM fellow and VP of innovation.

Your computer will reach out and touch somebody: According to IBM, in five years, industries such as retail will be transformed by the ability to "touch" a product through your mobile device. IBM says its scientists are developing applications for the retail, healthcare and other sectors that use haptic, infrared and pressure-sensitive technologies to simulate touch, such as the texture and weave of a fabric, as a shopper brushes a finger over the image of the item on a device screen. Using the vibration capabilities of the phone, every object will be given a unique set of vibration patterns that represents the touch experience: short, fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material, IBM says.
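IBM has not described a concrete implementation of this idea, but the notion of encoding texture as a vibration pattern can be sketched in a few lines. The material names and pulse values below are hypothetical placeholders chosen for illustration, not anything IBM has published.

```python
# Hypothetical sketch: map a material to a vibration pattern a phone's haptic
# engine could play back. Patterns are (vibrate_ms, pause_ms) pairs; the
# specific values are illustrative guesses, not IBM data.
VIBRATION_PATTERNS = {
    "silk":   [(10, 5)] * 8,              # short, fast pulses for a smooth feel
    "linen":  [(40, 20)] * 4,             # longer, stronger pulses for a coarse weave
    "cotton": [(25, 10), (50, 25)] * 3,   # mixed pulses for an in-between texture
}

def pattern_for(material: str) -> list[tuple[int, int]]:
    """Return the vibration pattern for a material, defaulting to a flat buzz."""
    return VIBRATION_PATTERNS.get(material, [(100, 0)])

if __name__ == "__main__":
    for fabric in ("silk", "linen", "cotton"):
        print(fabric, pattern_for(fabric))
```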

Can you see me now? Within the next five years, IBM researchers think computers will not only be able to look at images, but also help us understand the 500 billion photos we're taking every year (roughly 70 photos for each person on the planet). In the future, "brain-like" capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media.
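The low-level features the researchers mention (color, texture, edges) are straightforward to compute today. The snippet below is a minimal, library-agnostic sketch using NumPy on a synthetic image; it stands in for whatever feature extraction IBM's systems actually perform.

```python
import numpy as np

# Minimal sketch of the features mentioned above: a per-channel color
# histogram and an edge-strength map. A 64x64 random "image" stands in
# for a real photo.
rng = np.random.default_rng(0)
image = rng.random((64, 64, 3))          # H x W x RGB, values in [0, 1]

# Color: an 8-bin histogram per channel summarizes overall color content.
color_hist = [np.histogram(image[..., c], bins=8, range=(0, 1))[0]
              for c in range(3)]

# Edges: gradient magnitude of the grayscale image highlights boundaries.
gray = image.mean(axis=2)
gy, gx = np.gradient(gray)
edge_strength = np.hypot(gx, gy)

print("per-channel color histograms:", [h.tolist() for h in color_hist])
print("mean edge strength:", float(edge_strength.mean()))
```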

One of the challenges of getting computers to "see" is that traditional programming can't replicate something as complex as sight. But by taking a cognitive approach and showing a computer thousands of examples of a particular scene, the system can start to detect the patterns that matter, whether in a scanned photograph uploaded to the web or in video footage taken with a camera phone, IBM says.
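As a rough illustration of that example-driven approach (and not IBM's actual method), the sketch below learns from labeled feature vectors with a simple nearest-centroid rule: each class is summarized by the average of its examples, and a new image's features are assigned to whichever average is closest.

```python
import numpy as np

# Toy sketch of learning by example: synthetic "feature vectors" (e.g. the
# color/edge summaries computed earlier) stand in for real image descriptors.
rng = np.random.default_rng(1)
beach  = rng.normal(loc=[0.8, 0.2, 0.1], scale=0.05, size=(1000, 3))
forest = rng.normal(loc=[0.2, 0.7, 0.3], scale=0.05, size=(1000, 3))

# "Training" a nearest-centroid classifier is just averaging each class's examples.
centroids = {"beach": beach.mean(axis=0), "forest": forest.mean(axis=0)}

def classify(features: np.ndarray) -> str:
    """Assign a feature vector to the class whose centroid is nearest."""
    return min(centroids, key=lambda label: np.linalg.norm(features - centroids[label]))

print(classify(np.array([0.75, 0.25, 0.12])))   # -> "beach"
print(classify(np.array([0.22, 0.65, 0.28])))   # -> "forest"
```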

Within five years, these capabilities will be put to work in healthcare, making sense of massive volumes of medical information such as MRIs, CT scans, X-rays and ultrasound images to capture insights tailored to particular anatomy or pathologies. What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images -- such as differentiating healthy from diseased tissue -- and correlating that with patient records and scientific literature, systems that can "see" will help doctors detect medical problems with far greater speed and accuracy, IBM says.
