IBM today said it will install a Watson cognitive computing system at Rensselaer Polytechnic Institute, making the school one of the first universities to receive such a system.
With 15 terabytes of hard disk storage, the Watson system at Rensselaer will store more information than the system that famously won on the Jeopardy! television show in 2011, and it will let 20 users access its computing power at once. Rensselaer was one of eight universities that worked with IBM on the development of the initial version of Watson that competed on Jeopardy!. IBM and the state of New York are also partners in Rensselaer's supercomputing center, the Computational Center for Nanotechnology Innovations.
The idea is that Rensselaer faculty and students will further sharpen Watson's reasoning and cognitive abilities, while broadening the volume, types, and sources of data Watson can draw upon to answer questions, IBM said. Rensselaer researchers will look for ways to use Watson to drive new applications in everything from improved word-recognition algorithms to image handling, IBM said.
Since 2011, Watson has been deployed in a number of fields but has developed expertise primarily in healthcare, where IBM is collaborating with medical providers, hospitals and physicians to improve the accuracy of diagnoses and treatments. IBM is also exploring financial applications for Watson.
IBM says Watson represents a new generation of machines that will learn, adapt, sense and begin to experience the world as humans do -- through hearing, sight, smell, touch and taste.
In its recent "IBM 5 in 5" prediction study, the company says that within five years industries such as retail will be transformed by the ability to "touch" a product through your mobile device. IBM says its scientists are developing applications for the retail, healthcare and other sectors that use haptic, infrared and pressure-sensitive technologies to simulate touch -- such as the texture and weave of a fabric -- as a shopper brushes a finger over the image of an item on a device screen. Using the phone's vibration capabilities, every object would have a unique set of vibration patterns representing the touch experience: short, fast patterns, or longer and stronger strings of vibrations. The vibration pattern would differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material, IBM says.
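IBM hasn't published the encoding, but the core idea -- each material gets its own vibration signature -- can be sketched roughly like this. The fabric names and pulse values below are invented for illustration, not IBM's actual scheme:

```python
# Illustrative sketch: map materials to distinct vibration "signatures".
# Each signature is a list of (duration_ms, intensity) pulses a phone's
# vibration motor could play back. All values here are made up.

VIBRATION_SIGNATURES = {
    "silk":   [(20, 0.2), (20, 0.2), (20, 0.2)],  # short, fast, gentle pulses
    "linen":  [(80, 0.7), (120, 0.8)],            # longer, stronger pulses
    "cotton": [(50, 0.5), (50, 0.5), (70, 0.6)],  # somewhere in between
}

def signature_for(fabric):
    """Return the vibration pattern a device would play for this fabric."""
    try:
        return VIBRATION_SIGNATURES[fabric]
    except KeyError:
        raise ValueError("no vibration signature defined for %r" % fabric)
```

Because each material's pulse train is distinct, the same finger swipe over two different fabric images produces two different tactile sensations.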
IBM researchers also think computers will not only be able to look at images, but help us understand the 500 billion photos we're taking every year (that's about 78 photos for each person on the planet). In the future, "brain-like" capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media.
One of the challenges of getting computers to "see" is that traditional programming can't replicate something as complex as sight. But by taking a cognitive approach -- showing a computer thousands of examples of a particular scene -- the computer can start to detect the patterns that matter, whether in a scanned photograph uploaded to the web or in video footage taken with a camera phone, IBM says.
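The example-based approach described above can be sketched with a toy nearest-centroid classifier: feed it labeled feature vectors (stand-ins for the color, texture and edge features mentioned earlier), let it average them per class, and match new inputs against those averages. This is a minimal illustration of learning from examples, not IBM's actual method:

```python
# Toy sketch of example-based pattern learning: a nearest-centroid
# classifier over hand-made feature vectors. The "scenes" and features
# are invented stand-ins for real color/texture/edge descriptors.
from statistics import mean

def train(examples):
    """examples: list of (label, feature_vector). Returns per-label centroids."""
    by_label = {}
    for label, vec in examples:
        by_label.setdefault(label, []).append(vec)
    return {label: [mean(col) for col in zip(*vecs)]
            for label, vecs in by_label.items()}

def classify(centroids, vec):
    """Assign vec to the label whose centroid is closest (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], vec))

# Toy "scenes" described by [brightness, edge_density]:
examples = [("beach", [0.9, 0.1]), ("beach", [0.8, 0.2]),
            ("city",  [0.4, 0.9]), ("city",  [0.5, 0.8])]
centroids = train(examples)
print(classify(centroids, [0.85, 0.15]))  # → beach
```

The point of the sketch is the workflow: no rule for "beach" is ever programmed -- the system only sees examples, and the pattern emerges from the data.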
Within five years, these capabilities will be put to work in healthcare, making sense of massive volumes of medical information such as MRIs, CT scans, X-rays and ultrasound images to capture information tailored to particular anatomy or pathologies. What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images -- such as differentiating healthy from diseased tissue -- and correlating that with patient records and scientific literature, systems that can "see" will help doctors detect medical problems with far greater speed and accuracy, IBM says.
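The "trained to discriminate" step can be reduced to a crude one-dimensional sketch: learn an intensity threshold from labeled tissue samples, then flag new measurements that fall on the diseased side. The sample values and the midpoint rule are illustrative assumptions only; real systems learn over far richer image features:

```python
# Crude sketch of learning to separate "healthy" from "diseased" tissue
# from labeled examples, using a single intensity feature. All numbers
# and the midpoint rule are invented for illustration.
from statistics import mean

def fit_threshold(healthy, diseased):
    """Midpoint between the two class means of a 1-D intensity feature."""
    return (mean(healthy) + mean(diseased)) / 2

def looks_diseased(intensity, threshold):
    """Flag a new measurement that sits on the diseased side of the cut."""
    return intensity > threshold

healthy_samples  = [0.10, 0.15, 0.12]
diseased_samples = [0.60, 0.70, 0.65]
t = fit_threshold(healthy_samples, diseased_samples)
print(looks_diseased(0.68, t))  # the high-intensity sample is flagged
```

As in the scene-classification sketch, nothing about "diseased tissue" is hand-coded; the decision boundary comes entirely from the labeled examples the system is shown.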