Stop, look, listen: IBM thinks that by 2017 or so a distributed system of what it calls "clever sensors" will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies. The system will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to our surroundings and measure movements, or the stress in a material, to warn of danger, IBM says.
Raw sounds will be detected by sensors and interpreted much as the human brain interprets them. A system that receives this data will take into account other "modalities," such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and the ability to recognize patterns, IBM says. By learning about emotion and being able to sense mood, systems will pinpoint aspects of a conversation and analyze pitch, tone and hesitancy to help us have more productive dialogues -- improving customer call center interactions, for example, or letting us interact seamlessly across cultures.
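The classification step IBM describes is, at its core, pattern matching over acoustic features. As a rough illustration only -- the sound labels, feature values and simple nearest-centroid approach below are assumptions for the sketch, not IBM's actual system -- it might look like this in Python:

```python
import math

# Hypothetical training examples: each sound is summarized as a feature
# vector of (peak sound pressure, dominant frequency in Hz, vibration
# amplitude). Labels and values are illustrative, not real sensor data.
TRAINING = {
    "tree_cracking": [(0.8, 120.0, 0.9), (0.7, 150.0, 0.8)],
    "birdsong":      [(0.2, 3000.0, 0.1), (0.3, 2800.0, 0.2)],
}

def centroid(vectors):
    """Average each feature across one label's examples."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def classify(sample, training=TRAINING):
    """Assign the label whose feature centroid is nearest to the sample."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    centroids = {label: centroid(vs) for label, vs in training.items()}
    return min(centroids, key=lambda label: dist(sample, centroids[label]))
```

A new reading such as `classify((0.75, 130.0, 0.85))` would land closest to the "tree_cracking" examples; a real system would normalize features and learn from far richer data.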
IBM goes so far as to say that "baby talk" will be understood as a language -- telling parents or doctors what infants are trying to communicate.
Good taste: IBM said its researchers are developing a computing system that experiences flavor, to help chefs create the tastiest and most novel recipes. Such a system breaks ingredients down to the molecular level and blends the chemistry of food compounds with the psychology behind the flavors and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar and dry-cured ham, IBM says.
Specifically, the computer will be able to use algorithms to determine the precise chemical structure of food and why people like certain tastes. These algorithms will examine how chemicals interact with each other, the molecular complexity of flavor compounds and their bonding structure, and use that information, together with models of perception, to predict the taste appeal of flavors.
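One simple way to ground this idea is the "food pairing" notion that ingredients sharing flavor compounds tend to go well together. The compound sets below are invented placeholders and the overlap score is a crude stand-in for IBM's models, but the sketch shows the basic computation:

```python
# Hypothetical ingredient -> flavor-compound sets. A real system would
# pull these from chemical databases; these entries are illustrative.
COMPOUNDS = {
    "roasted_chestnut": {"furaneol", "pyrazine", "vanillin"},
    "cooked_beetroot":  {"geosmin", "furaneol", "pyrazine"},
    "fresh_caviar":     {"trimethylamine", "hexanal"},
}

def pairing_score(a, b, compounds=COMPOUNDS):
    """Jaccard overlap of two ingredients' flavor compounds:
    shared compounds divided by all compounds across both."""
    shared = compounds[a] & compounds[b]
    total = compounds[a] | compounds[b]
    return len(shared) / len(total)

def best_match(ingredient, compounds=COMPOUNDS):
    """Rank the other ingredients by compound overlap with this one."""
    others = [k for k in compounds if k != ingredient]
    return max(others, key=lambda other: pairing_score(ingredient, other))
```

Under these toy data, `best_match("roasted_chestnut")` picks `"cooked_beetroot"` because the two share more compounds than chestnut shares with caviar; the perception models IBM mentions would then weight such candidates by predicted human preference.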
You smell funny: IBM says that during the next five years, tiny sensors embedded in your computer or cell phone will detect if you're coming down with a cold or other illness. By analyzing odors, biomarkers and thousands of molecules in someone's breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy by detecting which odors are normal and which are not. Due to advances in sensor and communication technologies in combination with deep learning systems, sensors can measure data in places never thought possible. For example, computer systems can be used in agriculture to "smell" or analyze the soil condition of crops.
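Deciding "which odors are normal and which are not" is an anomaly-detection problem. A minimal sketch, assuming a single hypothetical biomarker with made-up baseline readings and an arbitrary three-standard-deviation threshold:

```python
import statistics

# Hypothetical baseline readings of one breath biomarker; the values
# and the threshold below are illustrative only.
BASELINE = [1.0, 1.1, 0.9, 1.05, 0.95, 1.0]

def is_abnormal(reading, baseline=BASELINE, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from
    the baseline mean -- the simplest possible anomaly rule."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return abs(reading - mean) / stdev > threshold
```

A real diagnostic system would track thousands of molecules per breath and learn per-patient baselines, but the underlying question is the same: does this reading deviate meaningfully from normal?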
The point isn't to replicate human brains, Meyerson says, and this isn't about replacing human thinking with machine thinking. "Rather, in the era of cognitive systems, humans and machines will collaborate to produce better results -- each bringing their own superior skills to the partnership. The machines will be more rational and analytic. We'll provide the judgment, empathy, moral compass and creativity."