IBM delivers amazement with Watson


It's not just winning Jeopardy that Watson can do. IBM rolls out some science fiction functionality.


IBM's Watson is an amazing beast. First bursting to prominence when it famously won the Jeopardy quiz show, Watson has gone on to more important and interesting tasks. At IBM's InterConnect event last week [disclosure - IBM contributed to my travel and expenses to attend the show], IBM announced a host of new Watson functionality for general developer use.

The new APIs, currently in beta, advance the "human-like" Watson attributes that developers can leverage to deliver rich functionality in their own applications. The new APIs are:

A deeper capability for Tone Analyzer designed to give users better insights about their own tone in a piece of text. Adding to its previous experimental understanding of nine traits across three tones – emotion (negative, cheerful, angry), social propensities (open, agreeable, conscientious) and writing style (analytical, confident, tentative) – Tone Analyzer now analyzes new emotions, including joy, disgust, fear, and sadness, as well as new social propensities, including extraversion and emotional range.

In addition to these new emotions, Tone Analyzer now analyzes entire sentences rather than individual words as before, which should allow for a more nuanced analysis of tone.
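To make that concrete, here is a rough sketch of what calling the beta Tone Analyzer might look like from Python. The endpoint path, version date, and credential scheme shown are placeholder assumptions on my part, not values confirmed in IBM's announcement.

import requests

# Illustrative sketch only: the endpoint path, version date, and credentials
# below are placeholders, not confirmed values from the announcement.
TONE_URL = "https://gateway.watsonplatform.net/tone-analyzer-beta/api/v3/tone"

def analyze_tone(text, username, password):
    """Send a block of text to Tone Analyzer and return the parsed JSON scores."""
    response = requests.post(
        TONE_URL,
        params={"version": "2016-02-11"},   # assumed version date
        auth=(username, password),          # Bluemix service credentials
        headers={"Content-Type": "application/json"},
        json={"text": text},
    )
    response.raise_for_status()
    return response.json()

scores = analyze_tone(
    "I'm thrilled about the launch, but a little worried about the timeline.",
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
)
# The beta service reports emotion, social, and writing-style tone categories.
print(scores)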

In the keynote announcing these APIs, IBM customer Connectidy showcased a solution designed to help users understand and better articulate messages in a dating context. The idea is that potential partners can gain a deeper understanding of the context and subtext of messages from a prospective significant other. While somewhat creepy, and the direct opposite of romance and mystery, the Connectidy example was certainly interesting.

Also being released is an API for Emotion Analysis that uses natural language processing to analyze external content, going beyond a simple positive/negative sentiment analysis. Using this API, content such as customer reviews, surveys, and social media posts can be assessed more deeply.
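For illustration, a call to the Emotion Analysis capability might look something like the following. The AlchemyLanguage-style endpoint, parameter names, and response shape here are my assumptions rather than documented values.

import requests

# Rough sketch: the endpoint and parameter names are assumptions for
# illustration, not documented values from the announcement.
EMOTION_URL = "https://gateway-a.watsonplatform.net/calls/text/TextGetEmotion"

def analyze_emotion(text, api_key):
    """Score a piece of text for emotions such as joy, fear, and sadness."""
    response = requests.post(
        EMOTION_URL,
        data={
            "apikey": api_key,
            "text": text,
            "outputMode": "json",
        },
    )
    response.raise_for_status()
    # Assumed response shape: {"docEmotions": {"joy": 0.7, "sadness": 0.1, ...}}
    return response.json().get("docEmotions", {})

print(analyze_emotion("The checkout flow is painful and I dread using it.", "YOUR_API_KEY"))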

Finally, IBM is releasing an API for Visual Recognition that can be trained to recognize and classify images based on training material. Instead of simply tagging images with a fixed set of classifiers or generic terms, Visual Recognition allows developers to train Watson around custom classifiers for images – the same way users can teach Watson natural language classification – and build apps that visually identify unique concepts and ideas. This means that Visual Recognition is now customizable, with results tailored to each user's specific needs. For example, a retailer might create a tag specific to a style of its pants in the new spring line so it can identify when an image appears in social media of someone wearing those pants.
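Here is a hypothetical sketch of how a developer might train such a custom classifier by uploading zip files of example images. The endpoint, version date, and form field names (including the "spring_pants" class) are illustrative assumptions, not confirmed API details.

import requests

# Illustrative sketch of training a custom classifier; the endpoint, version
# date, and field names are assumptions based on how the beta was described.
VR_URL = "https://gateway-a.watsonplatform.net/visual-recognition/api/v3/classifiers"

def train_classifier(api_key, name, positive_zip, negative_zip):
    """Upload zips of example images to teach Watson a custom visual concept."""
    with open(positive_zip, "rb") as pos, open(negative_zip, "rb") as neg:
        response = requests.post(
            VR_URL,
            params={"api_key": api_key, "version": "2016-05-20"},  # assumed version date
            files={
                "spring_pants_positive_examples": pos,  # hypothetical class name
                "negative_examples": neg,
            },
            data={"name": name},
        )
    response.raise_for_status()
    return response.json()  # expected to include a classifier_id for later use

print(train_classifier("YOUR_API_KEY", "spring-pants", "pants_yes.zip", "pants_no.zip"))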

In addition to these new beta APIs, IBM is incorporating the emotional IQ functionality into its text-to-speech API and releasing it generally. What this means in practice is that computers can now respond vocally to the tone, sentiment, and inflection of the user. One use case discussed at the event was call centers, where automatic voice response systems can now prompt for a response based on a caller's tone and sentiment - no more drone-like IVR systems!
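As a rough sketch, a call-center application might ask the text-to-speech API for an apologetic reading of a reply along these lines. The synthesize endpoint, voice name, and express-as markup below are assumptions about how the expressive capability is exposed, not details from the announcement.

import requests

# Sketch only: the synthesize endpoint, voice name, and express-as markup are
# assumptions about how the expressive text-to-speech capability is exposed.
TTS_URL = "https://stream.watsonplatform.net/text-to-speech/api/v1/synthesize"

def speak_with_empathy(text, username, password, out_path="reply.wav"):
    """Synthesize a spoken reply whose delivery matches an apologetic tone."""
    ssml = '<speak><express-as type="Apology">{}</express-as></speak>'.format(text)
    response = requests.post(
        TTS_URL,
        params={"voice": "en-US_AllisonVoice"},   # assumed expressive voice
        auth=(username, password),
        headers={"Content-Type": "application/json", "Accept": "audio/wav"},
        json={"text": ssml},
    )
    response.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(response.content)

speak_with_empathy("I'm sorry about the wait; let me fix that for you now.",
                   "YOUR_USERNAME", "YOUR_PASSWORD")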

Some of the demos of solutions powered by Watson at the show were pretty mind-bending - the exciting thing here is putting this functionality into the hands of developers building even the simplest of applications. This combination promises solutions that bring the world of science fiction into the here and now.

This article is published as part of the IDG Contributor Network.
