IDG News Service - Wrapping up a three-day run on the "Jeopardy!" game show, IBM's Watson computer has beaten two former champions in a historic match of man versus machine.
The run successfully demonstrated not only that a computer can beat humans at a trivia quiz but, more importantly, that computers can answer questions much as people do, opening up a potentially new form of human-computer interaction.
In the final episode of the prerecorded two-game, three-night match, Watson trounced the competition, amassing US$77,147 in winnings against the two "Jeopardy!" champions it played, Brad Rutter and Ken Jennings. Rutter scored $21,600 and Jennings $24,000. Watson also took the $1 million champion's prize, which IBM will donate to charity.
Produced by Sony Pictures Television, "Jeopardy!" is a long-running U.S. TV game show in which three contestants compete to answer trivia questions, arranged into multiple categories and ordered by increasing difficulty. Contestants typically have about five seconds to answer a question.
IBM researchers spent four years building Watson. The machine can process 80 trillion operations per second (80 teraflops). It runs about 2,800 processor cores and has 16 terabytes of working memory.
Building such a system to play on "Jeopardy!" proved to be an immense project, one far more challenging even than building a chess-playing supercomputer, which IBM did in the late 1990s.
"It's a much different kind of problem. Chess was very challenging for the time due to the mathematics. This was a very different type of program," said Watson lead manager David Ferrucci, at an IBM viewing party held in New York for Wednesday's show. "It's not a finite problem or a well-defined space. You are dealing with ambiguity, and the contextual nature of language."
On the software side, the machine uses Apache Hadoop, a framework for distributed data processing, and Apache UIMA (Unstructured Information Management Architecture), a framework for analyzing unstructured data. Perhaps the most important software, however, is a natural-language processing program called DeepQA that IBM claims can understand a human sentence. This program is what distinguishes Watson from a typical search engine, which can only return a list of results for a set of keywords.
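The distinction between keyword retrieval and question answering can be sketched in a few lines. The corpus, functions, and hard-coded "understanding" below are invented for illustration; this is a toy contrast, not IBM's DeepQA.

```python
# Toy contrast: a search engine returns a list of matching documents,
# while a QA system returns a single answer. Corpus and logic are
# hypothetical illustrations, not Watson's actual data or code.

CORPUS = [
    "Toronto is the largest city in Canada.",
    "Springfield is the capital of Illinois.",
    "Watson was built by IBM researchers over four years.",
]

def keyword_search(query):
    """Search-engine style: return every passage sharing a keyword."""
    words = set(query.lower().split())
    return [p for p in CORPUS if words & set(p.lower().rstrip(".").split())]

def answer(question):
    """QA style: return one best answer string, not a result list.
    Here the 'understanding' is faked with a hard-coded pattern."""
    if "capital of illinois" in question.lower():
        return "Springfield"
    return None

print(keyword_search("capital of Illinois"))   # list of passages
print(answer("What is the capital of Illinois?"))  # a single answer
```

A real QA pipeline replaces the hard-coded pattern with parsing, candidate generation, and evidence scoring, but the contract is the same: an answer in, an answer out, rather than a ranked list of documents.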
The questions were fed to Watson as text; it did not use voice-recognition technology. For these rounds, "Jeopardy!" eschewed questions involving audio or video clips. Watson did, however, deliver its answers in a smooth synthesized voice.
To build a body of knowledge for Watson, the researchers amassed 200 million pages of content, both structured and unstructured, on 4 terabytes of disk storage. When given a question, the software first analyzes it, identifying any names, dates, geographic locations or other entities. It also examines the phrase structure and grammar of the question for hints about what is being asked. It then searches its content for matches and applies about 6 million logic rules to determine the best answers.
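The first stage described above, pulling entities and grammatical hints out of a question, can be sketched with simple heuristics. Everything here (the regexes, the wh-word table, the example question) is an invented toy; Watson's actual analysis is far deeper.

```python
import re

def analyze_question(question):
    """Toy question analysis: extract dates and capitalized names, and
    guess the expected answer type from the question's leading word.
    A rough sketch of the stage the article describes, not DeepQA."""
    analysis = {
        # Four-digit years stand in for date detection.
        "dates": re.findall(r"\b(1[0-9]{3}|20[0-9]{2})\b", question),
        # Runs of capitalized words stand in for name detection; this
        # naive heuristic also picks up the question word itself.
        "names": re.findall(r"\b[A-Z][a-z]+(?: [A-Z][a-z]+)*\b", question),
    }
    # Crude grammatical hint: the wh-word suggests the answer type.
    first_word = question.split()[0].lower()
    analysis["answer_type"] = {
        "who": "person", "where": "place", "when": "date",
    }.get(first_word, "thing")
    return analysis

print(analyze_question("Where was Abraham Lincoln born in 1809?"))
```

The output would feed the next stages: the entities narrow the search over the content store, and the answer type filters which candidate answers the logic rules are allowed to score highly.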