Network World - IBM's Jeopardy-playing supercomputer can now beat human Jeopardy contestants on a regular basis, but it has a ways to go before it takes on the likes of 74-time champion Ken Jennings.
IBM announced plans to build a computer that can win on Jeopardy last April, and expects to stage a public tournament involving human players and the machine within the next year or so.
The question-answering system, nicknamed "Watson", is already doing trial runs against people who have actually appeared on the Alex Trebek-hosted Jeopardy. Watson's competition includes people who qualified for the show but lost, people who appeared and won once, and people who appeared and won twice.
Watson is "working its way up through the ranks," says David Ferrucci, leader of the project team. "We win some, we lose some. Overall, we're quite competitive but there's a ways to go to play the top of the top."
The games are played at IBM's Thomas J. Watson Research Center in Yorktown Heights, N.Y., with a real stage and a professional host -- though not Alex Trebek. Questions are provided by Jeopardy, and the computer -- roughly the size of eight refrigerators -- sits behind a glass window while the two human contestants stand at podiums.
The computer has one built-in advantage: ringing in. Anyone who's watched Jeopardy knows that human contestants struggle with the timing of the buzzer after a clue is read; Watson's reaction speed, by contrast, is consistent and nearly instantaneous.
But the Watson development team faces many challenges in creating a robotic Jeopardy champion. Without being connected to the Internet, the computer has to understand natural language, determine the answer to a question (or, in the case of Jeopardy, the question to an answer), and then calculate the odds that its answer is correct in order to decide whether it is worth buzzing in.
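The core decision the paragraph describes -- generate candidate answers, score each one's odds of being correct, and ring in only when the best candidate is confident enough -- can be sketched in a few lines of Python. This is a toy illustration under assumed names and numbers, not IBM's actual system:

```python
# Toy sketch of a confidence-gated buzz decision.
# The function name, data shape, and threshold are illustrative
# assumptions, not details of IBM's Watson.

def decide_to_buzz(candidates, threshold=0.5):
    """Pick the highest-scored candidate answer and buzz only if its
    estimated probability of being correct clears the threshold.

    `candidates` is a list of (answer_text, confidence) pairs, where
    confidence is a probability in [0, 1] from upstream scoring.
    Returns the chosen answer, or None to stay silent.
    """
    if not candidates:
        return None  # nothing generated; don't ring in
    answer, confidence = max(candidates, key=lambda pair: pair[1])
    return answer if confidence >= threshold else None

# Example: two candidate responses with estimated confidences.
print(decide_to_buzz([("What is the Amazon?", 0.14),
                      ("What is the Nile?", 0.83)]))
```

The interesting part is the threshold: set it too high and the machine never rings in; too low and, as the next paragraph notes, wrong answers cost real money.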
The fear of getting a question wrong and losing money keeps many a wrong answer from crossing a human contestant's lips. At the same time, humans often instinctively know that they know an answer, even if it doesn't pop into their heads right away. So a human player will often hit the buzzer on hearing the clue, then spend the next several seconds pulling the answer out of memory. A computer has no such instinct: to compete on Jeopardy, it must determine within seconds whether it actually knows the correct response.
Watson also has to be programmed to play strategically. The computer may be reasonably sure it knows the answer to a question, but will take into account the score of the game and the dollar value of the question before deciding whether it is worth taking the risk. It also needs a strategy for choosing categories and a betting strategy for Daily Doubles and Final Jeopardy.
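That strategic weighing -- confidence against the clue's dollar value and the state of the game -- amounts to an expected-value calculation. Here is a minimal sketch of the idea; the formula and the trailing-player heuristic are invented for illustration and are far simpler than whatever IBM's real strategy module does:

```python
# Toy expected-value check for whether a clue is worth attempting.
# All names and thresholds are illustrative assumptions.

def worth_buzzing(confidence, clue_value, my_score, leader_score):
    """Attempt the clue if the expected dollar change is positive,
    with extra risk tolerance when trailing the leader badly."""
    # A right answer wins clue_value; a wrong one loses clue_value.
    expected_gain = confidence * clue_value - (1 - confidence) * clue_value
    if expected_gain > 0:
        return True
    # When far behind, a negative-expectation gamble may still be
    # the only realistic path to catching the leader.
    trailing_badly = my_score < leader_score / 2
    return trailing_badly and confidence > 0.4

print(worth_buzzing(0.8, 1000, 5000, 6000))   # positive expectation
print(worth_buzzing(0.45, 1000, 2000, 9000))  # desperate gamble
print(worth_buzzing(0.45, 1000, 5000, 6000))  # close game: pass
```

The same expected-value framing extends naturally to Daily Double and Final Jeopardy wagers, where the bet size itself becomes a variable to optimize.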
IBM has experience building artificial intelligence systems that compete against humans. On Feb. 10, 1996, IBM's Deep Blue became the first computer to win a game against reigning world chess champion Garry Kasparov under tournament conditions, though Kasparov went on to win that match 4-2. In the rematch that ended on May 11, 1997, Deep Blue beat Kasparov 3.5-2.5.