Network World - When workplace computers moved beyond command-line interfaces to the mouse-and-windows-based graphical user interface, that was a major advance in usability. And the command line itself was a big improvement over the punch cards and tape that came before.
We're now entering a new era of user interface design, with companies experimenting with everything from touch and voice to gestures and even direct mind control. But which of these new interfaces is appropriate for a corporate environment, and which are simply not ready for prime time?
Voice recognition is one input technology that has made significant progress. A decade ago, accuracy was low and the technology required extensive training. Today, it is common to find voice recognition when calling customer support, and, of course, in the latest smartphones.
For general office use, however, voice recognition has made the biggest impact in specialized areas, like law and medicine. At the University of Pittsburgh Medical Center, for example, automated transcription has almost completely replaced human transcriptionists in the radiology department.
"The big thing in radiology is how can we go through as many studies as we possibly can," says Rasu Shrestha, the hospital's vice president for medical information technologies. "Turn-around time is incredibly important, as is the accuracy of the report."
The fact that the job itself is extremely routine is also important, he adds. "We sit down, we look at images, and we write reports," he says. "It's a fairly mundane task."
Shrestha says he began working with voice recognition a decade ago, and it was "horrendous" at first. "We had constant struggles, especially if you had any level of an accent. But things have come a long way. The Dragon Medical Engine [from Nuance] incorporates a lot of the medical ontology and vocabulary structures, so the platforms are intelligent."
As a result, accuracy has risen from roughly 70% to 80% a decade ago to close to 100% today. Meanwhile, human transcription accuracy has actually fallen as hospitals have moved from dedicated secretaries, who would get to know a doctor's voice, to outsourced transcription services.
"There's no opportunity for you to build a bond with any particular person sitting at the back end of the transcription service," he says. Another reason that machine transcription is now better is that users can set up macros that automatically take care of a large chunk of work.
"If you have a normal chest X-ray, you could shortcut the entire documentation process," he says. "You can just turn on the mic and say, 'Template normal chest' and it automatically puts everything in, adds the context of the patient's name and age, and there you go - in seconds, you have created a full report that might have taken several minutes before. I would say that the days of the human transcriptionist are numbered."
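The macro workflow Shrestha describes boils down to keying a stored report template to a spoken phrase and filling in patient context. A minimal sketch of that idea, in Python, might look like the following; the function names, template text, and data layout here are illustrative assumptions, not the actual Dragon Medical interface:

```python
# Illustrative sketch of a voice-macro template expansion, as described
# in the article: a phrase like "Template normal chest" pulls up a
# stored report skeleton and fills in the patient's details.
# Names and template text are hypothetical, not Nuance's API.

TEMPLATES = {
    "normal chest": (
        "CHEST X-RAY\n"
        "Patient: {name}, Age: {age}\n"
        "Findings: The lungs are clear. Heart size is normal.\n"
        "Impression: No acute cardiopulmonary abnormality."
    ),
}

def expand_macro(command: str, patient: dict) -> str:
    """Expand a 'Template <name>' voice command into a full report."""
    prefix = "template "
    normalized = command.lower()
    if not normalized.startswith(prefix):
        raise ValueError(f"not a template command: {command!r}")
    key = normalized[len(prefix):].strip()
    return TEMPLATES[key].format(**patient)

report = expand_macro("Template normal chest", {"name": "Doe, J.", "age": 57})
print(report)
```

A real dictation platform layers speech recognition, a medical vocabulary, and record-system integration on top of this, but the productivity win he describes comes from exactly this kind of expansion: one short utterance replacing minutes of dictation.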
Finally, machine transcription dramatically speeds up the workflow. "A decade ago, five years ago, when we were using traditional transcription service, it used to be anywhere from a day to several days before the final report was sent back," he says. "Today, it's anywhere from seconds to a couple of minutes. The minute the patient is in the scanner and the scan is completed, it's in our work list. Sometimes within seconds or minutes of the study being available to us, the ordering clinician has the report available to them. It clearly increases our productivity and streamlines the process."