Now that you’ve got the basics of using smartphone and tablet touchscreens down, Carnegie Mellon University researchers are ready to take you to the next level.
They will show off technology this week called TapSense that takes fuller advantage of your fingers' touchy-feely qualities to better control computing devices such as iPhones and Android tablets. For example, tapping with a fingernail can signal one thing to a device, while pressing the touchscreen with the pad of a fingertip or even a knuckle can send a different instruction (see the video below for a better "feel" for the technology).
The technology involves attaching a microphone to the touchscreen, which lets the CMU scientists distinguish between a fingernail, fingertip or knuckle by the sound each makes on impact. A proof-of-concept system could distinguish between four types of finger inputs with 95% accuracy, according to CMU.
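The idea can be sketched, very loosely, as nearest-centroid matching on an acoustic feature: harder parts of the finger (a nail or knuckle) produce a sharper, higher-frequency sound than the soft pad of a fingertip. Everything in the sketch below, the feature choice, the frequency values and the labels, is an illustrative assumption, not CMU's actual implementation.

```python
# Toy sketch of TapSense-style tap classification.
# Each tap is reduced to one made-up acoustic feature (its dominant
# frequency in Hz) and matched to the closest known tap profile.
# The centroid values are invented for illustration only.

CENTROIDS = {
    "fingernail": 3000.0,  # hard nail: brighter, higher-pitched tap
    "knuckle": 1200.0,     # bone under skin: in between
    "fingertip": 500.0,    # soft pad damps the sound
}

def classify_tap(dominant_freq_hz):
    """Return the tap type whose profile is closest to the observed frequency."""
    return min(CENTROIDS, key=lambda t: abs(CENTROIDS[t] - dominant_freq_hz))
```

A real system would extract richer features from the microphone signal (spectral shape, decay time) and train a classifier on labeled taps, but the matching step reduces to something like this.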
One goal of the TapSense team is to eliminate the need for buttons and other space-hogging conventions for taking action on a device and making better use of the sometimes limited screen size.
“TapSense basically doubles the input bandwidth for a touchscreen,” said Chris Harrison, a Ph.D. student in Carnegie Mellon’s Human-Computer Interaction Institute (HCII), in a statement. “This is particularly important for smaller touchscreens, where screen real estate is limited. If we can remove mode buttons from the screen, we can make room for more content or can make the remaining buttons larger.”
The same technology could also distinguish between different tools used to write or draw on a tabletop touch surface, such as pens that each write in a different color.
Harrison developed TapSense with fellow Ph.D. student Julia Schwarz as well as with Scott Hudson, a CMU professor. Harrison is discussing the technology this week at the Association for Computing Machinery’s Symposium on User Interface Software and Technology in Santa Barbara, Calif.
Things could get really interesting if TapSense ever gets integrated with OmniTouch, a wearable computer from CMU and Microsoft that can turn any surface into a touchscreen.