While the big news in wearable computers is obviously centered on Google Glass, one company at Google's own developer conference in San Francisco this week is showcasing the beginnings of a different but potentially very important addition to the world of headwear.
Reston, Va.-based APX Labs has created a software framework, based on partner Epson's Moverio headset computer, designed to let developers build complex and robust augmented reality applications. Put simply, the platform lets you create a virtual indicator – be it a simple direction arrow or a complicated industrial device measurement – that appears overlaid on the real world.
Epson product manager for new markets Eric Mizufuka says his company is glad to be in on the ground floor of the project.
“We really feel like the near- to mid-term opportunities for us are in commercial/enterprise/industrial applications,” says Mizufuka.
For APX – pronounced “apex” – it’s about creating the ground floor for others.
“We’re approaching this from a holistic perspective,” says APX R&D director Jay Kim. “Rather than trying to develop one-off apps, what we’re trying to do is to develop a platform upon which other developers … can leverage Epson’s hardware and our platform to rapidly get to a solution that’s going to be ready to go.”
Kim says that the company got its start working as a defense contractor for the U.S. Army.
The company’s first project for the Army was called “stand-off biometrics,” which fed facial recognition data from a camera in the glasses back to a database, allowing soldiers to simply glance at faces to perform ID checks.
The plan, however, was always to create a general-use product.
“We believed, from day one, that those products could be made viable for the non-military customer as well,” says Kim.
Confidentiality was an obstacle to APX’s transition into the civilian marketplace, but not an insurmountable one, he says.
“Some of the underlying biometrics like matching algorithms, for example – those we’re obviously not allowed to transition off, but the underlying messaging and visualization, these are not secret technology,” according to Kim.
The demonstration app APX and Epson are showing off at Google I/O is a visual method of navigating a YouTube playlist, which they were kind enough to demonstrate for me at our meeting. It's pretty simple – you just tilt your head this way or that to select a video to watch, play and rewind, and return to the playlist – but it responds in a predictable, consistent way and is quite easy to use.
The ambitions for the system, however, are far more grandiose – a conceptual video shows complex potential uses in emergency response, IT, heavy industry, and a host of other fields.
So why not Google Glass?
“For us to address our markets, we really needed a display … that gave us full control over the user’s line of sight,” says Kim, adding that the slight glance upward needed to view Google Glass’ display could be problematic for applications where maintaining visual focus is critically important.
He was quick, however, to praise Google's work in bringing the concept of wearable computing to the foreground.
“We’re actually big fans of Glass. Google’s done a lot to try and bring the overall industry awareness of the technology to the point where it’s gotten easier for us to get people to buy into the concept,” he says.
Email Jon Gold at firstname.lastname@example.org and follow him on Twitter at @NWWJonGold.