MIT researchers bring JavaScript to Google Glass

Open source WearScript puts JavaScript on Google Glass, with many new, and some unexpected, input choices.


Earlier this week, Brandyn White, a PhD candidate at the University of Maryland, and Scott Greenwald, a PhD candidate at MIT, led a workshop at the MIT Media Lab to showcase an open source project called WearScript, a JavaScript environment that runs on Google Glass. The category of wearables is still evolving. Beyond activity trackers and smartwatches, the killer wearable app has yet to be discovered, because wearables lack the lean-back or lean-forward human-machine interface (HMI) of tablets and smartphones. WearScript lets developers experiment with new user interface (UI) concepts and input devices to push beyond the HMI limits of wearables.

The overblown reports about Google Glass privacy distract from the really important Google Glass discussion: how Glass micro apps can compress the time between user intent and action. Micro apps are smaller than apps and are ephemeral; they are used in an instant and designed to disappear from the user's perception once their task is complete. Because of Glass's wearable form factor, micro apps deviate from the LCD-rectangle and touchscreen/keyboard design of smartphone, tablet, and PC apps, and are intended to be hands-free and responsive in the moment. A well-designed Glass app uses the Glass UI to let the user do something they could not do with another device. Glass's notifications are a good example: want to catch breaking news or preview an important email without the interruption of a phone or PC? Tilt your head up slightly and capture it in a glance. But if you want to read the full story or write a detailed reply, you are better off picking up a smartphone, tablet, or PC. The best consumer-facing Google Glass experiences highlight how apps can leverage this programmable wearable form factor.

Early in the MIT Media Lab workshop, White demonstrated how Glass's UI can extend beyond its touchpad, winks, and head movements by adding a homemade eye tracker to Glass as an input device. The camera and controller were dissected from a $25 PC video camera and attached to the Glass frame with a 3D-printed mount. A few modifications were made, such as replacing the obtrusively bright LEDs with infrared LEDs, and a cable was added with a little soldering. The whole process takes about 15 minutes for someone with component-soldering skills. With this eye tracker and a few lines of WearScript, the researchers demonstrated a new interface by playing Super Mario on Google Glass with eye movements alone.
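The eye-tracker demo boils down to translating raw gaze coordinates into discrete game controls. As a rough illustration only (the function name, thresholds, and coordinate convention here are assumptions, not the researchers' actual code), a mapping like this could turn gaze positions into directional input:

```javascript
// Hypothetical sketch: map a gaze point (x, y) to a game-pad style direction.
// Coordinates follow screen convention: y grows downward.
// centerX/centerY is the neutral gaze point; deadZone ignores small jitter.
function gazeToDirection(x, y, centerX, centerY, deadZone) {
  const dx = x - centerX;
  const dy = y - centerY;
  // Inside the dead zone: no input, so the character stands still.
  if (Math.abs(dx) < deadZone && Math.abs(dy) < deadZone) return 'none';
  // Dominant axis wins, giving four clean directions.
  if (Math.abs(dx) > Math.abs(dy)) return dx > 0 ? 'right' : 'left';
  return dy > 0 ? 'down' : 'up';
}
```

A real pipeline would also smooth the gaze samples over a short window before mapping, since raw pupil positions are noisy.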

To this audience of software engineers, wearable enthusiasts, students, and hardware hackers, repurposing an inexpensive device with some hacking and soldering is not unusual. But the impact of the demonstration set the tone for rethinking Glass apps with Wearscript and unconventional Glass input devices.

A dramatic input demonstration used the Myo gesture-control bracelet, which operates by detecting motion and muscle contraction, to drive the Glass display through WearScript. A less dramatic but equally effective demonstration was a magnetic ring: with the magnetometer enabled on a Bluetooth Low Energy (BLE) tethered Android phone in the user's pocket, swiping the ring near the phone controlled the Glass display. For the many attendees designing and developing next-generation wearable concepts, these diverse input-device demonstrations broadened the scope of the workshop and pushed the limits of the wearable user experience.
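The magnetic-ring trick works because a magnet passing near the phone produces a sharp spike in the magnetometer's field-strength readings. A minimal sketch of that idea in plain JavaScript (the sample format and threshold are illustrative assumptions, not the workshop's code):

```javascript
// Hypothetical sketch: flag a "swipe" when any magnetometer sample's field
// magnitude exceeds a threshold. Samples are {x, y, z} readings in microtesla.
// Real code would first subtract the ambient (Earth) field baseline, roughly
// 25-65 microtesla depending on location, rather than use a fixed threshold.
function detectSwipe(samples, threshold) {
  return samples.some(function (s) {
    const magnitude = Math.sqrt(s.x * s.x + s.y * s.y + s.z * s.z);
    return magnitude > threshold;
  });
}
```

On detection, the phone would relay the event to Glass over the BLE tether, where a WearScript callback advances the display.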

WearScript is particularly interesting because its JavaScript framework extends the design center of Google Glass micro apps beyond Android Java developers to front-end developers. Front-end developers, who typically design the user experience, are focused on testing and iterating on designs to optimize them. JavaScript is a far more accessible language than Java for these developers, many of whom already know it.
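To give a flavor of that accessibility: WearScript apps of this era were written as small HTML pages whose script registers callbacks on a `WS` object. The snippet below is a hello-world-style sketch reconstructed from memory of the project's examples, so treat the exact calls as assumptions and check the repository for the current API:

```html
<html>
<body>
<script>
function server() {
    // Called once the connection to the WearScript server is up;
    // WS.say uses the Glass text-to-speech engine.
    WS.say('Welcome to WearScript');
}
function main() {
    if (WS.scriptVersion(1)) return;  // bail out on an incompatible client
    WS.serverConnect('{{WSUrl}}', server);
}
window.onload = main;
</script>
</body>
</html>
```

A front-end developer can read and modify this in minutes, which is exactly the point White and Greenwald were making.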

After two days of coding and soldering, White and Greenwald achieved the results they intended: wearable micro apps that reconsidered HMI, and some new wearable concepts. One Google Glass micro app used augmented reality to identify room layouts for a blind person and guide them with voice responses. An open source game designed for a smartphone screen and touchpad was adapted and played on Google Glass using only hand movements via the Myo bracelet. Using the homemade eye tracker to measure pupil dilation, a dating micro app simulated a measurement of the affinity between two people on a first date.

Using WearScript and the group's creativity, White and Greenwald showed that the definition of wearables isn't complete yet, and that the missing ingredient for a killer wearable app is new input devices that simplify HMI.

The WearScript code repository and its contributors are on GitHub.
