IoT and the law of unintended consequences

News Analysis
Jan 31, 2018 · 4 mins
Consumer Electronics | Internet of Things | Security

Strava’s fitness app usage heatmap accidentally reveals military deployments — and it won’t be the last time good IoT intentions lead to unanticipated problems.


You’ve probably already heard about the latest Internet of Things (IoT) security fiasco — coverage has gone far beyond the tech press into the mainstream TV news. In case you haven’t been paying attention, though, here’s the elevator pitch version:

Fitness network Strava publishes a global heatmap of where users are running and working out using its services, and folks just figured out that the map includes information that could reveal the locations of military forces working out in sensitive and sometimes secret locations. One expert worried that “tracking the timing of movements on bases could provide valuable information on patrol routes or where specific personnel are deployed.”

Unlike other IoT security concerns, Strava’s situation doesn’t involve hacking, spearphishing, compromised security protocols, or anything like that. In fact, Strava’s service is working exactly as intended, letting folks see where others are running and exercising around the world. The problem is that the data reveals previously unseen patterns that could be used in ways Strava, or the security personnel sharing their workout data, never considered.

The Pentagon is concerned

The problem isn’t trivial. According to CNN, “Defense Secretary James Mattis has been made aware of the issue, and the DoD is reviewing policy regarding smartphones and wearable devices.” A Pentagon spokesman told CNN, “We take these matters seriously, and we are reviewing the situation to determine if any additional training or guidance is required, and if any additional policy must be developed to ensure the continued safety of DoD personnel at home and abroad.”

It doesn’t sound like addressing this particular issue will be that difficult. It’s mostly a matter of telling soldiers in sensitive locations to turn off the Strava app’s sharing functions. But, once again, this situation points to a larger problem with new IoT technology.

Put simply, using smart devices to gather and report previously unavailable data has complex implications that can’t always be figured out in advance. The deeper you look at how IoT devices are being used, the more potential flashpoints crop up.

In this example, while it may be easy to get soldiers to stop sharing Strava data, who knows what other devices, apps, and services they’re using, and what data those may be collecting? In most cases, that collection is innocent enough, but that doesn’t mean the data can’t be used in hard-to-predict ways. Lots of things track location these days, and in many cases, location data can tell a remarkably detailed story about what folks may be doing.

Strava is only the beginning

Location is only the beginning. In the run-up to Mardi Gras in New Orleans, the New York Times wonders whether the 1,500 cameras that barkeeps in the French Quarter and elsewhere have been required to install to deter crime could also put a damper on the festivities. After all, during Mardi Gras, flashing has long been considered part of the fun. With all the cameras running, though, what happens in NOLA is less likely to stay in NOLA.

This isn’t idle speculation. Nanny cams have revealed plenty of things their owners may not have wanted to know. IoT-powered toys have been criticized for listening in on children, as well as for being vulnerable to hacking that exposes other personal information. Cellphone records and automated toll-road data have already been subpoenaed in divorce and other cases. Lawyers, insurance companies, and ethicists are worried about the decisions autonomous cars may have to make in life-or-death situations. Then there was the time a software glitch caused Nest thermostats to shut down in the middle of winter, leading to burst pipes and flooding. Similarly, relying on IoT devices in remote places can force you to go to great lengths to repair or replace them.

Of course, unintended consequences are hardly unique to IoT. U.S. cybersecurity tools have been hijacked and are now being used by criminal hackers. And who hasn’t wondered what true artificial intelligence entities might get up to once they realize they exist?

Before we panic and outlaw all technological innovation, though, consider that unintended consequences aren’t always bad. In fact, they’re often the source of truly breakthrough technological innovations, from microwave ovens and X-ray images to Amazon Web Services, which was born from the company’s own internal operations.


Fredric Paul is Editor in Chief for New Relic, Inc., and has held senior editorial positions at ReadWrite, InformationWeek, CNET, PCWorld and other publications. His opinions are his own.