
It’s time for the IoT to ‘optimize for trust’

News Analysis
Jun 03, 2019 | 4 mins
Internet of Things, Sensors

If we can't trust the internet of things (IoT) to gather accurate data and use it appropriately, IoT adoption and innovation are likely to suffer.

Bose Sleepbuds
Credit: Bose

One of the strengths of internet of things (IoT) technology is that it can do so many things well. From smart toothbrushes to predictive maintenance on jetliners, the IoT has more use cases than you can count. As a result, different IoT use cases require optimization for different characteristics, from cost to speed to long battery life, as well as myriad others.

But in a recent post, “How the internet of things will change advertising” (which you should definitely read), the always-insightful Stacey Higginbotham tossed in a line that I can’t stop thinking about: “It’s crucial that the IoT optimizes for trust.”

Trust is the IoT’s most important attribute

Higginbotham was talking about optimizing for trust as opposed to clicks, but really, trust is more important than just about any other value in the IoT. It’s more important than bandwidth usage, more important than power usage, more important than cost, more important than reliability, and even more important than security and privacy (though they are obviously related). In fact, trust is the critical factor in almost every aspect of the IoT.

Don’t believe me? Let’s take a quick look at some recent developments in the field:

For one thing, IoT devices often don’t take good care of the data they collect from you. Over 90% of data transactions on IoT devices are not fully encrypted, according to a new study from security company Zscaler. The problem, apparently, is that many companies have large numbers of consumer-grade IoT devices on their networks. In addition, many IoT devices are attached to the companies’ general networks, and if that network is breached, the IoT devices and data may also be compromised.
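To make “fully encrypted” concrete: a device or backend client can refuse plaintext and unverified connections by requiring TLS with certificate checks before any data moves. The Python standard library’s defaults already enforce this, as a minimal sketch (my illustration, not from the Zscaler study):

```python
import ssl

# create_default_context() returns a TLS context with the settings a
# trustworthy IoT client should insist on: the server must present a
# valid certificate, and its hostname must match that certificate.
# Traffic sent over a socket wrapped with this context is encrypted.
context = ssl.create_default_context()

print(context.verify_mode == ssl.CERT_REQUIRED)  # server cert is mandatory
print(context.check_hostname)                    # hostname must match cert
```

A device that instead opens a bare TCP socket, or disables these checks, is exactly the kind of “not fully encrypted” transaction the study describes.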

In some cases, ownership of IoT data can raise surprisingly serious trust concerns. According to Kaiser Health News, smartphone sleep apps, as well as smart beds and smart mattress pads, gather amazingly personal information: “It knows when you go to sleep. It knows when you toss and turn. It may even be able to tell when you’re having sex.” And while companies such as Sleep Number say they don’t share the data they gather, their written privacy policies clearly state that they can.

Lack of trust may lead to new laws

In California, meanwhile, “lawmakers are pushing for new privacy rules affecting smart speakers” such as the Amazon Echo. According to the LA Times, the idea is “to ensure that the devices don’t record private conversations without permission,” requiring a specific opt-in process. Why is this an issue? Because consumers—and their elected representatives—don’t trust that Amazon, or any IoT vendor, will do the right thing with the data it collects from the IoT devices it sells—perhaps because it turns out that thousands of Amazon employees have been listening in on what Alexa users are saying to their Echo devices.

The trust issues get even trickier when you consider that Amazon reportedly considered letting Alexa listen to users even without a wake word like “Alexa” or “computer,” and is reportedly working on wearable devices designed to read human emotions from listening to your voice.

“The trust has been breached,” said California Assemblyman Jordan Cunningham (R-Templeton) to the LA Times.

Critics of the bill (AB 1395), for their part, point out that such restrictions carry a cost: voice assistants rely on this data to improve their ability to correctly understand and respond to requests.

Some first steps toward increasing trust

Perhaps recognizing that the IoT needs to be optimized for trust so that we are comfortable letting it do its job, Amazon recently introduced a new Alexa voice command: “Delete what I said today.”

Moves like that, while welcome, will likely not be enough.

For example, a new United Nations report suggests that “voice assistants reinforce harmful gender stereotypes” when using female-sounding voices and names like Alexa and Siri. Put simply, “Siri’s ‘female’ obsequiousness—and the servility expressed by so many other digital assistants projected as young women—provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.” I’m not sure IoT vendors are eager—or equipped—to tackle issues like that.


Fredric Paul is Editor in Chief for New Relic, Inc., and has held senior editorial positions at ReadWrite, InformationWeek, CNET, PCWorld and other publications. His opinions are his own.