With big data comes big responsibility

Nov 14, 2013 | 4 mins

Big Data | Data Center | IBM

Beyond the "creep factor," even enlightened use of big data can have huge human and ethical consequences

Big Data is the buzzword on everyone’s lips these days – promising to change the world through deep insights into vast and complex sets of data. But amidst the optimism at the recent IEEE Computer Society’s Rock Stars of Big Data symposium at the Computer History Museum in Mountain View, Calif., there were also stark warnings about the dark side of the new technology.

The human/ethical aspects of big data

In fact, Grady Booch, chief scientist at IBM Research and co-founder of Computing: The Human Experience, led off the event with a talk on the Human/Ethical Aspects of Big Data. In front of a couple hundred big-data professionals and interested parties, Booch acknowledged that big data can have “tremendous societal benefits,” but made the case that the technology has gotten way out in front of our ability to understand where it’s going, and we’re likely in for some nasty surprises in the not-so-distant future. He expanded on those thoughts in a private conversation later that afternoon.

Many people worry about governments and corporations misusing big data to spy on or control citizens and consumers, but Booch warned that the problem goes much deeper than deliberate malfeasance: “Even the most benign things can have implications when made public.” He cited the case of an environmental group that shared the locations of endangered monk seals near his home in Hawaii — a seemingly innocuous way to raise awareness. But because monk seals eat fish, Booch said, some local fishermen used the information to try to kill the seals.

Data lasts forever

The problem is that big data doesn’t go away once it’s fulfilled its original purpose. “Technologists don’t give a lot of thought to the lifecycle of data,” Booch said. But that lifecycle can extend indefinitely, so we can never be completely sure who will end up with access to that data. “This is the reality of what we do.”

“Our technology is outstripping what we know how to do with our laws,” Booch said. “And even today’s best legal and technological controls may not be enough.” Social, political and other pressures can affect how big data is used, he said, despite laws designed to constrain those uses.

Given how the unprecedented speed of technological change is affecting society, what counts as acceptable use of data is in constant flux and subject to contentious debate. For example, while airplanes have long carried “black box” data recorders, those devices are now finding their way into cars. So far, that hasn’t raised much debate — but imagine the outcry if we applied the same concept to, say, guns.

Our responsibility: Fix the “stupid things”

“The law is going to do some stupid things,” Booch warned, which is why “technology professionals have a responsibility to be cognizant of the possible effects of the data we collect and analyze to raise the awareness of the public and the lawmakers.”

“The world is changing in unforeseen ways, and no one has the answer,” Booch said. “It’s a brave new world and we’re all making this up as we go along.” But just because something is possible does not necessarily mean we should do it. “We need to at least ask the question: ‘Should it be done?'”

Booch conceded that there is no economic incentive to raise these issues. But there are important considerations and consequences that can’t be measured on a spreadsheet. “Ask yourself, ‘What if the data related to you, or to your parent, or your child? Would that change your opinion and actions?'” If so, Booch said, you have a responsibility to speak out. “If you don’t, who will?”


Fredric Paul is Editor in Chief for New Relic, Inc., and has held senior editorial positions at ReadWrite, InformationWeek, CNET, PCWorld and other publications. His opinions are his own.