China is working on a big data Minority Report system

Forget China’s deeply creepy Social Credit System; the country’s new pre-crime project is far scarier.

Think there’s a limit to how far countries can go to monitor their citizens? Think again. China’s new plan to create software that tracks a wide variety of data to predict who might commit terrorist acts pushes the envelope into the realm of science fiction, a la Minority Report.

Last December, I wrote about China’s planned Social Credit System, which takes invasion of privacy to terrifying new levels by going well beyond Western-style credit scores to create a mandatory scheme to “rate the trustworthiness of citizens in all facets of life, from business deals to social behavior,” according to the New Republic. The national database will combine Internet activity records with financial information and government data into a score designed to determine eligibility for all kinds of things, including credit, employment and access to social benefits.

See: The most disturbing tech story of 2015 that no one is talking about

Sounds creepy, right? Well, according to a story in BusinessWeek last week, the Social Credit System is now low-tech old news. “The Communist Party has directed one of the country’s largest state-run defense contractors, China Electronics Technology Group, to develop software to collate data on jobs, hobbies, consumption habits, and other behavior of ordinary citizens to predict terrorist acts before they occur,” wrote Shai Oster in “China Tries Its Hand at Pre-Crime.”

According to Oster, there’s little room for pushback on the mostly secret project:

“The project also takes advantage of an existing vast network of neighborhood informants assigned by the Communist Party to monitor everything from family planning violations to unorthodox behavior. … New antiterror laws that went into effect on Jan. 1 allow authorities to gain access to bank accounts, telecommunications, and a national network of surveillance cameras called Skynet.”

While many security professionals remain unconvinced that such a system could reliably predict crime or terrorism, I have a different concern. I’m afraid that the system could work just well enough to be extremely dangerous.

If the planned system shows the ability to successfully identify even a few real “terrorists,” the Chinese regime—or any regime—might be willing to live with some number of “innocent” people getting caught up in the system.

In most situations, Minority Report-style success requires both predicting crimes that would actually occur and not accusing people who would not, or even might not, have committed them. But in a heightened security environment, fully satisfying the second half of that equation might fall through the cracks.

In a world with committed terrorists publicly seeking weapons of mass destruction, the chance that the lives of a relatively small number of non-terrorists might be destroyed could be seen as a small price to pay to preserve overall security. After all, if they were identified by the system, they could probably be painted as potential troublemakers anyway, right?
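To get a feel for the arithmetic, here is a rough back-of-the-envelope sketch. The numbers below are purely hypothetical and do not come from any reporting on the Chinese system; they simply show how a very accurate predictor of a rare behavior still flags far more innocent people than genuine suspects:

```python
# Hypothetical illustration of the base-rate problem with pre-crime screening.
# All numbers are invented for illustration; none come from the article.

population = 1_000_000_000   # people screened
prevalence = 1 / 100_000     # fraction who are actual would-be attackers
sensitivity = 0.99           # chance the system flags a real attacker
false_positive_rate = 0.01   # chance the system flags an innocent person

actual = population * prevalence
true_positives = actual * sensitivity
false_positives = (population - actual) * false_positive_rate

precision = true_positives / (true_positives + false_positives)

print(f"Real attackers flagged:       {true_positives:,.0f}")
print(f"Innocent people flagged:      {false_positives:,.0f}")
print(f"Share of flags that are real: {precision:.2%}")
# With these assumptions, roughly 10 million innocent people are flagged
# alongside about 9,900 genuine suspects -- about 1 correct flag in 1,000.
```

Under those invented numbers, around a thousand innocent people are flagged for every genuine suspect, which is exactly the sense in which a system that “works just well enough” becomes dangerous.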

Accepting some false positives in exchange for catching more terrorists is a bargain many countries won’t be willing to make. But others, perhaps including China’s leaders, might find it an acceptable choice.
