Will advanced biometrics automate future war machines?

Some experts say likely yes, but question "tactical non-cooperative biometrics" against the enemy

TAMPA -- Biometric security breakthroughs are coming that would let the military capture from a distance an iris and facial scan of an individual and immediately match it to a biometrics-based "Watch List" of suspected terrorists, combatants or criminals.


At the Biometric Consortium Conference this week, researchers took the wraps off a variety of prototypes of advanced camera-based systems expected to one day remotely capture biometric information, such as iris and face scans, from combatants or suspected terrorists. In this futuristic scenario, U.S. troops would no longer necessarily have to get close to suspected adversaries to collect identifying biometric information, as they have done by using rugged hand-held fingerprint equipment on suspects in Iraq and Afghanistan since the 9/11 attacks on America.

It's also becoming clearer that aerial drones -- not soldiers -- could be used to capture biometric data from the enemy at a distance, and that ultimately a future war could be fought in a highly automated way, with the opponent's biometrics used as one variable to target the enemy quite literally through the whites of their eyes. Scientists point out that an iris scan is even more distinctive than a DNA test: identical twins share the same DNA, but their iris patterns are unique.

"Gathering biometrics covertly from a distance — there are dozens of technologies that hold promise," said U.S. Air Force Maj. Mark Swiatek, assistant professor and deputy head, department of philosophy, United States Air Force Academy. "They will be able to be deployed in the next few years."

But the idea of automated killing in war based on "tactical non-cooperative biometrics" -- in which the military lets "the boxes and systems do all the dirty work" without any real human intervention in the decision -- raises troubling questions, said Swiatek, who spoke on this topic at the Biometric Consortium Conference.

While this high-tech approach might well save innocent lives in a conflict, there's the question of how accurate such facial- and iris-recognition systems really are. And what accuracy rate is high enough to justify the mistakes?

"Can we claim proper intention?" Swiatek asked the session audience, noting that even war has philosophical underpinnings that demand reasoning about what is just in war.

"Human beings always say they didn't mean to do it," said Swiatek, but with the robotic approach the intent is programmed into the software. Such questions, he said, need to be carefully answered on both philosophical and legal grounds, and perhaps it's time to "slow down the march" of automated war systems.

But the pace of advance might make that hard to do.

Honeywell International is now testing what it calls its "Combined Face and Iris Recognition System (CFAIRS)," which Dr. Saad Bedros, Honeywell's principal research scientist, demonstrated in a multimedia presentation.

"There is a convergence between video surveillance and biometrics," said Bedros. The Honeywell system detects and tracks faces as they move through a network of distributed cameras. It also makes use of iris scans that can be captured at a distance of up to about 4 meters, through what scientists call "standoff ocular recognition."

The changing face of biometrics

The idea is to scan and track individuals on the move and tag them for a "Watch List" if their biometrics match stored records suggesting the individual is a danger. CFAIRS is still in beta testing and has no specific release date, but its likely use would be guarding entryways and checkpoints. CFAIRS has so far shown accuracy rates as high as 95%, said Bedros, though "it depends on a subject's cooperation."
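At its core, the watch-list step Bedros describes reduces to comparing a captured biometric template against enrolled templates and flagging any match under a distance threshold. Here is a minimal, hypothetical sketch (not Honeywell's actual algorithm): iris templates are modeled as bit strings compared by fractional Hamming distance, a common measure in iris recognition; real systems use far richer encodings, masking for eyelids and reflections, and calibrated thresholds.

```python
# Sketch of watch-list matching (illustrative only; not CFAIRS).
# Templates are toy bit strings; names and threshold are assumptions.

def hamming_distance(a: str, b: str) -> float:
    """Fraction of differing bits between two equal-length templates."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def check_watch_list(capture, watch_list, threshold=0.32):
    """Return the watch-list ID of the best match under threshold, else None."""
    best_id, best_dist = None, threshold
    for person_id, template in watch_list.items():
        d = hamming_distance(capture, template)
        if d < best_dist:
            best_id, best_dist = person_id, d
    return best_id

watch_list = {"suspect-1": "1101001110101100", "suspect-2": "0010110001011011"}
print(check_watch_list("1101001110101110", watch_list))  # prints: suspect-1
print(check_watch_list("0110100101101001", watch_list))  # prints: None
```

The threshold trades false matches against misses, which is why Bedros's 95% figure hinges on capture quality and the subject's cooperation.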

That wasn't the only eye-popping technology that had researchers specializing in biometrics -- the field of finding accurate ways to match fingerprints, iris scans and voiceprints -- buzzing at the conference.

Dr. Marios Savvides, associate research professor at Carnegie Mellon University's CyLab Biometrics Center, gave a presentation on a long-range iris-capture prototype funded by the military's Biometrics Identity Management Agency (BIMA). BIMA is the U.S. Department of Defense organization, based in Clarksburg, W.Va., that maintains a database of over 6 million finger, palm and iris biometrics on individuals, most of them encountered in Iraq and Afghanistan since troops went there following the 9/11 terrorist attacks in the U.S.

The camera-based system, which works at a range of about 12 meters, is designed to automatically pan and tilt to capture iris scans throughout a crowd. Savvides said the gear could be mounted on a military vehicle, such as a tank, and used to scan a crowd remotely at a checkpoint. The long-range iris-recognition equipment Carnegie Mellon has put together also includes "soft biometrics" for categorizing individuals by gender, ethnicity and age, plus whether they have a moustache or wear glasses. "We're looking at people trying to evade the system," said Savvides. "We have a beard category."
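Soft-biometric categories like the ones Savvides lists cannot identify a person on their own, but they can shrink a candidate pool before an expensive iris or face match. A minimal sketch of that filtering idea, with entirely hypothetical attribute names and records:

```python
# Sketch of soft-biometric filtering (illustrative; attributes are made up).
# Coarse traits narrow the candidate list before any hard biometric match.

candidates = [
    {"id": "A", "gender": "male", "beard": True, "glasses": False},
    {"id": "B", "gender": "male", "beard": False, "glasses": True},
    {"id": "C", "gender": "female", "beard": False, "glasses": False},
]

def filter_candidates(candidates, observed):
    """Keep only candidates consistent with every observed soft trait."""
    return [c for c in candidates
            if all(c.get(k) == v for k, v in observed.items())]

matches = filter_candidates(candidates, {"gender": "male", "beard": True})
print([c["id"] for c in matches])  # prints: ['A']
```

This is also why the team tracks evasion categories like beards: a trait that can be grown or shaved changes which soft filters remain reliable.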

Identifying ethnicity through biometrics data is a very hot topic in research on what is called "soft biometrics" right now.

Dr. Kevin Bowyer, professor at the University of Notre Dame, gave a presentation at the conference explaining how, by analyzing iris texture, it's possible to determine with about 90% accuracy whether someone is Asian or Caucasian. Gender is much harder to determine, at only 60% accuracy, and women "seem to be more complex than males" when it comes to determining gender through iris texture, he said.

The university, which wrote its own software for iris-texture measurement, using filters as "spot detectors," carried out its experiments with 120 volunteers who self-identified as "Asian" or "Caucasian." Bowyer said it should be possible to try similar iris-texture measurements with about a half dozen other ethnicities.
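Figures like Bowyer's roughly 90% for ethnicity and 60% for gender are simple classification accuracies: the fraction of predicted labels that agree with the volunteers' self-reported ones. A hypothetical sketch with made-up data shows the arithmetic:

```python
# Sketch of computing classification accuracy (data below is invented,
# not from the Notre Dame study).

def accuracy(predicted, actual):
    """Fraction of predictions that match the self-reported labels."""
    assert len(predicted) == len(actual)
    return sum(p == a for p, a in zip(predicted, actual)) / len(actual)

actual    = ["Asian", "Caucasian", "Asian", "Asian", "Caucasian"]
predicted = ["Asian", "Caucasian", "Asian", "Caucasian", "Caucasian"]
print(f"ethnicity accuracy: {accuracy(predicted, actual):.0%}")  # prints: ethnicity accuracy: 80%
```

Note that with only 120 subjects and two classes, a 60% gender accuracy is not far above the 50% a coin flip would achieve, which underlines Bowyer's point about how hard that problem is.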

Though advanced biometrics systems for the military may not be far over the horizon, there's also the reality that Washington, D.C., is wrestling with budget cuts across the government that seem certain to hit the military as hard as civilian agencies.

The FBI, the Department of Homeland Security, the intelligence agencies and the military all maintain their own separate biometrics-enabled watch lists today, with information shared on a somewhat ad hoc basis. A commonly shared "Watch List" does exist, but it's based mainly on names, not biometrics. The goal, say military leaders, is to create a biometrics-enabled watch list database -- BEWL for short -- shared with several other agencies. It's also necessary to further develop the policies and legal foundations needed to permit access and to work with allies.

"And we need to stop fixating on the hardware and focus on the software. Much of the functionality is contained there," said John Boyd, director, defense biometrics and forensics, assistant secretary of defense, research and engineering.

The military will also have to get away from what has been a hugely proprietary approach to biometrics in a wartime setting abroad.

"We want automated real-time sharing between Defense, FBI and Department of Homeland Security," said Dr. Konrad Trautman, director of intelligence, J2, U.S. Special Operations Command, in his keynote address, adding that standards are needed for all of it.

Learn more about this topic

Rapid DNA analysis tool closer for feds but obstacles loom

U.S. military takes cloud-computing to Afghanistan

Can behavioral biometrics help prevent terrorists from entering the U.S.?

From CSO: 7 security mistakes people make with their mobile device