Network World - TAMPA -- Biometric security breakthroughs are coming that would let the military capture from a distance an iris and facial scan of an individual and immediately match it to a biometrics-based "Watch List" of suspected terrorists, combatants or criminals.
At the Biometric Consortium Conference this week, researchers unveiled prototypes of advanced camera-based systems expected one day to remotely capture biometric information, such as iris and face scans, from combatants or suspected terrorists. In this futuristic scenario, U.S. troops would no longer need to get close to suspected adversaries to collect identifying biometrics, as they have done since the 9/11 attacks on America by using rugged handheld fingerprint equipment on suspects in Iraq and Afghanistan.
It's also becoming clearer that aerial drones, not soldiers, could be used to capture biometric data from the enemy at a distance. Ultimately, a future war could be fought in a highly automated way in which an opponent's biometrics are one variable used to target the enemy, quite literally through the whites of their eyes. Scientists point out that an iris scan is even more individual than a DNA test: identical twins share the same DNA, but their iris patterns are unique.
"Gathering biometrics covertly from a distance — there are dozens of technologies that hold promise," said U.S. Air Force Maj. Mark Swiatek, assistant professor and deputy head, department of philosophy, United States Air Force Academy. "They will be able to be deployed in the next few years."
But the idea of automated killing in war based on "tactical non-cooperative biometrics," in which the military lets "the boxes and systems do all the dirty work" without any real human intervention in the decision, raises troubling questions, Swiatek pointed out in his talk on the topic at the Biometric Consortium Conference.
While this high-tech approach in any conflict might well save innocent lives, there's the question of whether such facial and iris-recognition systems have high accuracy rates. And what's high enough to justify mistakes?
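The accuracy question comes down to where a system draws its decision threshold. Iris-recognition systems commonly compare two binary "iris codes" by fractional Hamming distance (the share of bits that differ) and declare a match below some cutoff; raising the cutoff catches more true matches but also more false ones. A minimal sketch of that decision, with hypothetical code sizes, noise levels, and threshold (real systems derive codes from iris imagery and tune thresholds empirically):

```python
import numpy as np

rng = np.random.default_rng(0)

def hamming_distance(code_a, code_b):
    """Fraction of bits that differ between two binary iris codes."""
    return np.count_nonzero(code_a != code_b) / code_a.size

# Hypothetical 2048-bit iris codes; deployed systems derive these
# from phase information in the iris image.
N_BITS = 2048
enrolled = rng.integers(0, 2, N_BITS, dtype=np.uint8)

# Same person, second capture: a noisy copy with ~10% of bits flipped.
noise = (rng.random(N_BITS) < 0.10).astype(np.uint8)
same_person = enrolled ^ noise

# Different person: a statistically independent code (~50% of bits differ).
impostor = rng.integers(0, 2, N_BITS, dtype=np.uint8)

THRESHOLD = 0.32  # illustrative cutoff, not a fielded value

for label, probe in [("same person", same_person), ("impostor", impostor)]:
    d = hamming_distance(enrolled, probe)
    verdict = "match" if d < THRESHOLD else "no match"
    print(f"{label}: distance {d:.2f} -> {verdict}")
```

The gap between the two distances is what makes the decision look easy in the lab; the ethical question in the talk is what error rate at that threshold is tolerable when the output drives targeting rather than a door lock.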
"Can we claim proper intention?" Swiatek asked the session audience, noting that even war has philosophical underpinnings that ask for reasoning in what is just in war.
"Human beings always say they didn't mean to do it," said Swiatek, but noted the robotic approach means the intent is programmed into the software. He said such questions need to be carefully answered on both philosophical and legal grounds and perhaps it's time to "slow down the march" of automated war systems.
But the pace of advance might make that hard to do.
Honeywell International is now testing what it calls its Combined Face and Iris Recognition System (CFAIRS), which Dr. Saad Bedros, Honeywell's principal research scientist, demonstrated in a multimedia presentation showing how it works.