Intelligence agency opens $325,000 advanced, automated fingerprint gathering competition

The Intelligence Advanced Research Projects Activity (IARPA) wants the public to help build a next-generation, automated fingerprint recognition system.


Researchers at the Intelligence Advanced Research Projects Activity (IARPA) are looking to the public to build a next-generation, automated fingerprint recognition system.

The idea behind the competition, called the “Nail to Nail (N2N) Fingerprint Challenge” – which offers $325,000 worth of prizes – is to develop a system that not only collects more distinguishing data from fingerprint biometrics but also eliminates the time and cost associated with using human operators, IARPA said. N2N fingerprints capture the entire fingerprint from the edge of one fingernail bed to the other.

+More on Network World: National Intelligence office wants to perfect the art of security deception+

From IARPA: “This challenge seeks to identify technology that can perform live capture fingerprints without requiring a human operator for the purposes of matching against other latent or live capture of fingerprints. The developed system should collect fingerprint data that performs as good as, or better than, existing operator controlled N2N fingerprint collection approaches. Performance of the developed N2N collection systems will be evaluated using data collected from a live test using human subjects and encompasses both live and latent fingerprints. The participant collected data will be compared against “gold standard” N2N and latent data using conventional fingerprint recognition algorithms. Participants will be judged based on traditional biometric performance measures in addition to speed of the collection process. Participants are not required to develop algorithmic/software techniques to match N2N or latent data.”
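To illustrate what “traditional biometric performance measures” typically mean in practice, here is a minimal sketch of computing false match rate (FMR) and false non-match rate (FNMR) from comparison scores. The function and all score values are hypothetical examples, not part of IARPA's challenge materials:

```python
def fmr_fnmr(genuine_scores, impostor_scores, threshold):
    """Fraction of genuine pairs rejected (FNMR) and impostor pairs
    accepted (FMR) at a decision threshold; higher score = better match."""
    fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return fmr, fnmr

# Made-up comparison scores for illustration only
genuine = [0.91, 0.88, 0.67, 0.95, 0.72]   # same-finger comparisons
impostor = [0.12, 0.34, 0.08, 0.41, 0.22]  # different-finger comparisons

fmr, fnmr = fmr_fnmr(genuine, impostor, threshold=0.5)
print(fmr, fnmr)  # 0.0 0.0 on this perfectly separated toy data
```

Sweeping the threshold trades one error rate against the other, which is why evaluations typically report error curves rather than a single number.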

+More on Network World: 10 of the latest craziest and scariest things the TSA found on your fellow travelers+

IARPA went on to say that existing fingerprint technology employs a trained human operator who holds and physically ‘rolls’ the subject’s fingerprints over a surface to capture the complete print. Slap (or plain) fingerprints, an alternative form of capture, utilize a single press method that does not require human operation. However, they only capture the parts of the finger touching the sensor, providing significantly less surface area and decreased matching performance for live and latent fingerprint recognition, the agency said.

IARPA said the challenge will run in two stages through the fall of 2017, ending in a live test where finalists will be invited to try out their devices. A variety of prizes will be offered, from best matching system to fastest scan and best latent accuracy.

IARPA’s latest challenge is but one of many such prize competitions backed by the government. A report last year by the White House Office of Science and Technology Policy noted that more than six years had passed since the government enacted the America COMPETES Act, which, in combination with Challenge.gov, has prompted more than 700 public-sector prize competitions that have doled out more than $80 million in prizes.

The Office of Science and Technology Policy says prize competitions and challenges have an established record of spurring innovation in the private and philanthropic sectors, letting the government:

  • Pay only for success and establish an ambitious goal without having to predict which team or approach is most likely to succeed;
  • Reach beyond the “usual suspects” to increase the number of solvers tackling a problem and to identify novel approaches, without bearing high levels of risk;
  • Bring out-of-discipline perspectives to bear;
  • Increase cost-effectiveness to maximize the return on taxpayer dollars;
  • Establish clear success metrics and validation protocols that themselves become defining tools and standards for the subject industry or field.

The report noted several successful challenges including:

  • IARPA’s Automatic Speech Recognition in Reverberant Environments (ASpIRE) Challenge: ASpIRE challenged teams to apply and refine state-of-the-art speech-to-text (STT) techniques to transcribe recordings of native speakers of American English. Typically, speech-recognition systems are trained on speech recorded in environments very similar to those in which they are expected to be used. The ASpIRE challenge tackled a more ambitious problem of building accurate systems for automatically transcribing speech recorded in noisy and reverberant environments without any training data that resembled the challenge’s final test conditions—and without knowing anything about the recording devices used, the placement of the talker relative to the recording device, or the acoustics of the rooms where the speech was recorded.
  • NASA’s Disruption Tolerant Networking Challenge Series (DTN): The DTN Series is an ambitious, multi-year series of challenges to develop data networking protocols that can extend the Internet into the Solar System. The challenges helped improve the security, performance, and application of network protocols that can withstand the time delays caused by the immense distances between planets and the disruptions and non-contiguous paths of the space communication links. The series included two challenges in 2013, two in 2014, and three in 2015. The most significant challenge to close in 2015 was the Astronaut Email Challenge.
  • DARPA’s Cyber Grand Challenge: The now completed DARPA Cyber Grand Challenge (CGC) utilized a series of competition events to test the abilities of a new generation of fully automated cyber-defense systems. CGC teams created automated systems to compete against each other to evaluate software, test for vulnerabilities, and generate and apply security patches to protected computers on a network. The competition drew teams of top experts from across a wide range of computer-security disciplines. Collectively, the automated systems participating in CGC were able to mitigate all currently known security flaws in the sample software (no individual system accomplished this). Competitors’ systems were able to identify 96 of the 131 security vulnerabilities (73%) in the software challenges without human assistance.

Check out these other hot stories:

Has Cisco broken out of the network hardware box?

10 of the latest craziest and scariest things the TSA found on your fellow travelers

Air Force goes after cyber deception technology

DARPA wants to simulate how social media spreads info like wildfire

Cisco calls on Arista to stop selling products in US after agency reverses patent finding

IBM: Next 5 years AI, IoT and nanotech will literally change the way we see the world

Cisco extends Ericsson partnership with WiFi package

Cisco talks 2017 SD-WAN predictions

Snapshot: NASA’s “Human Computers” and the Hidden Figures movie story
