IDG News Service - Kevin Finisterre isn't the type of person you expect to see in a nuclear power plant. With a beach-ball-size Afro, aviator sunglasses and a self-described "swagger," he looks more like Clarence Williams III from the '70s TV show "The Mod Squad" than an electrical engineer.
But people like Finisterre, who don't fit the traditional mold of buttoned-down engineer, are playing an increasingly important role in the effort to lock down the machines that run the world's major industrial systems. Finisterre is a white-hat hacker. He prods and probes computer systems, not to break into them, but to uncover important vulnerabilities. He then sells his expertise to companies that want to improve their security.
Two years ago, Finisterre, founder of security testing company Digital Munition, found himself swapping emails with a staffer at Idaho National Laboratory's Control Systems Security Program, a project funded by the U.S. Department of Homeland Security that is the first line of defense against a cyberattack on the nation's critical infrastructure.
Finisterre caught the attention of INL in 2008, when he released attack code that exploited a bug in the CitectSCADA software used to run industrial control environments. He'd heard about the INL program, which helps prepare vendors and plant operators for attacks on their systems, and he thought he'd drop them a line to find out how good they really were.
He was not impressed.
Is INL already working with the hacker community? Finisterre wanted to know. He received an off-putting response. The term "hacker" denotes a person of a "dubious or criminal nature" who would "not be hireable by a national laboratory," an INL staffer told him via email.
"He basically lectured me about how INL doesn't interact with hackers and I should be very careful throwing that word around," Finisterre recalled. "I was like, 'Dude, I really hope you're joking, because you're supposed to be at the forefront of the research on this."
Call it an early skirmish in a culture clash between two worlds: the independent security researchers accustomed to dealing with tech firms such as Microsoft and Adobe, which have learned to embrace the hacker ethos, and the more conservative companies that develop and test industrial control systems, which often act as if they wish these white-hat hackers would go away.
Earlier this year, Dillon Beresford, a security researcher at the consultancy NSSLabs, found a number of flaws in Siemens' programmable logic controllers. He had no complaints about the U.S. Department of Homeland Security's Industrial Control Systems Cyber Emergency Response Team, run out of INL. But he said Siemens did a disservice to its customers by downplaying the issues he'd uncovered. "I'm not pleased with their response," Beresford said earlier this year. "They didn't provide enough information to the public."
ICS-CERT was set up two years ago to handle the kind of bugs that Beresford and Finisterre are now finding with ease. The number of incidents funneled through ICS-CERT has increased sixfold in the past few years, from dozens of issues to hundreds, according to Marty Edwards, director of the Control Systems Security Program and the person in charge of ICS-CERT.