If you want to understand the pitfalls of serving America's post-Snowden craving for secure technology, look no further than Anonabox, the crowd-funded privacy-enhancing home router. That project raised an astonishing $630,000 last year in campaigns on the crowdfunding websites Kickstarter and Indiegogo by promising to protect its customers' online privacy with a box that redirected all their Internet traffic through the anonymous Tor network.
As Wired.com reported last week, Anonabox's parent company, Sochule Inc., had to recall devices it shipped to customers after an independent researcher discovered serious security flaws in the product. Those flaws would make it easy for anyone within wireless range of the Anonabox to connect to, and take control of, the device. Quite simply: Anonabox, designed to offer extra security for Internet users, actually proved less secure than the consumer broadband routers it was supposed to supplant.
From security-enhanced smartphones and web browsers to broadband routers, consumers who want to cloak their identity online, lock down their data and protect their privacy have more choices than ever. But recent events suggest that the label "security enhanced" often denotes the opposite, as products designed to be ultra-secure cough up head-slapping security flaws.
In July, for example, the security firm Exodus Intelligence warned of a serious and exploitable vulnerability in Tails, the secure and portable Linux-based operating system that is known to be a favorite of Edward Snowden. According to a report on The Verge, the vulnerability would allow an attacker to defeat Tails' anonymity features and even run malicious code.
Likewise, researchers at Columbia University published a paper in 2014 that described a method for unmasking users of the Tor anonymity service using downstream routers that process traffic from Tor users.
The road to privacy riches is littered with potholes, it would seem. But security experts caution that the real problem may be bigger than vulnerabilities hidden in application code. A variety of factors, from unrealistic consumer expectations to competing commercial interests to schadenfreude, may be conspiring to sink otherwise promising development projects, experts say.
"Here's the problem," explains John Dickson, a principal at the Denim Group. "You have to be as good as the other, less secure tool and you have to be secure. That's a bridge too far for many products."
Add to that the fact that merely releasing and promoting a "secure" software or hardware product puts a target on the back of the company or individuals promoting it. "'Designed for security' products don't just have to be good. They have to be beyond reproach," Dickson observes. "All it takes is one guy with a grudge to undo you."
Consider the experience of WhiteHat Security, which released its Aviator web browser in October 2013, billing it as a "Safer Web Browser" built on top of Google's Chromium platform. Jeremiah Grossman, WhiteHat's CTO, said that almost all of the code in Aviator was the work of Google, not WhiteHat.
"While we did add some new code and features into Chromium, the effort was largely one of making default security and privacy configurations — then making sure doing so didn't break the user experience," he wrote in an e-mail.
But WhiteHat's work caught the attention of Google researcher Justin Schuh, who penned a withering assessment of Aviator in January of this year, shortly after WhiteHat released the browser's code to the open source community.
"You probably shouldn't be using the WhiteHat Aviator browser if you're concerned about security and privacy," Schuh declared at the beginning of his post. In it, he took WhiteHat to task for not keeping its browser up to date with the most current version of Chromium and for adding code that created new security vulnerabilities. "The added code doesn't seem to have been written with a sufficient understanding of how Chrome works, or with adequate regard for security," Schuh said.
WhiteHat's developers chose not to work with the large, open source Chromium community and even changed internal scheme names from "chrome" to "aviator," Schuh complained.
Valid as those criticisms may be, the way in which they were aired and the tone of the critique highlight a danger in pursuing "enhanced security" as a marketing goal in and of itself.
"You look at companies like Google, Microsoft, Mozilla: they have so much revenue tied to their browsers, it colors all this stuff." Letting Aviator sit on top of the Chrome hill was bound to be antithetical to Google's interests, Schuh said. "WhiteHat was going to end up in the crosshairs."
The presence of vulnerabilities in any software, particularly in early releases, is a "dog bites man" story everywhere except in the security industry, says Jon Callas, the co-founder and CTO of the company Silent Circle. "I see this hunger, whenever someone claims 'I'm doing something and I'm doing it securely,' to come along and say 'No, it isn't.'"
That dynamic is an ingrained part of the security industry, where researchers get attention and even monetary rewards for finding vulnerabilities, and where reporters can draw readers to a story about the secure product that wasn't, Callas said.
"What you're seeing is a different standard," said Bruce Schneier, the CTO at Resilient Systems. "If you call your browser 'secure by design' and it ends up having a vulnerability, then you failed. If you call it 'just a regular browser' and it ends up with 1,000 vulnerabilities, everyone says 'look at how good it is!'"
Dickson, of Denim Group, agrees. "The threshold for mockery is low. The threshold for sustained credibility is high," he said.
There's a cost to that, of course. Promising projects or products may be shelved or lose momentum in the face of harsh criticism. That is, arguably, the fate of WhiteHat's Aviator browser, which saw its last commit on GitHub on February 9.
A bigger threat may be that secure products are not pursued for fear of ending up in the crosshairs of fellow security researchers.
"That sort of attitude is to the advantage of people like me," said Callas of Silent Circle, which makes privacy-enhancing communications tools. "You get a Twitter storm going of 'Ha ha, you made a mistake!' ... I know how to respond to that kind of criticism because I've been there before. But you have new people coming up who are brilliant and passionate, and they won't necessarily deal with it well."
One way around that is to solicit the help of larger entities with the resources and motivation to make the project a success. Successful projects also find a way to spread the fruits of their security and privacy enhancing product or technology around.
Removing the stain of marketing and self-promotion from the effort also helps, experts agree. "Be humble in your claims," Callas advised. "People will rightly mock anything that says it is 'NSA proof' because it is easy to disprove that."
In the end, the market and the security community need to support efforts to develop more secure alternatives to the technology in use today, even if those efforts are purely commercial and fueled by self-interest. "These people are reflecting the fact that we all want privacy and security in everything we do," said Callas. "It's OK to point financial incentives in their direction."
This story, "Why 'designed for security' is a dubious designation" was originally published by ITworld.