Legislation, financially driven attackers, and high-profile breaches have changed the economics of security. We need to rethink the motivations of attackers and the new attacker economy, given the growing trade in stolen identity information and the rise of organized electronic crime. We need to study hackernomics. This is a new term, so allow me to offer a definition:
Hackernomics (noun, singular or plural): A social science concerned with the description and analysis of attacker motivations, economics and business risk. It is characterized by five fundamental laws and seven corollaries.
Law 1
Most attackers aren’t evil or insane; they just want something.
Some folks work on the premise that hackers are evil, but in reality most attackers are looking for weak targets and the path of least resistance. This is actually very good news, and it leads us to Corollary 1.a.
Corollary 1.a.:
We don’t have the budget to protect against evil people, but we can protect against people who will look for weaker targets.
This tells us that both security and security theater -- the appearance of security -- are critical in reducing business risk. Many companies struggle with how much to invest in security. Entry-level security means barely passing compliance audits, but companies that just squeak by are unlikely to be spared from attacks if they hold valuable information. This means that entry-level security must be at least as high as industry norms, especially considering that if a breach does occur, the Federal Trade Commission will compare the victim’s security policy with industry “best practices.”
Law 2
Attackers may attack you; auditors will show up.
Many organizations fear a compliance violation more than a breach. This is mainly because an audit is an impending, certain event -- someone will inspect your security -- which makes for a far more compelling business case than the fear of a possible attacker.
Corollary 2.a.:
Security isn’t about protecting something completely; it’s about reducing a risk at some cost.
As an industry, we don’t know how to make non-trivial systems 100% secure, but we can mitigate risks by investing wisely in training, process improvement and tools.
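To make “reducing a risk at some cost” concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is an invented assumption for illustration -- the loss per incident, the incident rate and the control cost are not drawn from any real study.

# Hypothetical risk-versus-cost comparison; all numbers are illustrative assumptions.

def annualized_loss(loss_per_incident, incidents_per_year):
    # Expected yearly loss: cost of one incident times how often it happens per year.
    return loss_per_incident * incidents_per_year

loss_per_incident = 50_000   # assumed cleanup, notification and reputation cost
incidents_per_year = 0.5     # assumed rate: one expected incident every two years
control_cost = 10_000        # assumed yearly cost of a mitigating control
residual_rate = 0.05         # assumed incident rate remaining after the control

before = annualized_loss(loss_per_incident, incidents_per_year)
after = annualized_loss(loss_per_incident, residual_rate) + control_cost

print(f"Expected annual loss without the control: ${before:,.0f}")
print(f"Expected annual cost with the control:    ${after:,.0f}")
print("Control pays for itself" if after < before else "Control costs more than it saves")

The point is not the particular numbers but the habit they illustrate: compare the expected loss with and without a control against what the control costs, rather than aiming for complete protection.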
Corollary 2.b.:
In the absence of metrics, we tend to over-focus on risks that are either familiar or recent.
Two of the most common mistakes in security spending are overcompensating for and overspending on risks that have been in the media or are simply familiar. In 2006, a huge amount of money was spent on data encryption. While unprotected data is a major risk for corporations, a great deal of that spending was prompted by highly publicized breaches rather than by a thorough comparative analysis of business risks.
Law 3
Most costly breaches come from simple failures, not from attacker ingenuity.
Lost backup tapes, stolen laptops and unsecured servers have been to blame for most of the high-profile data breaches since the inception of the first U.S. disclosure legislation, California Senate Bill 1386.
Corollary 3.a.:
Bad guys, however, can be very creative if given incentive.
Botnets, organized attacks and distributed denial-of-service attacks have shown that attackers can be very creative when motivated by target value or prestige.
Law 4
In the absence of security education or experience, people (developers, users, testers, designers) make poor security decisions with technology.
Technology often masks our natural instincts about safety and security. Because few developers and users are properly educated on risks, especially software security risks, they tend to make decisions that favor the qualities they know best, such as usability and performance, which often run directly counter to security.
Corollary 4.a.:
Software needs to be easy to use securely and difficult to use insecurely.
Users, developers and system administrators need software that is unobtrusively secure, with secure defaults. This recognizes that system developers know more -- or should know more -- about the security of their systems than users do, but that security should still be tunable to meet the needs of advanced users.
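As a sketch of what “easy to use securely and difficult to use insecurely” can look like in code, consider a hypothetical setup function built on Python’s standard ssl module. The function name and parameters are invented for illustration; the idea is that the safe behavior requires no extra work, while the unsafe behavior must be requested explicitly.

# Hypothetical API sketch: the secure path is the default, the insecure path is explicit.
import ssl

def create_client_context(verify_tls=True, min_tls_version=ssl.TLSVersion.TLSv1_2):
    # Certificate verification and a modern TLS floor are on by default.
    context = ssl.create_default_context()
    context.minimum_version = min_tls_version
    if not verify_tls:
        # Loudly dangerous: reachable only by passing verify_tls=False on purpose.
        context.check_hostname = False
        context.verify_mode = ssl.CERT_NONE
    return context

# Typical use is one line and secure:
context = create_client_context()

# Insecure use is still possible for testing, but it cannot happen by accident:
debug_context = create_client_context(verify_tls=False)

The ordinary call is short and safe; the tunable, less safe variant exists for advanced users but has to be asked for by name.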
Corollary 4.b.:
Developers are smart people who want to do the right thing.
However, incomplete requirements, undocumented assumptions, a lack of security knowledge and bad metrics can push them to do the wrong thing.
Law 5
Attackers usually don’t get in by breaching a security mechanism; they leverage functionality in some unexpected way.
Attackers take the path of least resistance, which usually isn’t fighting security controls directly. Instead, they are more likely to look for an architectural or coding flaw (such as SQL injection) in the non-security code that powers the business.
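A classic illustration is the gap between building a query through string concatenation and using a parameterized query. The sketch below uses Python’s standard sqlite3 module with an invented users table; the vulnerable line is ordinary business code doing a lookup, not a broken security control.

# Illustrative SQL injection sketch using an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "x' OR '1'='1"  # attacker-supplied value

# Vulnerable: the input is spliced into the SQL text, so it rewrites the query's logic.
vulnerable = f"SELECT name, role FROM users WHERE name = '{user_input}'"
print(conn.execute(vulnerable).fetchall())   # returns every row in the table

# Safer: a parameterized query treats the input strictly as data, never as SQL.
safe = "SELECT name, role FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # returns no rows

Nothing in the vulnerable version touches a security mechanism; the flaw lives entirely in functional code, which is exactly where this law says attackers look.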
Corollary 5.a.:
Security is as much about making functional code secure as it is about adding security controls.
We often spend a lot of time focusing on security mechanisms or security code when the greatest risks lie in the core way a system works.
On the new security frontier, we need to protect network, information and business assets by empowering people through security education.
Dr. Thompson is chief security strategist for People Security, a provider of software security education. He can be reached at hthompson@peoplesecurity.com.