Information and network protection: Finding the right mix

How to secure critical and regulated data when network defenses aren't enough

New means of information protection, proposed by Steve Bellovin and the Jericho Forum, among others, must be deployed to complement enterprise perimeter defenses.

For years, with organizations increasingly opening their networks and data centers to external business partners and mobile employees, experts have been claiming that the perimeter is dead. At the very least, perimeters are riddled with enough holes that restricted data from the creamy center is leaking from endpoints and pouring out of databases and file-shares.


The industry, of course, is still stinging from the most notorious example of this - the TJX Companies case. An ongoing Secret Service investigation resulted in last August's indictments of a ring of 11 attackers who had also penetrated the transaction-processing systems of six other brand-name retailers - in some cases undetected since 2004. All told, the criminals compromised nearly 45 million credit and debit accounts.

The porous perimeter needs protection from more than the bad guys attempting to make a buck off stolen credit card numbers: It needs protection from the gung-ho employee who, while trying to get some extra work done at home, inadvertently sends restricted material across the Web.

"A typical organization has lots of connections through its firewall - customers, Web services, suppliers, outsourcers," says Steven Bellovin, professor of computer science at Columbia University and co-creator of the Usenet online discussion system. "We haven't been protecting this data effectively enough. And I'm asking the community, 'What should we do differently?'"

Bellovin raises the notion of security at the center to protect against attacks getting to critical data in databases and file-shares. This idea is similar in many ways to The Open Group's Jericho Forum, which advocates assigning priorities to data, focusing on the most critical areas, and applying secure communications and encryption around these classified resources.

Neither Bellovin nor the Jericho Forum is suggesting organizations do away with their edge security. The perimeter, which serves an invaluable role in filtering the "noise" of network-based attacks, can be tuned to serve more data-centric functions. Nor are they claiming to simplify the processes of information protection. If anything, their approaches mean creating more layers, complexities and choices to be made around best-of-breed and point-product integrations.

"The problem is we don't look at data holistically. Consequently, data breaches are all over the news," says Jeff Boles, director of validation services at server and storage consultancy Taneja Group. "The way to get there is to look at a resource being accessed in context of the relationship between who the user is, what the user normally does, and the nature of the data." 

A holistic approach to critical data protection would suggest integrated options for IT pros trying to cross the chasms between data that is structured and unstructured, at rest, in use, and in motion. Unfortunately, the jobs of prioritizing, encrypting, monitoring and controlling the access to and use of sensitive data are anything but integrated. As a result, organizations are taking a variety of approaches to protect their data from flowing out of their organizations, including data loss prevention (DLP), access controls and encryption.

Gooey center

To get started, organizations need to know which data needs protection, and how to locate it - the cornerstone of the Bellovin and Jericho models.

Too many organizations, however, don't know what and where that data is, says Derek Brink, vice president and research fellow at Aberdeen Group. In an Aberdeen survey of 120 IT security professionals released in May, 50% of the best-in-class respondents had discovered and classified their critical information.

"You don't want to spend the same money protecting e-mail to the family about Sunday's barbecue as you do [protecting] your financial data," Brink says. "You only want to protect the resources that matter. But classifying those resources is the real challenge."

San Diego's Sharp HealthCare, with 16,000 employees at seven hospitals and two medical groups, is one enterprise well on its way. It uses a variety of manual and automated processes to understand and manage its critical data, says Starla Rivers, technical security architect.

Sharp uses Symantec's Vontu Data Loss Prevention product suite to discover critical unstructured data, such as health identification-card and Social Security numbers. Vontu does this by fingerprinting that data in a few key databases in which Health Insurance Portability and Accountability Act-specified, financial and other regulated data is processed. Then it looks for instances of that data outside the database on file-shares and endpoints.

In keeping with the Bellovin and Jericho theories, DLP tools are best used when they monitor for the least number of data types necessary, say DLP vendors. So, Vontu doesn't need to tag every type of data in a critical database for its initial scan. People generally tag the top five or six data types requiring protection. Like Sharp, most organizations start by classifying and protecting their regulated customer and reputational data, according to Aberdeen survey findings.
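The article describes two discovery techniques: fingerprinting known database values and pattern-matching for regulated data types. The sketch below illustrates only the second, and is not Vontu's actual method - the regular expressions, function names and the use of a Luhn checksum to filter out random digit runs are the author's own assumptions about how such a scanner might look:

```python
import re

# Candidate patterns for two commonly protected data types (illustrative only).
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum, used to reject digit runs that can't be card numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def scan(text: str) -> list[tuple[str, str]]:
    """Return (data_type, match) pairs found in a file or message body."""
    hits = [("ssn", m.group()) for m in SSN_RE.finditer(text)]
    for m in CARD_RE.finditer(text):
        if luhn_valid(m.group()):
            hits.append(("card", m.group()))
    return hits
```

A real product would add proximity rules, database fingerprints and file-type awareness to cut false positives, but the core decision - match a pattern, then validate it - is the same.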

Vontu discovers sensitive data on network file-shares, tracks data movement at the endpoints and enforces group policy around that data. Sharp needed a second product, however: Varonis Systems' DatAdvantage, for governance and auditing.

"Group A may have 120 people, and I want to assist the department's data owner in determining the appropriateness of the individual, not just the group, with access to the folders containing sensitive data. That means determining who is accessing the folder, how often, and whether or not he should have those privileges," Rivers notes. "Our challenge now is tightening these permissions. Right now we're using Varonis to assist us in that."

Once the Vontu agent determines that a folder contains sensitive data, Rivers provides the file list to the managers accountable for that data. In turn, these managers are responsible for determining whether the folders and the files contain the minimal amount of information necessary to conduct the business function. They are expected to think in terms of records, fields, people and time, she says.

Rivers also uses the Varonis and Vontu tools to analyze regulatory rules for retention and other processes for which a single blanket policy is difficult to write. "We have so many regulations to follow here, and there is no one data-retention rule that I can write a policy to," she says. "Some departments shouldn't be storing sensitive data at all, whereas other departments may need to keep the data for 10 years."

The IT group and business unit managers can learn from the analysis provided by the Vontu and Varonis tools, Rivers says. Meantime, user education comes through e-mail and the pop-up alerts Vontu delivers when policies are violated. As a result, employee-use violations have decreased by 70% since the system was implemented in 2007. And Sharp's staff members have even used the system to educate partners sending inbound information of a sensitive nature.

Taneja's Boles refers to data-protection models such as Sharp's as context-based data controls. A lot of companies play in the classification space, he says, naming Abrevity, Kazeon Systems, Mimosa Systems and StoredIQ. It takes finesse by user organizations, however, to get to this next level of context-based controls through benchmarking data use and monitoring outbound data flows.

Web and endpoints

Network-based DLP devices fit Bellovin's model of placing security closer to the database. So too do database application firewalls, such as those from Guardium and Imperva, for hardening, discovery, classification, monitoring and auditing.

Bellovin has reason to worry about protecting the database, particularly when it comes to its relationship with the Web server, says Richard Rees, security solutions director at SunGard Availability Services, a provider of information availability and business continuity services. "When we do penetration testing on clients' Web servers, we don't care about the server except as an avenue back to the data on the database," Rees says. "We find all types of vulnerabilities that can be exploited to do this - SQL injections, cross-site scripting attacks and so on."

Bellovin has a fix in mind. He proposes a Web SQL language called "NewSpeak," in which there simply is no verb for an insecure operation. "No command can say, 'Give me the credit card number.' This is not something the Web server needs to be able to do. Instead, it should say, 'Here's the total amount. Send this transaction to billing,'" Bellovin explains. "There shouldn't be verbs to dump the database or read the credit card."

By rewriting commands this way, developers would be hardening their Web applications. This, however, requires teaching developers to think in a language that not only can't be tricked but also is understood explicitly by the database - something that's not likely to happen overnight, analysts say.
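The spirit of the idea can be sketched with a whitelist of parameterized operations: the web tier may only invoke named verbs, and no verb exists that reads a stored card number back out. Everything here - the operation names, the schema and the Python/SQLite framing - is a hypothetical illustration, not Bellovin's actual NewSpeak proposal:

```python
import sqlite3

# Hypothetical whitelist: the only "verbs" the web tier may invoke.
# Note there is no operation that returns a stored card number.
ALLOWED_OPS = {
    "charge_order": "INSERT INTO billing (account_id, amount) VALUES (?, ?)",
    "order_status": "SELECT status FROM orders WHERE order_id = ?",
}

def execute_op(conn, op, *params):
    """Run a whitelisted operation; anything else is refused outright."""
    if op not in ALLOWED_OPS:
        raise PermissionError(f"verb {op!r} is not in the allowed set")
    return conn.execute(ALLOWED_OPS[op], params)
```

Because the web server can only name a verb and supply parameters, a SQL-injection payload has nowhere to go: there is no string concatenation and no verb for "dump the table."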

Bellovin also suggests taking the authentication role away from the Web server and, in so doing, removing its credentials to every account in the database. Instead, he recommends user-level authentication, probably managed through a federated-identity model like the one used by TriCipher, which provides Web authentication for e-business applications.

Meanwhile, the Jericho Forum argues that access should be controlled by the security attributes of the data itself. This could be facilitated through encryption, with rights being temporary, limited to that session.

"What I'm proposing is authentication accompanying every SQL command from the user, through the Web server to the database," Bellovin explains. "The database server won't respond to any request for user records if the request doesn't have a password. Even if I hack the Web server, I can't get into your account because I can't find your password. It's known only to you and the database."
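One way to model "a password known only to you and the database" is a message authentication code keyed by a per-user secret that the web server never holds: each request carries a MAC, and the database refuses any request it can't verify. This is a toy sketch of the concept - the secrets, record store and function names are the author's assumptions, not a description of any shipping product:

```python
import hashlib
import hmac

# Hypothetical per-user secrets shared only between each user and the
# database - never stored on the web server.
USER_SECRETS = {"alice": b"s3cret-alice", "bob": b"s3cret-bob"}
RECORDS = {"alice": {"balance": 120}, "bob": {"balance": 55}}

def sign(user: str, query: str) -> str:
    """The user's client signs each request with its own secret."""
    return hmac.new(USER_SECRETS[user], query.encode(), hashlib.sha256).hexdigest()

def fetch_record(user: str, query: str, mac: str) -> dict:
    """The database answers only requests authenticated by the end user.
    A compromised web server holds no secrets and so cannot forge a MAC."""
    expected = hmac.new(USER_SECRETS[user], query.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(mac, expected):
        raise PermissionError("request not authenticated by the end user")
    return RECORDS[user]
```

Even an attacker who replays one user's signed request against another account fails, because the database recomputes the MAC under the target account's secret.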

Imperva and other database-protection products could support such an architecture as long as they combine protection mechanisms - heuristic, correlative or signature - says David O'Berry, IS director at the South Carolina Department of Probation in Columbia. They also would have to be based on a simple valid/invalid request-response-transmission/transaction system that could be checked at every leg of the transmission.

"What Steve [Bellovin] is talking about is really concentric layers," SunGard's Rees says. "We can't do away with firewalls and [intrusion-detection systems] at the perimeter because they do a great job of protecting networks. They don't do a good job of protecting applications."

Besides monitoring their database and network for classified data, organizations need to protect against data leaking out at the endpoint.

To this end, endpoint-protection companies have been integrating DLP into their product suites, often through acquisition. Besides Symantec, which closed its Vontu acquisition last December, endpoint-DLP deals include Trend Micro's October 2007 acquisition of Provilla and McAfee's recent purchase of Reconnex. Now these companies' DLP portfolios include gateway-monitoring devices, as well as endpoint agents that feed data into a reporting console.

DLP companies also are expanding their portfolios with encryption - another layer of data protection necessary under the new security models. Sophos, for example, recently acquired Utimaco, a German data-security company, and McAfee bought SafeBoot last fall, making data encryption centrally manageable. Using such tools, organizations can enforce policy at the endpoint - for example, "encrypt when downloading to a USB device."
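A policy like "encrypt when downloading to a USB device" amounts to a classification-destination-action lookup evaluated by the endpoint agent. The toy table below shows only that decision logic, not the encryption itself, and every name in it is hypothetical rather than drawn from any vendor's product:

```python
# Hypothetical endpoint policy: first matching rule wins,
# and anything unmatched is allowed by default.
POLICY = [
    # (classification, destination, action)
    ("regulated", "usb",   "encrypt"),
    ("regulated", "email", "block"),
    ("internal",  "usb",   "log"),
]

def action_for(classification: str, destination: str) -> str:
    """Decide what the endpoint agent does with an attempted transfer."""
    for cls, dest, action in POLICY:
        if cls == classification and dest == destination:
            return action
    return "allow"
```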

"The endpoint really must evolve to be the flexible, resilient hard perimeter, or the skin on the network," says South Carolina's O'Berry, who's evaluating McAfee's Reconnex iGuard in tandem with his deployment of McAfee's endpoint DLP agents, and using SafeBoot for endpoint encryption. "The endpoint is what the criminals are most aiming for because they're making a lot of money off hacked, remotely controlled computers, keyloggers and phishing attacks against end users."

O'Berry's probation department supports more than 750 mobile, convertible tablet users, along with connections to other law-enforcement and social services agencies. "Those tablets log in from various nontraditional locations, including home networks, to insecure, open wireless networks wherever they're available," he says.

Another enterprise, Signal Financial Credit Union, reports having stopped 98% of its data leakage problem using DLP at the gateway and endpoints. The company uses Code Green Networks' Content Inspection appliance at network egress points to inspect and enforce protections on outbound e-mail traffic, create tickets, and manage rules and roles, says Steve Jones, CTO at the Kensington, Md., organization.

To expand DLP capability on the network, Jones uses Blue Coat Systems' ProxySG appliance to proxy other outbound flows, including SSL traffic that it decrypts with an optional SSL decryption card. Outbound data transfers often hide in the commonly used SSL protocol.

"The DLP device is monitoring everything going out, looking for account information, card numbers and several other data types that we've deemed critical," says Jones, who also uses Code Green agents on his endpoints to prevent leakage through USB ports and wireless connections.

Ultimately, security of critical data will occur at flow and use points across the enterprise and beyond, O'Berry says. This, he adds, essentially means layering additional protections at the database, the endpoint, the network and Web.

Bellovin has the bottom line: "We need to think about the problem in a different way because what we're doing [with perimeter protections] isn't working. What we need is a more data-centric architecture with strong protections around the important data because security holes in the perimeter are inevitable."

Radcliff is a freelance writer covering computer crime. She can be reached at deb@radcliff.com.
