Four experts share the latest research-and-development news.
If you think re-architecting your IT infrastructure with new data center technologies will help protect your company over the next decade - think again. Experts at academic and vendor research labs around the country agree the move toward an automated, on-demand, virtualized computing environment will increase the complexity of security.
With the new data center, IT executives "won't be able to think of the enterprise as a castle with a drawbridge and one point of entry to keep the bad guys out. They'll have to look at every node in their network, every computer in the network, as something to defend individually," says Dirk Balfanz, a researcher at the Palo Alto Research Center (PARC) in California.
PARC is just one of many organizations focused on solving security problems that lie ahead. Among researchers' goals are reining in complexity, improving identity management and developing platforms to protect digital assets.
"Complexity is the enemy of security," says Ed Skoudis, an instructor at the SANS Institute in New York. "The constant drive for new innovation and new functionality introduces complexity. And this complexity introduces flaws."
Wireless networks, Web services and other emerging architectures will pose challenges for IT managers, Skoudis says. "They'll need to apply security at different layers in the network. You have security layers for the older technology, and then you'll need more for the new architectures," he says.
Researchers at PARC concur, and so have been studying how to keep users who are confronted with security mechanisms at too many layers from ignoring or, worse, tampering with security procedures. "If a security procedure is too difficult, users won't deploy it. They may configure it incorrectly, or they'll just switch it off," Balfanz says, citing a PARC study that found enabling 802.1X security on laptops took users two hours. Rather than struggle, users stopped using the network's security features, which jeopardized network data, he says.
PARC's Security Research Group developed an architecture that eliminates these frustrations for users while letting IT organizations employ tighter security. With this architecture, users connect their laptop to an "enrollment station" via a close-proximity technology such as infrared. After being authenticated by the enrollment station, the user gains network access and then receives a digital certificate that automatically configures network policy settings on the laptop. The whole process takes less than 2 minutes, Balfanz says.
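The enrollment flow can be sketched roughly as follows. This is an illustrative model only, not PARC's actual protocol: the class names, the certificate fields and the set of accepted proximity channels are all invented here.

```python
# Hypothetical sketch of a PARC-style proximity enrollment station.
# Physical presence over a short-range channel stands in for an
# initial credential; the station then issues a certificate that
# carries the network policy settings for the laptop.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Certificate:
    subject: str
    network_policy: dict      # e.g., 802.1X settings pushed to the laptop
    expires: datetime

class EnrollmentStation:
    def __init__(self, policy: dict):
        self.policy = policy

    def enroll(self, device_id: str, channel: str) -> Certificate:
        # Only close-proximity channels are trusted for enrollment,
        # so an attacker across the parking lot cannot enroll a device.
        if channel not in ("infrared", "nfc"):
            raise PermissionError("enrollment requires a close-proximity channel")
        return Certificate(
            subject=device_id,
            network_policy=self.policy,   # auto-configures the laptop
            expires=datetime.now() + timedelta(days=365),
        )

station = EnrollmentStation({"eap": "EAP-TLS", "ssid": "corp-secure"})
cert = station.enroll("laptop-42", "infrared")
print(cert.subject, cert.network_policy["eap"])
```

The point of the design is that the user's only manual step is pointing the laptop at the station; every configuration detail rides along in the certificate.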
PARC is working with vendors to put this architecture in enterprise products.
Programming techniques, including those used in creating Web services, are the focus of security researchers at Cornell University in Ithaca, N.Y. "Web services are being built with vulnerabilities" that could spread rapidly as Web services are shared across networks, says Fred Schneider, director of the Information Assurance Institute at Cornell. "If you're building off of something else, you might not understand the properties of everything you're working with. Security is not something you can evaluate by looking at the interface," he adds.
His research team, in cooperation with Intel and the Office of Naval Research, is working on a project called Language-Based Security. The aim is to put basic tenets of solid security, such as in-line reference monitors, information flow policies, proof-carrying code and certifying compilers, into emerging technologies.
Schneider also advocates the use of safe systems languages that work much like the familiar C language. The hope is that bringing these practical components into newer applications such as Web services will close the kinds of flaws that extensible systems can introduce, he says.
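One of the techniques named above, the in-line reference monitor, can be shown in miniature: a wrapper that consults a security policy before every monitored call. This is a toy sketch in Python for illustration only; Cornell's project targets compiled code, and the file-write policy here is invented.

```python
# Toy in-line reference monitor: interpose a policy check on every
# call to a guarded function, rejecting calls the policy denies.
import functools

def reference_monitor(policy):
    """Wrap a function so it runs only if the policy approves its arguments."""
    def decorate(fn):
        @functools.wraps(fn)
        def guarded(*args, **kwargs):
            if not policy(*args, **kwargs):
                raise PermissionError(f"policy denied call to {fn.__name__}")
            return fn(*args, **kwargs)
        return guarded
    return decorate

# Example (invented) policy: file writes are confined to /tmp.
@reference_monitor(lambda path, data: path.startswith("/tmp/"))
def write_file(path, data):
    return f"wrote {len(data)} bytes to {path}"

print(write_file("/tmp/report.txt", b"ok"))   # allowed by the policy
```

In language-based security the monitor is woven into the program by a rewriter or certifying compiler rather than applied by hand, but the enforcement idea is the same.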
Security and privacy
Ken Klingenstein, director of the Internet2 Middleware and Security Initiative, says he's not optimistic that complexity can be reduced. "Short-sighted solutions, deep vulnerabilities and the nature of how complexity compounds over time means we're in for difficult times," he says.
But improvements are possible in the way resources are shared across the Web, while user privacy is protected, Klingenstein says. "What seemed to be orthogonal goals of security and privacy can be achieved together," he says.
Internet2 created the Shibboleth Project to address the need for simple, secure cross-organizational data access. The Shibboleth System, which comprises open source identity provider and service provider components, lets network users access resources inside and outside their organizations without suffering through multiple registration processes or offering personal information unnecessarily.
"The goal is to have users release only the minimum amount of information to content providers to determine whether they are eligible for that content," Klingenstein says. This will help reduce identity theft and fraud, he adds.
Using common security standards such as the Security Assertion Markup Language from the Organization for the Advancement of Structured Information Standards (OASIS), public-key infrastructure and X.509, the Shibboleth System requires a content provider to run the service provider software and a user to belong to a network that runs the identity provider software. Say an academic visits another university's resource site, which is running the Shibboleth service provider component. He will be asked to select his home institution from a list of organizations with site privileges. Once he chooses his university, the browser automatically sends him to his university's sign-on page, which runs the identity provider software. The academic logs on with his familiar name and password; when he is authenticated, he is sent back to the destination resource site and is free to access the information there.
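The exchange above can be sketched schematically. The classes and attribute names below are invented for illustration; real Shibboleth traffic uses signed SAML messages, not Python dictionaries. The sketch does capture the privacy property Klingenstein describes: the identity provider releases only an affiliation attribute, never the user's name.

```python
# Schematic of the federated sign-on flow: the home organization
# authenticates the user, then asserts only the minimal attribute
# the resource site needs to decide eligibility.
class IdentityProvider:
    def __init__(self, org: str, users: dict):
        self.org, self.users = org, users

    def sign_on(self, username: str, password: str) -> dict:
        if self.users.get(username) != password:
            raise PermissionError("bad credentials")
        # Minimal release: affiliation only, no personal information.
        return {"issuer": self.org, "affiliation": "member"}

class ServiceProvider:
    def __init__(self, trusted_orgs: set):
        self.trusted_orgs = trusted_orgs

    def grant_access(self, assertion: dict) -> bool:
        return (assertion["issuer"] in self.trusted_orgs
                and assertion["affiliation"] == "member")

idp = IdentityProvider("example-university", {"prof": "s3cret"})
sp = ServiceProvider(trusted_orgs={"example-university"})
assertion = idp.sign_on("prof", "s3cret")
print(sp.grant_access(assertion))   # eligible, with no personal data released
```

Note that the service provider never sees a password or a name; trust in the issuing organization, not knowledge of the individual, is what unlocks the content.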
Shibboleth already is being put through its paces within the enterprise. The Pennsylvania State University has tested the system to allow students secure access to the college's Napster music service. Meanwhile, the grid computing community has embraced Shibboleth as a key security technology.
On the flip side, content providers are panicked that the distributed nature of data centers - with detached devices and on-the-fly extended enterprise partnerships - lessens their control and jeopardizes the security and integrity of data. In an era of compliance and regulatory restrictions, this is unacceptable.
But content management researchers are working on digital rights management platforms that would solve this problem - securing data even if it is out of the network's reach. IT organizations "need to understand where and how data is being used," says Hari Reddy, director of business development at ContentGuard, founded by former PARC researchers. "How can you manage data when it leaves your security framework?" He points to extended enterprises and the sharing of data as one complication. "I may want to stop the specific usage of data [we've exchanged] at some point."
Companies must be able to map policies to data and have those policies act as authorization tools, he says. "People not only want to manage how data is consumed, but also how it is distributed," he says.
ContentGuard continues to develop Extensible Rights Markup Language (XrML), a de facto industry standard submitted to OASIS and other standards groups as the basis for any digital rights language specification. XrML extends the range of rights-enabled business models for digital content and Web services. It lets IT managers place detailed restrictions on content that is distributed beyond the enterprise walls and lets them manage the life cycle of data and Web services.
For example, a company might place a control to distinguish those who are allowed to use the data or Web service vs. those who can distribute it. "This allows the creator of the data to say, 'Alice has the right to view it, but Boston University has the right to license and distribute it,'" Reddy says.
XrML is part of a multitiered distribution architecture that must include a time element to limit how long data remains valid, Reddy says. A CFO could set an expiration date of 30 days past publication when distributing financials, or a vendor sending out a technical manual could mark the material as invalid after six months. This context and control is needed as data gets further from its origin.
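The two XrML ideas just described, per-principal grants (view vs. distribute) and a time limit on validity, can be modeled in a few lines. The field names below are invented for illustration; real XrML licenses are XML documents with a far richer grammar of principals, rights and conditions.

```python
# Illustrative rights model: each principal holds a set of rights,
# and every right expires with the license's time window.
from datetime import date, timedelta

class License:
    def __init__(self, grants: dict, issued: date, valid_days: int):
        self.grants = grants                      # principal -> set of rights
        self.expires = issued + timedelta(days=valid_days)

    def permits(self, principal: str, right: str, today: date) -> bool:
        if today > self.expires:
            return False                          # stale data is invalid for everyone
        return right in self.grants.get(principal, set())

# "Alice has the right to view it, but Boston University has the right
# to license and distribute it" -- with a 30-day expiration window.
lic = License(
    grants={"alice": {"view"}, "boston-university": {"view", "distribute"}},
    issued=date(2024, 1, 1),
    valid_days=30,
)
print(lic.permits("alice", "view", date(2024, 1, 15)))        # within window
print(lic.permits("alice", "distribute", date(2024, 1, 15)))  # right not granted
print(lic.permits("alice", "view", date(2024, 3, 1)))         # expired
```

Because the policy travels with the license rather than living in a perimeter firewall, the same checks apply wherever the data flows, which is the property Reddy argues extended enterprises need.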
He says the content store, which used to manage data inside the network, would become the manager for data no matter where it flows.
Gittlen is a freelance technology editor in Northboro, Mass. She can be reached at email@example.com.