Four experts share the latest research-and-development news.
By Sandra Gittlen, Network World, 03/21/05
If you think re-architecting your IT infrastructure with new data center technologies will help protect your company over
the next decade - think again. Experts at academic and vendor research labs around the country agree the move toward an automated,
on-demand, virtualized computing environment will increase the complexity of security.
With the new data center, IT executives "won't be able to think of the enterprise as a castle with a drawbridge and one point of entry to keep the
bad guys out. They'll have to look at every node in their network, every computer in the network, as something to defend individually,"
says Dirk Balfanz, a researcher at the Palo Alto Research Center (PARC) in California.
PARC is just one of many organizations focused on solving security problems that lie ahead. Among researchers' goals are reining
in complexity, improving identity management and developing platforms to protect digital assets.
"Complexity is the enemy of security," says Ed Skoudis, an instructor at the SANS Institute in New York. "The constant drive for new innovation and new functionality introduces complexity." And that complexity, in turn, works against security, he says.
Wireless networks, Web services and other emerging architectures will pose challenges for IT managers, Skoudis says. "They'll need to apply security at
different layers in the network. You have security layers for the older technology, and then you'll need more for the new
architectures," he says.
Researchers at PARC concur. They have been exploring ways to keep users who are confronted with security mechanisms at
too many layers from ignoring or, worse, tampering with security procedures. "If a security procedure is too difficult, users
won't deploy it. They may configure it incorrectly, or they'll just switch it off," Balfanz says, citing a PARC study that
found enabling 802.1X security on laptops took users two hours. Rather than struggle, users stopped using the network's
security features, which jeopardized network data, he says.
PARC's Security Research Group developed an architecture that eliminates these frustrations for users while letting IT organizations
employ tighter security. With this architecture, users connect their laptop to an "enrollment station" via a close-proximity
technology such as infrared. After being authenticated by the enrollment station, the user gains network access and then receives
a digital certificate that automatically configures network policy settings on the laptop. The whole process takes less than
two minutes, Balfanz says.
PARC is working with vendors to put this architecture in enterprise products.
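The enrollment flow described above can be sketched roughly as follows. Every class and field name here is invented for illustration, not taken from PARC's actual architecture or any vendor product:

```python
# Hypothetical sketch of the enrollment-station flow; names are invented
# for illustration, not taken from PARC's design.
import datetime
import secrets

class EnrollmentStation:
    """Authenticates a laptop over a close-proximity link (e.g. infrared)
    and hands back a certificate plus ready-made network policy settings."""

    def __init__(self, trusted_users):
        self.trusted_users = trusted_users

    def enroll(self, username, proximity_token):
        # The close-proximity exchange proves physical presence; here it
        # is modeled as a simple one-time token that must be present.
        if not proximity_token or username not in self.trusted_users:
            raise PermissionError("enrollment refused")

        # Issue a (mock) digital certificate for 802.1X authentication.
        certificate = {
            "subject": username,
            "serial": secrets.token_hex(8),
            "not_after": datetime.datetime.now() + datetime.timedelta(days=365),
        }

        # Bundle the policy settings the laptop should apply, so the user
        # never has to configure 802.1X by hand.
        policy = {"ssid": "corp-net", "eap_method": "EAP-TLS", "use_cert": True}
        return certificate, policy

station = EnrollmentStation(trusted_users={"alice"})
cert, policy = station.enroll("alice", proximity_token="ir-handshake")
print(policy["eap_method"])  # EAP-TLS
```

The point of the design is that authentication and configuration happen in one automated step, so there is nothing left for the user to switch off.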
Programming techniques, including those used in creating Web services, are the focus of security researchers at Cornell University
in Ithaca, N.Y. "Web services are being built with vulnerabilities" that could spread rapidly as Web services are shared across
networks, says Fred Schneider, director of the Information Assurance Institute at Cornell. "If you're building off of something else, you might not understand the properties of everything you're working
with. Security is not something you can evaluate by looking at the interface," he adds.
His research team, in cooperation with Intel and the Office of Naval Research, is working on a project called Language-Based Security. The aim is to put basic tenets of solid security, such as in-line reference monitors, information flow policies, proof-carrying
code and certifying compilers, into emerging technologies.
Schneider also advocates the use of safe systems languages that work much like the common C language. The hope is that bringing
these practical components into newer applications such as Web services will shore up flaws that extensible systems could
generate, he says.
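As a rough illustration of one of those tenets, an in-line reference monitor interposes a policy check on every sensitive operation so untrusted code cannot bypass it. The decorator below is a toy sketch of the idea, not Cornell's implementation:

```python
# Toy illustration of an in-line reference monitor: policy checks are woven
# into the program so every sensitive operation is mediated. All names here
# are made up for the example.
from functools import wraps

POLICY = {"read": True, "delete": False}  # what the code is allowed to do

def monitored(action):
    """Interpose a policy check before each call, the essence of an
    in-line reference monitor."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if not POLICY.get(action, False):
                raise PermissionError(f"policy forbids '{action}'")
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@monitored("read")
def read_record(record_id):
    return f"record {record_id}"

@monitored("delete")
def delete_record(record_id):
    return f"deleted {record_id}"

print(read_record(7))    # allowed by policy
try:
    delete_record(7)     # blocked before the function body ever runs
except PermissionError as exc:
    print(exc)
```

In real language-based security, such checks are inserted by a rewriter or compiler rather than by hand, which is what lets them be trusted.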
Security and privacy
Ken Klingenstein, director at the Internet2 Middleware and Security Initiative, says he's not optimistic that complexity can be reduced. "Short-sighted solutions, deep vulnerabilities
and the nature of how complexity compounds over time means we're in for difficult times," he says.
But improvements are possible in the way resources are shared across the Web, while user privacy is protected, Klingenstein
says. "What seemed to be orthogonal goals of security and privacy can be achieved together," he says.
Internet2 created the Shibboleth Project to address the need for simple, secure cross-organizational data access. The Shibboleth System, which comprises open source
identity provider and service provider components, lets network users access resources inside and outside their organizations
without suffering through multiple registration processes. They do not need to offer personal information unnecessarily.
"The goal is to have users release only the minimum amount of information to content providers to determine whether they are
eligible for that content," Klingenstein says. This will help reduce identity theft and fraud, he adds.
Using common security standards such as the Security Assertion Markup Language from the Organization for the Advancement of Structured Information Standards (OASIS), public-key infrastructure and X.509, the Shibboleth System requires a content provider to employ the service provider
software and a user to be part of a network that uses the identity provider software. Say an academic goes to another university's
resource site that is running the Shibboleth service provider component. He will be required to select among organizations
that have site privileges. Once he chooses his university, the browser automatically will send him to his university's sign-on
page, which runs the identity provider software. The academic then logs on with his familiar username and password; once
he is authenticated, he is sent back to the destination resource site and is free to access the information.
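The sign-on sequence described above can be sketched in miniature. The classes and attribute names below are illustrative, not Shibboleth's real APIs:

```python
# Rough sketch of federated sign-on: the service provider (SP) hands the
# user to their home identity provider (IdP), which checks credentials and
# releases only the attributes the SP needs. All names are illustrative.

class IdentityProvider:
    def __init__(self, users):
        self.users = users  # username -> (password, attributes)

    def sign_on(self, username, password, requested_attrs):
        stored_password, attributes = self.users[username]
        if password != stored_password:
            raise PermissionError("bad credentials")
        # Release only the minimum attributes the SP asked for, the
        # privacy goal Klingenstein describes.
        return {k: v for k, v in attributes.items() if k in requested_attrs}

class ServiceProvider:
    # The SP never sees the user's password, only the released assertion.
    required_attrs = {"affiliation"}

    def grant_access(self, assertion):
        return assertion.get("affiliation") == "member"

idp = IdentityProvider({
    "prof": ("s3cret", {"affiliation": "member", "name": "A. Scholar"}),
})
sp = ServiceProvider()

assertion = idp.sign_on("prof", "s3cret", sp.required_attrs)
print(assertion)                   # {'affiliation': 'member'}; name withheld
print(sp.grant_access(assertion))  # True
```

Note that the content provider learns the user is an eligible member without ever learning who the user is, which is how security and privacy stop being orthogonal goals.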
The biggest security challenges ahead
Six researchers answer the question, “What are the most critical security issues facing IT?”
“A big challenge for IT managers will be cross-site scripting attacks. Hackers are able to corrupt SQL queries in Web forms
and preempt what an application is intended to do. We don’t have tools to stop that and it requires an understanding of what
the interface needs to do to block the attack.”
— Fred Schneider, director of the Information Assurance Institute at Cornell University, Ithaca, N.Y.
“Radio frequency ID tags are going to be very easily hacked. That has serious implications because use of these tags is being
proposed in a lot of applications . . . . RFID technology should not be moved into high-security applications without attention
to authentication protocols and repudiation methods.”
— Mich Kabay, associate professor for information assurance at Norwich University, Northfield, Vt. (see www.nwfusion.com, DocFinder: 6323)
“The trustworthiness of an automated system is tied to the integrity and quality of the information that is shared between
systems. This requires new trust models and identity constructs. In some instances, messages will be sent by entities that
are unable to provide strong credentials in the form of digital certificates. Automated systems will need to be able to infer
trust based on other types of indicators.”
— Bob Gleichauf, CTO, Cisco’s Security Technology Group
“Spam on wireless devices is going to be a significant challenge. CIOs are paying to download every megabyte of data across
multiple channels. They don’t want to bear that cost. These devices are just not secure enough for the mass market yet.”
— Dave Steer, director of segment marketing, ARM
“The privacy issue is going to be more complicated because Big Brother can know who is doing what, where, how and when. Integration
of biometrics in security is going to be much tighter. . . . And there are going to be more devices to secure and more systems
that are going to require different types of access and control. I’d like to think that there’s going to be distributed systems
where you can write policies to provide controls to these devices.”
— Sharon Besser, director of security solutions, Check Point
“The problem is that a majority of enterprise data is no longer on servers — it’s on the clients. The bad point of this is
that a lot of security threats emanate from clients. I’d worry less about the man-in-the-middle sniffing and more about the
endpoints and people walking off with laptops.”
— Charles Palmer, department manager for security, networking and privacy, IBM Research
Shibboleth already is being put through its paces within the enterprise. The Pennsylvania State University has tested the
system to allow students secure access to the school's Napster music service. Meanwhile, the grid computing community has
embraced Shibboleth as a key security technology.
On the flip side, content providers are panicked that the distributed nature of data centers - with detached devices and on-the-fly
extended enterprise partnerships - lessens their control and jeopardizes the security and integrity of data. In an era of
compliance and regulatory restrictions, this is unacceptable.
But content management researchers are working on digital rights management platforms that would solve this problem - securing
data even if it is out of the network's reach. IT organizations "need to understand where and how data is being used," says
Hari Reddy, director of business development at ContentGuard, founded by former PARC researchers. "How can you manage data
when it leaves your security framework?" He points to extended enterprises and the sharing of data as one complication. "I
may want to stop the specific usage of data [we've exchanged] at some point."
Companies must be able to map policies to data and have those policies act as authorization tools, he says. "People not only
want to manage how data is consumed, but also how it is distributed," he says.
ContentGuard continues to develop Extensible Rights Markup Language (XrML), a de facto industry standard submitted to OASIS and other standards groups as the basis for any digital rights language
specification. XrML extends the range of rights-enabled business models for digital content and Web services. It lets IT managers
place detailed restrictions on content that is distributed beyond the enterprise walls and lets them manage the life cycle
of data and Web services.
For example, a company might place a control to distinguish those who are allowed to use the data or Web service vs. those
who can distribute it. "This allows the creator of the data to say, 'Alice has the right to view it, but Boston University
has the right to license and distribute it,'" Reddy says.
XrML is part of a multitiered distribution architecture that must feature a time element to limit how long data is valid,
Reddy says. A CFO could set an expiration date of 30 days past publication when distributing financials, or a vendor sending
out a technical manual could mark the material as invalid after six months. This context and control is needed as data gets
further from its origin.
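XrML itself is an XML grammar, but the two controls just described, per-principal rights and a validity window, can be modeled in a few lines. Everything below is an illustrative toy, not real XrML:

```python
# Toy model of rights-managed content: each principal gets a set of granted
# rights, and a time element invalidates the data after expiration. All
# names are invented for the example.
import datetime

class License:
    def __init__(self, rights, expires):
        self.rights = rights    # principal -> set of granted rights
        self.expires = expires  # moment after which the data is invalid

    def permits(self, principal, action, when=None):
        when = when or datetime.datetime.now()
        if when > self.expires:
            return False  # the time element: content has expired
        return action in self.rights.get(principal, set())

# "Alice has the right to view it, but Boston University has the right
# to license and distribute it" -- with a 30-day expiration attached.
lic = License(
    rights={
        "alice": {"view"},
        "boston-university": {"view", "license", "distribute"},
    },
    expires=datetime.datetime.now() + datetime.timedelta(days=30),
)

print(lic.permits("alice", "view"))        # True
print(lic.permits("alice", "distribute"))  # False
```

Because the license travels with the data rather than living in a network perimeter, the checks still apply after the content leaves the enterprise.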
He says the content store, which used to manage data inside the network, would become the manager for data no matter where it travels.
Gittlen is a freelance technology editor in Northboro, Mass. She can be reached at email@example.com.