SSL VPN: How we did it

Reviews
Dec 19, 2005 | 6 mins
Authentication | Network Security | Networking

An outline of how we conducted the Clear Choice Tests of SSL VPN gear.

Testing SSL VPNs is a difficult and involved process. We needed authentication servers, application servers and a heavy-duty testbed that would let us evaluate what worked and what didn’t. As part of our invitation to participate in the test, we wrote a testing methodology and circulated it for comments among the SSL VPN development community (see requirements for SSL VPNs).

We started by building our simple security policy based on a company with three main groups of users. Each group had different security and application requirements that we hoped would help exercise the products we were testing, as well as show their differences.

In writing the security policy, we mixed fine-grained access control (for example, some Web servers were only partially accessible to some groups), end-point security policy (for example, Windows users must have anti-virus software with recent virus definitions, or they get access only to a subset of resources) and different types of resources to be protected by the SSL VPN device, including Web-based applications, mail servers, Windows Terminal Services, Citrix Presentation Server, Telnet and Secure Shell (SSH) servers, Windows’ SMB and FTP file servers, and our VoIP network.
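
To make the policy concrete, it helped to think of it as a mapping from user groups to end-point requirements and permitted resources. The sketch below is a minimal Python illustration of what we wrote on paper; the group names, resource labels and thresholds are made up, and it is not any vendor’s configuration syntax.

    # A paper policy reduced to data: three groups, each with its own
    # end-point requirements and permitted resources (all names illustrative).
    SECURITY_POLICY = {
        "engineering": {
            "endpoint": {"antivirus": "any", "definitions_max_age_days": 7},
            "resources": ["webmail", "ssh", "smb-files", "terminal-services"],
        },
        "sales": {
            "endpoint": {"antivirus": "corporate-standard"},
            "resources": ["webmail", "crm-web", "citrix-portal"],
        },
        "contractors": {
            # No end-point guarantees, so only a small web-only subset.
            "endpoint": {},
            "resources": ["webmail"],
        },
    }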

Our test plan comprised seven phases (see complete test plan).

We started by evaluating how well each SSL VPN device worked with our authentication servers (see results). We set up five authentication systems, including a RADIUS server, a Sun iPlanet LDAP server, a Windows Active Directory domain, an RSA SecurID authentication server (formerly called the ACE server), and a small PKI based on OpenSSL with digital certificates. Each SSL VPN device was tested to see whether it could authenticate against each of these systems and retrieve group information.
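
As a rough illustration of what this phase exercised, the Python sketch below performs the sort of bind-and-group-lookup we expected each device to manage against the iPlanet directory. It uses the ldap3 library; the host name, DNs and credentials are hypothetical, and the devices themselves do this natively rather than through scripts.

    from ldap3 import ALL, Connection, Server

    # Hypothetical directory host and entries standing in for our iPlanet server.
    server = Server("ldap.testlab.example.com", get_info=ALL)
    conn = Connection(
        server,
        user="uid=jdoe,ou=people,dc=testlab,dc=example,dc=com",
        password="secret",
        auto_bind=True,  # raises LDAPBindError if the credentials are rejected
    )

    # Pull back the groups that list this user as a member.
    conn.search(
        "ou=groups,dc=testlab,dc=example,dc=com",
        "(uniqueMember=uid=jdoe,ou=people,dc=testlab,dc=example,dc=com)",
        attributes=["cn"],
    )
    print("authenticated; member of:", [entry.cn.value for entry in conn.entries])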

In the second phase, we tried to implement our security policy in each SSL VPN device. We wanted the devices to enforce the paper policy we had started with. Although security policy is often swayed by what is technically possible, we were purists and wrote our policy without any concern for what the products could actually do. We started with the basic policy of group-based access control; once each device could enforce that, we layered on the end-point security policy (see results).
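
A minimal sketch of the enforcement logic we wanted the devices to apply, reusing the hypothetical SECURITY_POLICY table from the earlier sketch; the function and field names are ours, not any product’s.

    def allowed_resources(group, endpoint_report, policy=SECURITY_POLICY):
        """Resources a user may reach, given the group returned by the
        authentication server and a (hypothetical) end-point scan report."""
        entry = policy.get(group)
        if entry is None:
            return []
        if entry["endpoint"].get("antivirus") and not endpoint_report.get("antivirus_running"):
            # Posture check failed: fall back to the minimal web-only set.
            return ["webmail"]
        return entry["resources"]

    # A healthy engineering laptop versus the same user on an unknown machine.
    print(allowed_resources("engineering", {"antivirus_running": True}))
    print(allowed_resources("engineering", {"antivirus_running": False}))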

Our third phase was interoperability testing. Each of the 135 test cases of browser, platform and application was separately tested, and the results were logged (see results). Usually, in this phase, we had quite a bit of back-and-forth with the vendor’s technical support department, trying to figure out whether the problems we saw were the result of bugs or configuration errors.
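
We tracked the results in a simple matrix of client against application. The sketch below shows one way such a checklist could be generated; the client and application labels are illustrative shorthand, not the exact breakdown of our 135 cases.

    import csv
    import itertools

    # Illustrative shorthand for the client platforms and applications under test.
    clients = ["XP-admin/IE6", "XP-user/IE6", "XP-admin/Firefox", "XP-user/Firefox",
               "Win2000/IE5.5", "OSX/Safari", "OSX/Firefox", "Treo650/Blazer",
               "Nokia9500/Opera"]
    applications = ["OWA", "Domino", "SquirrelMail", "WhatsUp", "APC", "Avocent",
                    "Citrix", "TerminalServices", "Telnet", "SSH", "CIFS", "FTP",
                    "SIP-softphone", "HTTP", "HTTPS"]

    with open("interop_results.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["client", "application", "result", "notes"])
        for client, app in itertools.product(clients, applications):
            # The result and notes columns are filled in by hand after each run.
            writer.writerow([client, app, "", ""])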

To test client interoperability, we had six client systems in our test labs. Two Dell laptops ran Windows XP, one logged in as administrator and one logged in as a non-privileged user. Both XP environments also had two browsers to test with: Internet Explorer 6 and Firefox 1.0.7. We also used an IBM Thinkpad running Windows 2000 and Internet Explorer 5.5 as a test system. We installed Firefox and Safari browsers on an Apple PowerBook running OS X 10.4 to check Mac compatibility. For PDAs, we used a Treo 650 (PalmOS) with Version 4 of the built-in Blazer Web browser, and a Nokia 9500 (Symbian) phone with the built-in Opera browser.

For application testing, we set up a wide variety of applications that might represent typical enterprise choices.

We set up nine Web servers with different applications. Our test servers had:

  • Two JavaScript-based monitoring systems, including Ipswitch’s WhatsUp (to test JavaScript rewriting).

  • A slow Webmail system built on the open source SquirrelMail application (to test how well these devices worked with slow Web servers).

  • Two embedded device Web servers from APC and Avocent (to test how well these devices worked with very constrained Web servers).

  • Simple HTML-only Web servers reachable over both HTTP and HTTPS (to test how well these devices worked with differing back-end SSL certificates; a certificate-check sketch follows this list).

  • Microsoft Exchange’s Outlook Web Access and IBM’s Domino Web Access (to test how well these devices worked with complex Web applications).
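
A quick way to see which of these back ends present trusted certificates and which use self-signed ones (the variation the HTTP and HTTPS servers were meant to exercise) is a short probe like the one below. It uses only the Python standard library, and the host names are hypothetical stand-ins for our test-bed servers.

    import socket
    import ssl

    # Hypothetical HTTPS back ends from the test bed.
    HTTPS_BACKENDS = ["owa.testlab.example.com",
                      "squirrelmail.testlab.example.com",
                      "apc-ups.testlab.example.com"]

    context = ssl.create_default_context()
    for host in HTTPS_BACKENDS:
        try:
            with socket.create_connection((host, 443), timeout=5) as sock:
                with context.wrap_socket(sock, server_hostname=host) as tls:
                    issuer = dict(pair[0] for pair in tls.getpeercert()["issuer"])
                    print(f"{host}: trusted certificate from {issuer.get('organizationName')}")
        except ssl.SSLCertVerificationError:
            print(f"{host}: self-signed or otherwise untrusted certificate")
        except OSError as exc:
            print(f"{host}: unreachable ({exc})")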

To test how well these devices worked with Citrix and Windows Terminal Services, we set up a network of four Windows 2003 servers, including an Active Directory server, an Internet Information Server to act as a Citrix Web portal, and two servers to provide Citrix Presentation Server and Windows Terminal Server applications.

We also brought mainframe servers into the test bed with Telnet, SSH and FTP servers on them, as well as a Windows 2003 server with both Common Internet File System (Microsoft file sharing) and FTP file servers. Finally, to test the network-extension client included with the products under test, we installed a Session Initiation Protocol soft-phone application from CounterPath (formerly Xten Networks) on each of the test systems and linked it to our existing SIP telephone system.

In the fourth phase, we evaluated the end-point security features of each product by testing how well they worked on our various platforms and how well they were able to match our simple security policy. We looked to see whether end-point security could detect our corporate standard anti-virus, Sophos. In our library-and-Internet-café test, we set up the devices to detect any anti-virus (if they had that capability). Once the end-point security scan results were in, we evaluated how well the SSL VPN devices were able to use that information as part of enforcing security policy. We also looked at the protective services, such as virtual desktops and cache cleaners, available on the device specifically as an adjunct to end-point security (see end-point test results).
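
The signatures each vendor’s end-point agent looks for are proprietary, but the general shape of a "corporate-standard anti-virus present" check is simple. The sketch below is only an approximation; the process names are assumptions rather than anything from Sophos documentation.

    import subprocess
    import sys

    # Assumed process names for the corporate-standard anti-virus; not taken
    # from vendor documentation.
    SOPHOS_PROCESSES = {"savservice.exe", "alsvc.exe"}

    def sophos_running():
        """Rough stand-in for an end-point agent's anti-virus presence check."""
        if not sys.platform.startswith("win"):
            return False
        listing = subprocess.run(["tasklist"], capture_output=True, text=True).stdout.lower()
        return any(name in listing for name in SOPHOS_PROCESSES)

    print("corporate anti-virus detected:", sophos_running())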

In the fifth phase of testing, we used our experience with management and product configuration tools to analyze and summarize management, accounting, auditing, reporting, and other aspects of product operation and configuration (see story on manageability testing). We also looked at user workplace and portal functionality (see portal test results).

In the sixth phase, we evaluated high-availability and scalability capabilities for each product (where vendors elected to participate in this testing). We looked at how well each product’s high-availability and scalability features worked by testing multiple failure scenarios and usage environments, including both Web-based and network-extension SSL VPN clients (see complete test results).
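
One way to make failover visible from the client side is to poll the portal address while a failure is induced; the length of the run of failed polls approximates user-visible downtime. The sketch below uses only the Python standard library, and the portal URL is a hypothetical stand-in for the cluster under test.

    import ssl
    import time
    import urllib.request

    PORTAL = "https://sslvpn.testlab.example.com/"  # hypothetical cluster address
    context = ssl.create_default_context()

    outage_started = None
    while True:  # stop with Ctrl-C once the failover scenario is finished
        try:
            urllib.request.urlopen(PORTAL, timeout=2, context=context)
            if outage_started is not None:
                print(f"portal back after {time.time() - outage_started:.1f}s outage")
                outage_started = None
        except OSError:
            if outage_started is None:
                outage_started = time.time()
                print("portal unreachable; outage started")
        time.sleep(1)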

Finally, in the seventh testing phase, we evaluated product performance. Results of that testing will appear in a future Network World story.

Network World will publish results of an additional test specifically to evaluate the quality of VoIP-over-SSL in January.

Thanks to vendors

Special thanks go to the vendors who supplied components of our test bed. Those include VMware, for our VMware licenses; Citrix, for software and a heavy dose of technical support to get our Citrix farm installed; Microsoft, for providing Exchange, Windows 2003 and Terminal Server licenses; Avocent, for the KVM switch that let us keep track of 12 different application servers; APCON, for the automated patching system; Spirent, for providing benchmarking hardware; Apple Computer, for providing the PowerBook laptop; and RSA, for providing SecurID tokens and an authentication server.
