by Betsy Yocom, Randall Birdsall and Diane Poletti-Metzel, Network World Global Test Alliance

Gigabit intrusion-detection systems

Reviews
Nov. 4, 2002

Tests show network IDS products have a ways to go to get accurate detection at gigabit speeds.

In our tests of five leading network intrusion-detection systems and the popular open source Snort, performance was spotty during baseline testing and degraded by as much as 50% on some products when we opened the throttle to gigabit speeds.

Our first step was to run 28 well-known attacks against each product in an untuned state on a wire that had no other traffic running on it. Most products detected only about half the attacks.

When the systems were tuned, most products caught an additional two or three attacks, but still missed a good number of them.

IntruVert’s IntruShield 4000 was a bright spot. It detected the greatest number of attacks in every test (see the performance chart), and wins the Network World Blue Ribbon Award. A newcomer to this market, IntruShield is well-designed and feature-rich.

Internet Security Systems’ RealSecure Gigabit Network Sensor Version 7.0 didn’t detect as many attacks as IntruVert’s product overall (16 out of 28 at baseline with no tuning and 25 with tuning), but deserves the runner-up prize because its ability to detect attacks did not change at gigabit speeds. The other three commercial products tested at Miercom’s lab facility in Princeton Junction, N.J., were the Dragon IDS Server Appliance and Dragon IDS Sensor Appliance; Intrusion’s Intrusion SecureNet; and Symantec’s (formerly Recourse) ManHunt Version 2.11. We also tested the open source package Snort, running it on Acid.

Our primary focus was to determine how well these products performed under a gigabit traffic load, which was 970M bit/sec in our tests. We ran the tests at slightly less than a full gigabit load to ensure that the link was not overutilized and all our attacks could get through (see How we did it).

In our baseline tests with no traffic, we did not tune the systems in any way, but we did turn on all signatures and protocol anomalies. We delivered 28 attacks to each system, including commonly known denial-of-service, surveillance and probe attacks, as well as attacks designed to evade an IDS, such as Stick and Fragrouter (see Attack List).

IntruShield 4000 detected the highest number of attacks – 24 out of 28. Dragon, RealSecure and Snort each caught 16 of the 28 attacks. ManHunt detected 14 attacks, and SecureNet caught 11.

A key factor in IntruVert’s strong showing is a good implementation of signature-based attack detection, in which packets’ contents are compared against a database of known attack patterns, and protocol anomaly detection (PAD), in which the product verifies that a traffic flow is not violating its defined protocol, with a violation signaling suspicious activity. Except for Snort, all the products supported both techniques, but IntruVert married the two technologies especially well.
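
To make the distinction concrete, here is a minimal sketch in Python of how a signature match and a protocol anomaly check might each flag the same request. The patterns, thresholds and function names are invented for illustration only and do not reflect any tested product’s implementation.

```python
# Illustrative only: a toy signature table and a toy HTTP sanity check.
SIGNATURES = {
    b"cmd.exe": "IIS directory traversal attempt",      # invented example patterns
    b"\x90\x90\x90\x90": "possible NOP-sled shellcode",
}

def signature_match(payload: bytes) -> list:
    """Return the names of any known attack patterns found in the payload."""
    return [name for pattern, name in SIGNATURES.items() if pattern in payload]

def protocol_anomalies(method: str, request_line_len: int) -> list:
    """Very rough HTTP sanity checks standing in for real protocol decoding."""
    alerts = []
    if method not in {"GET", "POST", "HEAD"}:
        alerts.append("unexpected HTTP method: " + method)
    if request_line_len > 2048:
        alerts.append("oversized request line (possible buffer overflow)")
    return alerts

# An IDS that marries both techniques raises an alert if either check fires.
request = b"GET /scripts/..%c1%1c../winnt/system32/cmd.exe HTTP/1.0\r\n"
for alert in signature_match(request) + protocol_anomalies("GET", len(request)):
    print("ALERT:", alert)
```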

Overall, the products caught about half the attacks. What accounts for this lackluster showing on a nontuned system is that some vendors turn off signatures to heighten performance. Vendors also make this trade-off so that administrators are not overwhelmed by the many false-positive alarms they receive before systems are tuned (see review).

The extent of an IDS’s signature database also is a factor. The more attack signatures a product supports, the better its rate of detection without system tuning. For example, although ManHunt supports a signature database and PAD, it relies more on the latter. If an exploit or attack follows protocol, then it’s not detected unless the product has a signature to catch it. ManHunt supports a small signature database and doesn’t do as well in this type of test.

We next ran the set of attacks that each product detected at baseline traffic levels against these still untuned products, but filled the pipe. RealSecure caught 16 out of the 16 attacks, ManHunt caught 13 out of 14 attacks, and IntruShield 4000 caught 21 out of 24 attacks. Dragon and Snort had the poorest overall showing, catching only 3 out of 16 attacks and 6 out of 16 attacks, respectively.

Tuning helps, but not much

This same set of tests was conducted again on tuned systems. Tuning meant that the vendor could tweak any signature code included in the product’s database to let it catch or identify an attack correctly. For example, the vendor could change User Datagram Protocol (UDP) to TCP, or vice versa, to catch a Back Orifice attack, or it might decrease the threshold for the number of TCP connects needed to catch an NMAP attack. It also might turn off processor-intensive engines, signatures and features to enhance performance.
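
The Python snippet below illustrates those kinds of tweaks against a made-up signature table. The field names and values are hypothetical and are not any vendor’s actual rule format; the point is simply what “tuning” amounts to in practice.

```python
# Hypothetical signature table -- invented for illustration, not the rule
# format any of the tested products actually uses.
signatures = {
    "Back Orifice":     {"protocol": "udp", "port": 31337, "enabled": True},
    "NMAP TCP scan":    {"threshold_connects": 100, "window_secs": 60, "enabled": True},
    "Deep HTTP decode": {"enabled": True, "cpu_cost": "high"},
}

# Correct a signature written against the wrong transport protocol.
signatures["Back Orifice"]["protocol"] = "tcp"

# Lower the number of TCP connects needed before a scan is reported.
signatures["NMAP TCP scan"]["threshold_connects"] = 20

# Trade some detection for speed by disabling a processor-intensive engine.
signatures["Deep HTTP decode"]["enabled"] = False

for name, settings in signatures.items():
    print(name, settings)
```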

Some products were tuned in an hour; some took much longer. How familiar the vendor’s technical representative is with the system and with tuning it plays an important role – something end users should consider. A good question to ask is what type of assistance the vendor provides for system tuning.

Vendors were not allowed to add new signatures to a database, customize existing signatures or download signatures from the Snort database to use during testing. In a customer deployment, customers presumably would take advantage of all of these tuning options. We did not allow this to happen in the lab because we would then be assessing how well a vendor’s on-site technician could tune a system rather than evaluating gigabit performance, which was our intent.

Overall, tuning helped the products detect a few more attacks, but the increase was not dramatic. At baseline traffic levels and tuned, the IntruShield 4000 caught all 28 attacks. RealSecure benefited the most from tuning, detecting 25 out of 28 attacks – compared with only 16 without tuning. Snort improved from 16 to 18 and Dragon went from 16 to 17. Tuning had no effect on ManHunt, which detected 14 attacks in both rounds.

When we threw gigabit traffic at these tuned boxes, again using only attacks the boxes could catch at baseline, RealSecure caught 25 of 25 attacks delivered at gigabit speeds. IntruShield 4000 caught 27 out of the full 28. ManHunt caught the same number of attacks (13 out of 14) it had on an untuned system.

SecureNet’s performance was much improved by tuning. In the baseline tests on an untuned system, SecureNet caught only four out of 11 attacks at gigabit speeds, but it caught 14 out of 15 attacks when tuned.

Dragon and Snort showed some improvement on a tuned system. Dragon and Snort each caught eight attacks (out of 17 and 18 baseline attacks, respectively). All the attacks used in these tests were legacy attacks (no variants), and these systems should have been able to detect them. But many products missed common attacks – most often because a signature was not correctly written or was absent from the product’s signature database altogether.

All the vendors let customers add their own signatures to the IDS database and all provide signature customization to improve performance. But based on these results, it’s clear that good attack-detection rates require significant time and tuning to achieve.

Management

While the focus of this review was on performance, we also assessed management applications on the products, observing them under heavy load.

A Web-based management graphical user interface (GUI), supported on Dragon and IntruShield 4000, allows management of the product from any venue. On the Dragon, having the Web-based management and event viewer on the same Web page also facilitates interacting with the device.

But the Dragon’s management application had some downsides. We couldn’t easily clear real-time events from the screen after they were viewed, and “pushing” the configuration out to the devices was unintuitive, requiring that we complete several steps on multiple screens.

The IntruShield 4000’s Java-enabled Web-based GUI is intuitive, logically designed and clean. The product supports informative graphics for just about anything we wanted to track, and producing them was easy – just a click on a “graph” button and they were delivered automatically.

SecureNet had the best drill-down granularity, giving users the ability to sort events via an event tree they create. However, using three different tools to manage the SecureNet sensor was cumbersome, and many of the applications had a different look and feel, which affected ease of use.

A key feature on RealSecure was the display of packet data that the IDS tags as suspect on the top-level event description. Other products did this, but not on the main screen. RealSecure also provided good icons and visual alerts to denote the priority of alarms.

Setting up and managing Snort is not for the technically challenged. Snort runs with a variety of other packages – a plus – but many of those might not work easily with one another, so familiarity with applications such as MySQL and the Apache Web server, as well as with Unix, is highly recommended. Extensive documentation and installation instructions exist on the Internet for Snort (go to www.snort.org), but you’ll need to search around to find the appropriate tools to get this product up and running.

We ran Snort on Acid, a free, open source event viewer. Searching through events on Acid was powerful but not intuitive (we had to search using SQL syntax), and there were many search options with which to work. Acid is for event viewing only; all parameter and configuration changes had to be made through the command line.

Because Snort is based on open source code, it is highly configurable and customizable, but there are no formal technical resources on which to rely when using this product (and no one to blame if things go awry). There are companies that provide Snort tech support commercially, which you might want to consider in an enterprise environment.

ManHunt’s GUI was the easiest to use, and although it lacked the granularity of some of the other products, it got the job done and provided some useful features. On the down side, the ManHunt GUI was slow to accept changes and provided no indication they had taken effect.

Features

While all the products support a range of features, including paging alerts, high availability and integration with firewalls, there were a couple of standouts in this category.

IntruShield 4000 supports an outstanding feature set. Chief among its features is the ability to “learn” the network, based on the vendor’s proprietary learning software, also known as statistical anomaly analysis. With a solid view of how the network looks over time, the product can quickly detect when something different takes place.
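
Stripped to its essentials, statistical anomaly analysis means learning what normal traffic looks like over time and flagging large deviations from it. The Python sketch below shows that general idea with invented metrics and a simple standard-deviation test; it illustrates the concept only and is not IntruVert’s proprietary learning software.

```python
from statistics import mean, stdev

# Hypothetical per-minute counts gathered during a learning period.
baseline = {
    "syn_packets": [120, 135, 118, 142, 130, 125, 138, 129],
    "dns_queries": [300, 280, 310, 295, 305, 290, 285, 300],
}

def is_anomalous(metric, value, k=3.0):
    """Flag a value more than k standard deviations from the learned mean."""
    history = baseline[metric]
    return abs(value - mean(history)) > k * stdev(history)

print(is_anomalous("syn_packets", 131))   # False: within the learned range
print(is_anomalous("syn_packets", 5000))  # True: looks like a SYN flood
```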

ManHunt supports an outstanding coalescing feature that ties a collection of connected events together. If the IDS detects 1,500 instances of the same attack, it reports them as one. ManHunt’s coalescing function was impressive because it can see different types of attacks that are part of the same incident and put them together.
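
The Python sketch below shows the coalescing idea in simplified form: alerts that share a source and destination are folded into one incident record with a count and a set of attack types. The grouping key and data layout are invented for illustration and are not ManHunt’s implementation.

```python
from collections import defaultdict

# 1,500 near-identical alerts plus a related scan from the same source.
alerts = [{"src": "10.1.1.5", "dst": "172.16.0.2", "name": "SYN flood"}] * 1500
alerts.append({"src": "10.1.1.5", "dst": "172.16.0.2", "name": "portscan"})

incidents = defaultdict(lambda: {"count": 0, "attack_types": set()})
for alert in alerts:
    key = (alert["src"], alert["dst"])   # group related events by endpoint pair
    incidents[key]["count"] += 1
    incidents[key]["attack_types"].add(alert["name"])

for (src, dst), info in incidents.items():
    print(src, "->", dst, ":", info["count"], "events,",
          "types:", sorted(info["attack_types"]))
```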

We took a brief look at Intrusion’s new SecureNet Provider 2.1 – which included an enhanced forensics tool that offered several impressive data-mining options. This new tool supports a “case files” application that lets you drag and drop noncontiguous events and ranges of events into case files. We could add notes, links and URLs to the case files and also save or export them to other applications for sharing the data, archiving or conducting further analysis.

The IntruShield 4000 also had the strongest showing in the configuration category with its support of optional, redundant hot-swappable power supplies and four redundant fans.

A plus for Snort in the configuration category is its ability to operate on just about any operating system. It also supports numerous options for logging events, including multilogging in plain text, syslog, MySQL and SNMP traps, among others.

You get what you pay for

Gigabit IDS products have a way to go before they offer full protection against attacks without significant tuning. Without it, most of these products detected only half of the attacks directed at them, many of which have been around long enough that these systems should have detected them.

The IntruShield 4000 delivered an impressive showing in this review, but it comes with a hefty price tag – $108,000 for the system we tested, compared with only $15,000 for Dragon and $18,000 for SecureNet.

When you compare the percentage of attacks detected without tuning vs. the total price of the IDS systems as tested, you’ll find that you get what you pay for: The higher the price of the system we tested, the better the overall attack-detection rate without significant tuning, and vice versa.

If you plan to use a gigabit IDS at a critical, high-traffic area of the network, plan to pay a price – either in the time it will take to tune the system for optimal performance or in dollars to get a product that has been engineered for high traffic loads.

Yocom is editorial manager, Birdsall is test lab engineer and Poletti-Metzel is test lab manager at Miercom, a network test lab and consultancy in Princeton Junction, N.J. They can be reached at byocom@miercom.com, rbirdsall@miercom.com and dpoletti@miercom.com.