Do you know where your security holes are?

Qualys and McAfee lead the way in six-vendor test of automated tools that scan and report on vulnerabilities

We all worry that there's some lurking security problem in our servers. We do what we can, patching, following best practices, keeping up-to-date with training and news. But wouldn't it be great to have an automated tool to check our work? That's the promise of vulnerability analyzers: products that detect problems in configuration, applications, and patches.

Used correctly, a vulnerability analyzer can help you stay on top of hundreds or thousands of servers, network devices, and embedded systems. You'll know where to focus your efforts for security remediation, and you'll know that you have a system in place to keep little things from slipping through the cracks and becoming big things.

However, used incorrectly, these analyzers can generate thousands of pages of confusing information, frustrate security and network managers, and end up causing more problems than they solve.

We evaluated six market-leading products for their vulnerability scanner results, reporting features, product manageability, workflow tools, and interoperability with other enterprise products.

Two products stood out: the SaaS-based QualysGuard VM and McAfee's Vulnerability Manager, a software- or appliance-based product.

The SAINTmanager product line came in third, buoyed by a powerful scanner but burdened by a weak GUI. Our favorite challenger, eEye Retina CS, paired a strong scanner with a newly minted GUI. But we found a number of bugs and design flaws that need to be fixed before the product is ready for enterprise deployment. Retina is a relatively new product that is under active development. During the three months we were testing, we saw one upgrade of Retina, and eEye released another just before we went to press.


Critical Watch's FusionVM product, another SaaS-based offering, has some great ideas in it, but the execution is lacking. Lumension Scan, a product with a more limited scope, did a good job at what it was designed for, but didn't have the enterprise focus we were seeking.

Scanning for stuff

All vulnerability analyzers have a common core: a scanner that finds vulnerabilities. If you were doing a penetration test on a network, the scanner would be all that you need.

In some products, notably eEye's Retina CS and SAINT Corporation's SAINTscanner/SAINTmanager/SAINTwriter, the scanner is a standalone entity that you could run without the reporting and management tools. (See Web scanning as an option.)

In others, such as QualysGuard VM and Critical Watch's FusionVM, the scanner is inseparable from the other pieces. If you're a security consultant who wants to just perform scans, eEye and SAINT will fit your needs best.

To get an idea of how well the scanners worked, we scanned three production networks at three companies, plus a specially-constructed test lab network. On the test lab network, we deliberately let four servers fall behind in their patches by two months: two Windows systems (Windows 2003 and 2008), a Linux system, and an OS X server. (Read how we conducted our tests.)

Then we turned on the vulnerability analyzers and evaluated the results.

First, a word of caution: vulnerability scanners can and will cause instability in your network. SAINT and Critical Watch did real damage on our network, managing to lock up one of our production Unix servers, and causing SAN hiccups that interrupted service to several clients.

Virtually all of the products caused our APC UPSs to reboot, which (fortunately) didn't affect anything but the management interfaces. So, be careful what IP addresses you scan and how the scans are run.

You'd think our networks wouldn't have gotten too far out of date in only two months, but these scanners had a lot to say. The winner by weight was eEye, which dumped a 180-page report on our desk, although McAfee won by count, telling us 537 different things about those four systems, 380 of which weren't specific vulnerabilities, but only informational items. Still, that left 84 critical vulnerabilities that McAfee wanted us to fix.

An obvious conclusion was that some vendors weren't doing a good job of data reduction. Yes, it's true that Adobe patch APSB10-14 covers 28 distinct vulnerabilities, but they're all fixed by a single patch, and treating that as 28 separate incidents -- as McAfee does -- simply encourages confusion.

Critical Watch had a similar issue, delivering all of the pieces of some of Microsoft's critical patches as separate elements, even though the remediation task was the same: install MS11-012 to fix five separately reported vulnerabilities. The pedantic security wonk might insist on knowing about each separate issue, but that's what drill-down reports are for. By default, this information should be combined into a more digestible format.
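The data reduction we're describing is simple to picture: group individual CVE findings by the patch that remediates them, and report at the patch level by default, with per-CVE detail available on drill-down. A minimal sketch of that grouping (the finding records and field names are hypothetical, not any vendor's actual format):

```python
from collections import defaultdict

def group_by_patch(findings):
    """Collapse per-CVE findings into one entry per remediating patch."""
    grouped = defaultdict(list)
    for f in findings:
        grouped[f["patch"]].append(f["cve"])
    return dict(grouped)

# Five separately reported vulnerabilities, one remediation task: MS11-012.
# (The CVE IDs here are placeholders, not the real ones covered by MS11-012.)
findings = [
    {"cve": f"CVE-2011-100{i}", "patch": "MS11-012"} for i in range(1, 6)
]

summary = group_by_patch(findings)
print(f"{len(summary)} patch to install, covering {len(summary['MS11-012'])} CVEs")
```

The default report shows one actionable item; the pedantic security wonk can still expand it to see all five underlying CVEs.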

Some results were very cut-and-dried. For example, on our test of Windows systems, we had a list of security updates and patches that Microsoft Update gave us and we expected to see each of those patches in the list of vulnerabilities for each system. With McAfee, Qualys, and eEye, we found everything in our checklist in their scan results. Critical Watch, Lumension, and SAINT didn't catch everything.

If that checklist of known missing security patches had been our entire test, ranking the products would have been easy. But each scanner had a lot to tell us, and figuring out whether the information was relevant or worthwhile was tricky.

For example, Lumension told us that we were running an out-of-date version of SecureCRT (true) on our Windows systems and rated the vulnerability as "high" (probably overkill). No other product we tested picked up on this. Does this mean that Lumension is better than the other scanners, which didn't catch this problem?

Well, yes, except that eEye found a circa-1999 protection setting on an obscure registry key that could be used by privileged users to further escalate their privileges during system boot. Does that make eEye better than the other scanners, which didn't identify the registry problem? You can go in this circle forever, as each product called out issues, mostly minor ones, that the others didn't.

False positives

We also looked at false positives: places where the scanners reported a vulnerability where none existed. This is a sensitive area for security purists. Some scanners simply treated the existence of a particular file as making a system vulnerable. For example, Lumension marked the kernel on our Linux test system as out-of-date, probably because it found the old kernel still installed as an RPM package. Except that we weren't running that kernel, so the system wasn't vulnerable. But the kernel was out there, spinning around on disk.
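The kernel example shows why package presence alone is weak evidence: a scanner that also checks what the system actually booted (e.g., via `uname -r`) can downgrade this class of finding. A simplified sketch of that logic, with hypothetical version data rather than any scanner's real checks:

```python
def kernel_finding(running_version, installed_versions, fixed_version):
    """Flag a kernel vulnerability only if the *running* kernel is old.

    A naive scanner flags any installed kernel package older than
    fixed_version; this version distinguishes on-disk from booted.
    """
    def as_tuple(version):
        return tuple(int(part) for part in version.split("."))

    stale_on_disk = [v for v in installed_versions
                     if as_tuple(v) < as_tuple(fixed_version)]
    if as_tuple(running_version) < as_tuple(fixed_version):
        return "vulnerable: running kernel predates the fix"
    if stale_on_disk:
        return "informational: old kernel packages installed but not running"
    return "clean"

# An old kernel sits on disk, but the patched one is what's running.
print(kernel_finding("2.6.32", ["2.6.18", "2.6.32"], "2.6.24"))
```

Reporting the stale package as informational rather than as a high-severity vulnerability keeps the finding honest without hiding it entirely.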

Of course, not all false positives were so difficult to judge. For example, Critical Watch identified our Mac OS X system as missing a Microsoft Windows patch, and eEye wanted patches for VMware and HyperV vulnerabilities on systems that didn't have either installed. We identified false positives in every scanner but Qualys.

We had other challenges in comparing scanners. The CVE (Common Vulnerabilities and Exposures) database supported by the U.S. Department of Homeland Security is the closest thing we have to a common naming standard. We would have expected vendors to make sure that their vulnerabilities were properly matched to CVE database entries, but we found errors and omissions in most of the scanners when trying to compare results. For example, we thought McAfee had identified a problem no one else had caught with CVE-2010-3886, until we found it hidden in eEye as Retina Audit 13156, without the CVE number.

We also looked at two other important criteria: reporting on evidence of the vulnerability, and cross-platform support.

Where's the evidence?

We found evidence -- or the lack of it -- to be a big differentiator. When a scanner reports a vulnerability, it is critical to have supporting evidence readily available. For example, one of the scanners reported a vulnerability of "user never logged in" on our Windows servers, without telling us which user it was. How useful is that? (Answer: not very.)

McAfee, Qualys and SAINT gave us great supporting evidence, with Critical Watch providing some evidence, but not as much as we wanted. eEye and Lumension were deficient in providing evidence, although eEye did bury some of it in reports that weren't visible in the Web interface.

Support for different operating system platforms was another differentiator, although one that may not matter to every network manager. All of the products supported Windows, but we found varying levels of commitment to non-Windows systems, such as Mac and Unix, and to infrastructure such as switches, routers and databases. SAINT, and to a lesser extent eEye, had a stronger focus on Unix systems than the other products.

Overall, we felt that Qualys and SAINT had the strongest core scanners, with McAfee nearly at the same level. Lumension, eEye and Critical Watch had higher numbers of false positives and false negatives, as well as poor evidence presentation when documenting vulnerabilities.


Vulnerability information hidden inside a product doesn't do network managers any good. We expect a vulnerability assessment product to be able to report on the information it finds in a way that maximizes understanding and minimizes wasted time.

Network managers will probably want to use a GUI for their reports, because they'll be diving down into details, looking at individual systems and vulnerabilities and focusing on tasks such as remediation and patching.

Auditors and managers may want printed reports that summarize information, giving a big picture view of where the enterprise is at risk and where they need to focus attention. We looked at all types of output under the general category of "reporting," to evaluate understandability, transparency, configurability and usefulness.

We discovered that there are really two types of vulnerability analyzers out there: scan-based and asset-based. A scan-based analyzer is one where all the reports and data are focused on scanning jobs. In other words, the analyzer runs a scan job, and then you can report on that job.

Since the vulnerability scanner runs by looking at a list of targets, running some selected scans and generating output, it's very natural to build a scan-based vulnerability analyzer. You just take the scanner that you worked so hard on and stick a fancy GUI on top of that. The scanner may track information such as trends, but it's not maintaining a detailed history on each system being scanned.

The alternative to a scan-based analyzer is an asset-based analyzer. (An "asset" can be a server, workstation, router or whatnot.) Here the focus is not so much on scanning jobs, but on information that is collected about assets. Over time, the scanner picks up bits and pieces of information about assets on your networks, building up a picture of vulnerabilities, configurations, and even patching history. Some of the vendors have started to call this "scan once, report many" in their literature.
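The "scan once, report many" idea boils down to keying findings by asset rather than by scan job, and merging each new scan into a running history. A toy model of the difference between the two views (the data structures are our own illustration, not any product's internals):

```python
from datetime import date

class AssetDatabase:
    """Accumulate per-asset vulnerability history across many scan jobs."""

    def __init__(self):
        self.assets = {}  # asset name -> list of (scan_date, findings)

    def ingest_scan(self, scan_date, results):
        """Merge one scan job's results into each asset's history."""
        for asset, findings in results.items():
            self.assets.setdefault(asset, []).append((scan_date, findings))

    def latest(self, asset):
        """Scan-based view: just the most recent results for one asset."""
        return max(self.assets[asset])[1]

    def history(self, asset):
        """Asset-based view: every finding ever recorded, with dates."""
        return self.assets[asset]

db = AssetDatabase()
db.ingest_scan(date(2011, 1, 5), {"web01": ["MS10-090"]})
db.ingest_scan(date(2011, 3, 5), {"web01": ["MS11-012"]})

print(db.latest("web01"))        # the scan-based answer
print(len(db.history("web01")))  # two scans' worth of knowledge, not one
```

A scan-based product effectively keeps only what `latest` returns; the asset-based model retains `history`, which is what makes trend and posture reporting possible.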

If you're running a single scan for a PCI audit or a penetration test, you'll be happy with scan-based vulnerability analyzers. However, if you're trying to incorporate vulnerability management as part of a continuous process, aimed at understanding the security posture across an enterprise network over time, you'll want an asset-based analyzer.

With this approach, you might take a long time to set up your scans, but you would expect to run them regularly and repeatedly, and contribute to a "body of knowledge" about your network.

An asset-based analyzer can always behave as a scan-based analyzer by simply reporting on the latest scan, and each of the products we tested did that perfectly. However, we had varying degrees of success getting products to zoom out of the scans and give us a big picture in their reports across multiple scans, over time.

Qualys came closest to our ideal of what perfect vulnerability analyzer reporting would look like, with McAfee nearly at the same level. Both let you generate easy-to-read, navigable reports, both scan-based and asset-based. The reporting interfaces in both are intuitive, and we liked that reporting is largely decoupled from scanning.

However, neither product was perfect. Qualys wouldn't let us run automated reports. That's a side-effect of its strong SaaS security model. Because Qualys is holding extremely sensitive information in its databases about your network and systems, including credentials you might have provided for scanning purposes, Qualys takes security very seriously.

In this case, maybe a little too seriously. Its encryption model keeps anyone from looking at your enterprise data unless you're actually logged into the Web-based user interface. Unfortunately, "anyone" in this case also includes Qualys' report writer. It doesn't run when you're not logged in, which means that getting Qualys to generate reports and send them to you on a regular basis is about 100 times harder than it should be. You can do it, if you insist.

The Qualys team gave us information on the QualysGuard API, a tool that would let someone remotely launch reports, perhaps out of a batch job. However, that's not really what we were looking for. Qualys is proud of how seriously it takes the security of sensitive data, but this is one case where it needs to figure out a simpler way to perform a simple task.
