How we tested Exchange 2007

To test Exchange 2007, we made heavy use of VMware virtual machines. We started by creating a separate Windows Active Directory domain on a Windows 2003 system. Then we installed Exchange 2007 on five virtual servers in various configurations, including a single multirole server, a two-node cluster, a mailbox server using local replication and an Edge Transport server.

Some of our early tests were of the high-availability and scalability features.

Once we had a single server acting in several Exchange roles, including mailbox server, hub transport between mailboxes, and client access server for Web users and mobile devices, we added more servers to the Exchange environment to test high availability.

We added a two-node cluster to test the high-availability capabilities of Exchange 2007 running in active/passive mode across multiple systems.

We also tested high availability by adding a fourth mailbox server running local continuous replication, which keeps a second copy of the data store on a separate set of disk drives.

To further test the high-availability functions, we used the features of VMware to stop systems abruptly and remove disk drives from under the various operating systems, and we recorded how Exchange 2007 behaved during these unusual events.

For our management and architectural evaluation, we included both installation and configuration of the system as a whole in our tests. We also tested a number of maintenance and operations procedures, including moving data stores, adding and deleting users, and enabling and disabling services. We worked with Microsoft technical support, online Web knowledge bases and the built-in documentation as well.

We finished our installations by adding Exchange 2007 on a fifth system to test the new Edge Transport role and Forefront Security antivirus and antispam inside of Exchange.

To evaluate Exchange 2007 in an Edge Transport role, we used open source tools (nmap, PeachFuzz and conventional network utilities) in a CentOS 4 Linux environment running in a VMware virtual machine to look for listening applications on the Edge Transport server, and to launch a wide variety of attacks on the services we found listening. We used the Web-based documentation as the source for vendor security guidance during the evaluation.
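The first step of that scan, finding listening applications, can be sketched as a minimal TCP connect probe in Python. This is only an illustration of the connect test, not a substitute for nmap, and the host and port list passed in would be whatever the tester chooses:

```python
import socket

def probe(host, ports, timeout=0.5):
    """Return the subset of ports on host that accept a TCP connection.

    A bare-bones stand-in for nmap's TCP connect scan: for each port,
    attempt a full connection and record it as open if the connect
    succeeds within the timeout.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means connected
                open_ports.append(port)
    return open_ports
```

Against an Edge Transport server, a tester would typically probe the SMTP port (25) and any management ports, then follow up on whatever answers.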

To test antispam functions, we used a variation on our 2004 antispam test, sending a real-time stream of about 11,000 e-mails from our normal corporate e-mail feed through the Exchange 2007 server as they were received.

Following Microsoft's guidelines, we configured Exchange 2007 to mark messages as "definitely spam" (those with a spam confidence level of 7 or above), "suspected spam" (those with a level of 5 or 6) or "not spam" (those with a level below 5). We hand-sorted each message into a "spam" or "not spam" category, then compared our manual ratings with the ones Exchange 2007 provided. We also ran the same stream, at the same time, through three other commercial antispam products to see how Microsoft's antispam technology would compare.
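The three-way bucketing by spam confidence level (SCL) can be expressed as a short Python function. This is a sketch of the thresholds used in the test, not an Exchange API; the function name is ours:

```python
def classify(scl):
    """Bucket a message by its spam confidence level (SCL), using the
    thresholds from the test: 7 and above is definite spam, 5-6 is
    suspected spam, and anything below 5 is treated as clean."""
    if scl >= 7:
        return "definitely spam"
    if scl >= 5:
        return "suspected spam"
    return "not spam"
```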

We reported results as ranges. Because each product has a "suspected spam" category, one end of the range includes "suspected spam" as spam for the purposes of calculating false-positive and false-negative results. The other end of the range assumes that suspected spam is not counted as spam.
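The range calculation for false positives can be sketched as follows. This is our own illustration of the scoring method described above, with hypothetical names; each message is a (verdict, is_spam) pair, where is_spam is the hand-sorted ground truth:

```python
def fp_rate_range(results):
    """Compute the false-positive rate as a (low, high) range over
    legitimate mail. The low end treats "suspected spam" verdicts as
    clean; the high end counts them as spam, matching the two ends of
    the reported range."""
    legit = [verdict for verdict, is_spam in results if not is_spam]
    if not legit:
        return (0.0, 0.0)
    strict = sum(1 for v in legit if v == "definitely spam")
    loose = sum(1 for v in legit if v in ("definitely spam", "suspected spam"))
    return (strict / len(legit), loose / len(legit))
```

A false-negative range would be computed the same way over the hand-sorted spam, counting "suspected spam" as caught at one end of the range and as missed at the other.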



Copyright © 2007 IDG Communications, Inc.
