Testers’ best tips

By Paul Desmond
Feature, Feb 23, 2004

Network World Lab Alliance members share their secrets for conducting meaningful product tests.

Finding the products that will best fit in your network is a multifaceted process that involves scanning vendor Web sites, devouring white papers, attending trade shows and reading trade publications.

But once you come up with a short list, you might find that conducting a product test is the only way to make your final decision.

Product testing is something of an art, as we confirmed after polling the 14 members of the Network World Lab Alliance for tips on their trade. The lab alliance is the group of industry experts that conducts the product reviews you read each week in Network World. For this special issue, lab alliance members offer advice for your testing efforts that includes how to develop sound methodologies, acquire the gear you’ll need and run tests that yield accurate results.

A valid test must meet three requirements, says David Newman, president of Network Test in Westlake Village, Calif.: It must be repeatable, stressful on the equipment or software under test, and meaningful. The last criterion is the most difficult to achieve, he says, but the secret is, “Test like you deploy and deploy like you test.”

While that might seem like a tall order, the good news is that enterprise tests don’t need to produce piles of data to be useful, says Joel Snyder, principal with Opus One in Tucson, Ariz. For example, a VPN test only needs to focus on two sets of numbers: performance using your typical mix of packet sizes and traffic types, and performance under a “worst-case” scenario, with peak traffic loads.
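To make Snyder’s two-number approach concrete, here is a minimal sketch of what such a harness could look like, assuming iperf3 is installed on both ends of the tunnel. The peer address, packet sizes and traffic weights are illustrative placeholders, not recommendations.

```python
# Minimal sketch: report just two numbers for a VPN test, per Snyder's advice:
# throughput for a "typical" packet-size mix and for a small-packet worst case.
# Assumes iperf3 is installed and an iperf3 server listens at VPN_PEER
# (a hypothetical address behind the tunnel); adjust sizes and weights to
# match your own traffic profile.
import json
import subprocess

VPN_PEER = "10.8.0.1"            # hypothetical server on the far side of the tunnel
TYPICAL_MIX = [                   # (UDP payload bytes, share of traffic) - illustrative
    (64, 0.55), (512, 0.25), (1400, 0.20),
]

def run_iperf(size_bytes, seconds=30):
    """Run one UDP iperf3 test at unlimited offered load; return bits/sec."""
    out = subprocess.run(
        ["iperf3", "-c", VPN_PEER, "-u", "-l", str(size_bytes),
         "-b", "0",                # 0 = no bandwidth cap
         "-t", str(seconds), "-J"],
        capture_output=True, text=True, check=True,
    )
    report = json.loads(out.stdout)
    return report["end"]["sum"]["bits_per_second"]

# 1) Typical mix: weighted average of per-size runs (a rough stand-in for a blended mix).
typical = sum(weight * run_iperf(size) for size, weight in TYPICAL_MIX)

# 2) Worst case: smallest packets, longer run, unlimited offered load.
worst_case = run_iperf(64, seconds=60)

print(f"typical-mix throughput : {typical / 1e6:.1f} Mbit/s")
print(f"worst-case throughput  : {worst_case / 1e6:.1f} Mbit/s")
```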

A method to the madness

Running a test that will get you meaningful results starts with creating a sound test methodology. Before devising a methodology, talk to peers within and outside your organization about how the product will be used and what features are most important, lab alliance members say.

Product vendors are another good source of methodology information. “More than once, feedback from vendor engineers has stopped us from doing something really stupid,” Newman says, noting that any vendor’s attempt to spin a test in its favor is typically transparent. He also points to the IETF Benchmarking Methodology working group as a good source of methodologies, and to the Cooperative Association for Internet Data Analysis for measurement, performance-monitoring, workload and other tools.


Related stories: Equipment essentials | Who’s who in the Lab Alliance | Testing taboos


Ed Mier, founder of the Miercom testing firm in Princeton Junction, N.J., says vendors “provide excellent insight, intelligence and feedback about what to look for in the particular product class,” often by pointing out a competitor’s weaknesses.

ISPs and carriers also can prove to be valuable resources when developing methodologies, says Thomas Henderson, managing director of ExtremeLabs in Indianapolis.

Setting up the lab

With methodology in hand, the next step is to create a lab environment that mimics what the product will experience in your production network. The goal is to create a high-density, high-capacity environment, Newman says. “You want the test to be bigger, stronger and faster than whatever it is you’re testing,” he says.

In some cases, you might even want to use the production network when running tests, says Jeffrey Fritz, director of enterprise network services at the University of California, San Francisco. “We set up devices in the lab and work with them for a while, until we know they’re fairly safe, then connect the lab network to the production network,” says Fritz, who tests high-end switches for Network World.

Christine Perey, president of Perey Research & Consulting in Placerville, Calif., follows much the same tack when testing collaboration tools. In her lab, she has gear that might be found in a branch office. For the enterprise view, she leverages connections with large companies or academic institutions.

Henderson uses an ISP’s network operations center for certain tests, a strategy he says enterprise users might be able to employ as well. “In some cases, they’re very interested in test outcomes,” Henderson says, citing a wireless equipment test he conducted last year. Universities also might be willing to play ball on tests with user organizations, he notes. “Some universities have a diverse infrastructure that mimics those in industry, meaning they didn’t buy all the same equipment on the same day,” he says.

If you want to conduct the test on your own, you likely will find it tougher to acquire necessary equipment than do lab alliance members. Vendors of test equipment such as traffic generators will offer up their gear free of charge for a mention in a published review, but you won’t likely have that luxury (unless, of course, you want to partner with Network World and agree to disclose your test results, in which case Lab Alliance Director Christine Burns would be happy to talk to you). On the other hand, a number of free open source tools are available for tasks such as testing proxy caches and capturing network traffic (see “Equipment essentials”).
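As one example of such a tool, the open-source Scapy library can grab a sample of live traffic for later analysis or replay. The sketch below is illustrative only; the interface name and capture filter are assumptions to adjust for your own network.

```python
# Minimal sketch: capture a sample of production traffic for later analysis
# or replay, using the open-source Scapy library (pip install scapy).
# Run with sufficient privileges to capture; the interface and BPF filter
# below are assumptions.
from scapy.all import sniff, wrpcap

packets = sniff(
    iface="eth0",              # assumed capture interface
    filter="tcp port 80",      # BPF filter; plain HTTP used here as an example
    count=10000,               # stop after 10,000 packets
)

# Save the sample so the same traffic can be examined or replayed later.
wrpcap("sample.pcap", packets)
print(f"captured {len(packets)} packets")
```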

To acquire other equipment for your test lab, such as servers, switches and routers, several lab alliance members recommend eBay and networkhardware.com. “We buy almost everything we can on eBay,” Snyder says. The key is to start early. “If you’re patient, you can always get a great deal.” He cites the four Extreme Networks Summit 48 switches he bought for $500 to $600 apiece, much less than the “Buy it now” price of $900 to $1,200.

Quality results

When it comes to running the actual tests, advice from lab alliance members is as varied as the types of products they test. But when asked how many times they generally run a test to ensure accurate results, their answers were surprisingly consistent: Three times is the charm.

Mier likes to test three times or use three different testers. Henderson runs at least three iterations of performance tests to ensure results are consistent, and he will scrap a test entirely if the results don’t fall within a 5% margin of error.
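Applied literally, Henderson’s rule of thumb amounts to a simple arithmetic check. A minimal sketch follows; the 5% tolerance and the sample numbers simply mirror the figures quoted here.

```python
# Minimal sketch of the "three runs, within 5%" rule of thumb quoted above:
# flag a set of runs whose spread exceeds 5% of their mean.
def runs_are_consistent(results, tolerance=0.05):
    """Return True if every result is within `tolerance` of the mean."""
    mean = sum(results) / len(results)
    return all(abs(r - mean) <= tolerance * mean for r in results)

# Example: three throughput measurements in Mbit/s (made-up values).
runs = [942.0, 955.5, 948.1]
if runs_are_consistent(runs):
    print("results repeatable - keep the test")
else:
    print("spread exceeds 5% - rerun or scrap the test")
```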

“We do everything we can at least three times in a row and hopefully repeat all tests separated by a day and in a different order. If the tests agree, then at least you know you’ve got repeatability,” Snyder says. He points out that this does not necessarily mean you’ve learned anything about the product. Accomplishing that gets back to paying attention to the methodology to ensure you’re considering how the product will be used in the production network. “Firewalls, or any security product, are excellent examples. The test gear is totally clean, and the test is repeatable. The real world doesn’t behave that way.”

When Snyder tested anti-spam products several months ago, he used an actual feed of e-mail traffic rather than the “canned” spam several vendors wanted, which consisted of older, well-known spam. “The benefits of using the feed we did outweighed the lack of repeatability,” he says. “That’s the only time I can remember saying a non-repeatable test was acceptable.”

While lab alliance members agreed on how many times to run a test, their opinions on how long a test should run varied, again reflecting their specialties. Router performance can be measured in as little as 30 to 60 seconds, Newman says, although tests of services, such as for ISP backbones, run “in the wild” for at least 30 days. Perey runs tests on multimedia equipment for at least eight hours. Thomas Powell, founder of San Diego Web development firm PINT, says his tests of Web site management and security products usually run a day or two. However, he says problems typically crop up early.

Snyder begged off the question, but he did have a tip on getting through late-night sessions: “We have adequate stocks of Jack Daniels, a pair of 950-watt amplifiers, four studio monitor speakers and a five-disk CD player.”

Desmond is president of PDEdit (www.pdedit.com), an IT publishing firm in Framingham, Mass.