We noticed some erratic Address Resolution Protocol (ARP) behavior in tests involving 2,480 users when Blue Coat forwarded either Web or SSL traffic (see "What about SSL?"). Although Blue Coat replicated our issue in-house and produced a software fix (now available to customers), we still observed sluggish behavior in the 2,480-user tests after applying the update.
Silver Peak's NX appliances were third-fastest, tripling transaction and data rates and reducing response time by around 2.5 times when handling 248 users. With 2,480 users, performance dipped slightly (by about the same margin as Blue Coat's appliances), though traffic still moved substantially faster than in our no-device baseline test. Silver Peak says these results are roughly in line with its in-house testing.
Cisco's WAE appliances better than doubled performance with 248 users, and more than tripled performance with 2,480 users. Cisco's WAE devices don't proxy Web traffic as they do with CIFS, so the performance improvements here are largely attributable to TCP optimizations.
QoS testing revealed some of the most interesting -- and in some ways most problematic -- results of all our performance testing. While three of four products did a virtually perfect job of prioritizing traffic, the path there was anything but straightforward, involving much tuning -- and in some cases external devices to protect key flows during congestion.
To measure QoS capabilities, we offered a small amount of high-priority traffic -- in this case, a single VoIP call, which is sensitive to delay and jitter -- while walloping the WAN with huge amounts of background traffic. We used User Datagram Protocol (UDP) for both high- and low-priority flows; VoIP uses UDP by default, and TCP was not suitable as background traffic, because of its built-in congestion control.
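A minimal sketch of how two such traffic classes can be generated, assuming Linux sockets and the common convention of marking VoIP with DSCP EF (this is our illustration, not the actual test-tool configuration):

```python
import socket

# DSCP EF (Expedited Forwarding, decimal 46) occupies the top six bits of
# the IP TOS byte, so the full TOS value is 46 << 2 = 184.
DSCP_EF = 46
TOS_EF = DSCP_EF << 2

def make_marked_socket(tos):
    """Return a UDP socket whose outgoing packets carry the given TOS byte."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)
    return sock

# High-priority flow: one G.711-style call sends a 160-byte payload every 20 ms.
voip = make_marked_socket(TOS_EF)
# Low-priority flow: best-effort UDP filler blasted as fast as the link allows.
background = make_marked_socket(0)
```

A QoS device under test should protect the EF-marked flow's delay and jitter while the unmarked background flow absorbs the congestion.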
Blue Coat's SG appliances couldn't participate in this test, because they don't optimize UDP traffic. The other vendors turned in excellent results but used different paths to get there.
Cisco recommends using WAN routers (in this case, the Cisco 3845 and ISR 2800 Series devices it supplied) rather than application accelerators for shaping traffic. Cisco's WAAS-acceleration devices and routers work together using network-based application recognition (NBAR). We verified in testing that flows the acceleration devices classified using NBAR will be prioritized by the routers during congestion. The routers turned in great results; the ClearSight analyzer measured R-value, an audio-quality metric, as 92.03 out of a possible 93, and they correctly re-marked DSCPs.
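As an illustration of the approach (not Cisco's actual test configuration), an IOS policy of this kind typically pairs an NBAR class map with a low-latency priority queue, along these lines:

```
! Classify voice with NBAR, then guarantee it bandwidth in a
! low-latency queue during congestion. Names and the 256-kbps
! figure here are placeholders.
class-map match-any VOICE
 match protocol rtp audio
!
policy-map WAN-EDGE
 class VOICE
  priority 256
  set dscp ef
 class class-default
  fair-queue
!
interface Serial0/0
 service-policy output WAN-EDGE
```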
Note that ultimately Cisco's entry performed prioritization on its routers, not on the application-acceleration devices, though the latter did play a role in classifying traffic. This differs from the Riverbed and Silver Peak devices, which performed prioritization on board. Many network managers already run QoS on WAN routers, and for them handing off this function to a router isn't a big deal. For users just getting started with QoS, it may be simpler to set it up on application-acceleration devices, and leave routers alone, at least for now.
The Riverbed and Silver Peak appliances also protected voice traffic, with R-value scores of 91.80 and 90.07, respectively, and both correctly re-marked DSCPs.
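For context on those scores, R-value comes from the ITU-T G.107 E-model. A simplified version, assuming a G.711 call and the model's default parameters (our own sketch, not ClearSight's implementation), looks like this:

```python
def r_value(one_way_delay_ms, loss_fraction):
    """Simplified ITU-T G.107 E-model score for a G.711 call.

    Starts from the default maximum of roughly 93.2 and subtracts delay
    and packet-loss impairments; scores in the low 90s, like those we
    measured, indicate essentially toll-quality audio.
    """
    d = one_way_delay_ms
    # Delay impairment Id: small below ~177 ms, growing steeply beyond it.
    i_d = 0.024 * d + 0.11 * (d - 177.3) * (1 if d > 177.3 else 0)
    # Effective equipment impairment Ie-eff for G.711 (Ie = 0, Bpl = 4.3
    # for random packet loss).
    ppl = loss_fraction * 100.0
    i_e_eff = 95.0 * ppl / (ppl + 4.3)
    return 93.2 - i_d - i_e_eff

def mos(r):
    """Map an R-value onto the familiar 1-to-4.5 mean-opinion-score scale."""
    if r < 0:
        return 1.0
    if r > 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)
```

Even modest loss hurts badly: one percent random loss alone costs a G.711 call roughly 18 R-value points, which is why prioritization during congestion matters so much.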
Of the two, the Silver Peak NX appliances were easier to configure. They correctly classified VoIP streams and shaped traffic according to the parameters we defined. Riverbed's Steelhead appliances don't classify Real-time Transport Protocol (RTP) streams automatically, and a bug in the software version we tested wouldn't let us manually define port ranges. Instead, we used other criteria, such as source address, to classify VoIP streams.
Our final performance test determined the maximum number of TCP connections each system could optimize. This is an important metric for enterprises with many remote offices and hub-and-spoke network designs, where connection counts for data-center devices can run into the tens of thousands. All the devices we tested get into that tens-of-thousands range, but there was more than a fourfold difference between the highest and lowest capacities.
To measure connection concurrency, we configured Spirent's Avalanche to issue a Web request once a minute, letting us establish many connections and keep them alive. We kept adding connections until transactions began to fail or the devices stopped optimizing new connections.
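A toy version of that harness, with a local HTTP/1.1 server standing in for the test targets (scaled far down from tens of thousands of connections, and our own sketch rather than Avalanche's internals), might look like this:

```python
import http.client
import http.server
import socketserver
import threading

class KeepAliveHandler(http.server.BaseHTTPRequestHandler):
    """Answer every GET with a tiny body and keep the connection open."""
    protocol_version = "HTTP/1.1"  # HTTP/1.1 defaults to persistent connections

    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):  # silence per-request logging
        pass

def open_connections(host, port, count):
    """Open `count` persistent HTTP connections, priming each with a request."""
    conns = []
    for _ in range(count):
        conn = http.client.HTTPConnection(host, port)
        conn.request("GET", "/")
        conn.getresponse().read()  # draining the body keeps the socket reusable
        conns.append(conn)
    return conns

def poll_once(conns):
    """Issue one request per connection; return how many succeeded.

    The real test repeated this once a minute, stepping up the connection
    count until transactions failed or devices stopped optimizing new flows.
    """
    ok = 0
    for conn in conns:
        try:
            conn.request("GET", "/")
            if conn.getresponse().read() == b"ok":
                ok += 1
        except OSError:
            pass
    return ok
```

Raising the connection count between polling rounds and watching for the first failures reproduces the shape of the concurrency test, if not its scale.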
Cisco's new WAE-7371 came out on top in this test, accelerating more than 50,000 TCP connections (see graphic, "Maximum accelerated TCP connections"). Silver Peak's NX appliances were next, optimizing 43,306 concurrent connections. This is well short of the NX 7500's rated capacity of 128,000 optimized connections, a level that Silver Peak achieved in internal testing. We were unable to reproduce that result in our lab, and, despite extensive troubleshooting, neither we nor Silver Peak's engineers were able to explain the difference. The Blue Coat SG appliances were next, handling about 19,500 optimized connections.
Riverbed's Steelhead 5520 optimized more than 12,200 connections, but that result reflects the limits of the two Steelhead 3520 units through which we set up connections. Riverbed says the higher-end 5520 model can optimize 15,000 connections. We were unable to confirm that result, but our tests did show that each 3520 slightly outperformed its rated limit of 6,000 connections to get to the 12,200 total mentioned previously.
Features and functions
Most testing focused on performance, but we also assessed devices for functionality, manageability and usability. Each of these areas turned up at least as many differences as the performance tests did.
All acceleration devices reduce the number of bits sent across the WAN, but they do this in very different ways. The Blue Coat and Cisco devices act as proxies, terminating connections between clients and servers and setting up new sessions on their behalf. Riverbed's devices can proxy traffic, though the vendor did not enable that feature for this test. Silver Peak's NX appliances don't proxy traffic.
Transparency is another architectural difference. Blue Coat engineers configured SSL tunnels between appliances, Silver Peak engineers set up generic routing encapsulation (GRE) tunnels, and Riverbed's devices can use SSL tunneling. Tunneling may pose a problem if other inline devices, such as firewalls or bandwidth managers, need to inspect traffic.
Cisco claims this is a major differentiator for its WAAS offering, which doesn't hide traffic from other devices and automatically learns about new traffic types from other Cisco devices using NBAR. A powerful classification engine, NBAR in our tests classified even applications using ephemeral port numbers, such as those used for H.323 and Session Initiation Protocol (SIP). Silver Peak's appliances also classified such traffic. Then again, transparency isn't an issue for users who don't need application visibility among acceleration devices.
Application support also varies, but it's less important a differentiator than performance, manageability and usability. It's tempting -- but also a bit misleading -- to compare the number of predefined application types each vendor claims to optimize. First, the applications involved are important only if they're running in your enterprise. Second, acceleration devices still may boost performance even if a given application isn't predefined, thanks to compression and TCP optimization. Finally, all devices we tested allow manual definition of new application classes based on addresses and port numbers (though these may not be subject to the same speedups as some predefined types).
To look after all the devices in our test bed's enterprise, we asked each vendor to supply a central management system.
We assessed centralized management in terms of functions and reporting features. On the functions side, all vendors but Blue Coat offer a centralized method of pushing out configuration changes or software upgrades to all appliances. Blue Coat can push changes and upgrades, but only by manually defining a job for each one. All vendors allow appliances to be defined into groups (though Blue Coat's Director appliance requires a manually defined job to perform an action on a given group).
All devices use a dashboard display to show application distribution and volume during predefined periods. These displays can be enormously helpful in managing application traffic even before acceleration is enabled. It's pretty common to find during installation that enterprises are running applications they didn't know about.
Once acceleration is enabled, these devices use pie charts and bar graphs to report on compression, percentage of optimized vs. pass-through traffic and data reduction.
The Cisco, Riverbed and Silver Peak appliances aggregate displays across multiple devices, a useful feature for capacity planning. There were differences in terms of the application data and time periods supported; for example, Silver Peak's display was useful in troubleshooting because -- uniquely among the products tested -- it reported on packet loss and did so in per-minute intervals.
There are significant usability differences among the accelerators, but we'll be the first to admit this is a highly subjective area. If we had to rank the systems in terms of ease of use, the lineup would be Riverbed, Silver Peak, Cisco and Blue Coat.
Riverbed's Steelhead appliances came closest to the goal of "just working." Setup took less than half a day. Once we were up and running, we found the user interface to be simple and well designed. It was easy to make changes and view reports, even without delving into the company's well-written documentation.
Silver Peak's NX appliances also feature a simple user interface with excellent reporting on current and historical statistics. The central management display wasn't as polished or fully featured as Riverbed's, although unlike Riverbed's, it includes a topology map of all appliances.
Cisco's display bristles with features and commands -- perhaps too many. Cisco's redesigned dashboard offers whizzy graphics, useful pie charts on CIFS application performance and (like Riverbed and Silver Peak devices) real-time connection monitoring and per-device reporting on connection statistics. Getting to specific commands or opening logs often took more steps than with other devices, however; further, not all the commands available from the device command line were available from the GUI, and vice versa.
Blue Coat's management software, while powerful, was the most difficult to use. Individual appliances used a Web-based Java application that was sluggish; further, it worked with Internet Explorer but not Firefox. And some predefined tasks in other vendors' devices, such as updating configuration or images, required manual definition in the Blue Coat devices, or touching each appliance individually.
Newman is president of Network Test, an independent test lab in Westlake Village, Calif. He can be reached at firstname.lastname@example.org.
Newman is also a member of the Network World Lab Alliance, a cooperative of the premier reviewers in the network industry, each bringing to bear years of practical experience on every review. For more Lab Alliance information, including what it takes to become a member, go to www.networkworld.com/alliance.
Learn more about this topic
Buyer's Guide: Application-acceleration management
Podcast: How to test WAN-optimization gear
Vista over the WAN: good but not great
Dual WAN router testing: Good and bad about load balancing (06/18/07)