Review: Voice over Wireless LAN

Aruba is the top dog if you want to add voice traffic to an enterprise wireless LAN

VoIP should be an easy fit for wireless LANs, but mixing the two technologies today is difficult. Despite VoIP's low-bandwidth profile, even a small amount of data traffic on the same network can lead to seriously degraded audio quality and dropped calls, even with QoS features enabled.


That's the major conclusion of our first-ever assessment of VoIP capability in WLAN systems. Over the course of three months we tested WLAN switches and access points from Aruba Wireless Networks, Chantry Networks (now Siemens), Cisco and Colubris Networks in terms of audio quality, QoS enforcement, roaming capabilities, and system features. Other vendors, including Airespace, Meru Networks and Trapeze Networks, declined to participate (see "How we did it").

Among our major findings:

•  With QoS enforcement enabled, the products delivered near-toll-quality audio, but only when voice was the sole traffic on the network - an unlikely scenario as companies move toward converged voice-data networks.

•  When voice traffic had to contend for bandwidth (even with a little data traffic), dropped calls were common and audio quality on the remaining calls was poor in many cases - and this was with QoS enforcement enabled.

•  With data traffic present, roaming from one access point to another took anywhere from 0.5 to 10 seconds - in cases where roaming succeeded at all. These long delays and dropped calls made roaming practically impossible with some vendors' gear.

While some products struggled mightily in our tests, Aruba's A2400 and A800 switches and A61 access points were consistently strong performers. The Aruba products posted generally excellent numbers, regardless of how much voice or data traffic was thrown at them. Aruba's gear just worked, earning it the Clear Choice Award.

Two issues confounded the other vendors. First, when voice and data traffic share a network, vendors need to pay attention to metrics such as delay and jitter, not just forwarding rates.

Many vendors are only just beginning to tune their products for voice/data convergence, even though some have touted that capability for 18 months or more. However, it's still relatively early days for VoIP over WLANs. Test tools that accurately measure these metrics on WLANs (such as the VeriWave instruments we used) are only just beginning to appear, and this test is among the first to measure audio quality, delay and jitter in a methodical way.

Second, the emerging 802.11e standard for QoS on WLANs might bring some relief. The 802.11e specification wasn't yet ratified when we began this project, so by definition all QoS methods were nonstandard. Companies might want to wait until the new 802.11e specification and products based on it are more mature and fully tested.

Voice quality with and without QoS

Measuring voice quality over wireless

Our tests sought to answer a simple question: How does a VoIP over WLAN system sound?

To find out, we worked with VeriWave, a start-up that makes WLAN test and measurement equipment. VeriWave developed a new application, the VoIP over WLAN Analysis Test Suite, especially for our test.

In addition to collecting delay and jitter statistics, VeriWave's test suite and TestPoint hardware let us measure R-value, an ITU specification (G.107) for determining call quality. R-value is an objective measurement, computed directly from packet loss, jitter and delay, but it correlates strongly with the subjective mean opinion score (MOS) method in ITU standard P.800 (see "R-value ratings").

R-value ratings

R-value, an ITU specification for determining call quality, is computed from measurements of packet loss, jitter and delay.

R-value          MOS               User satisfaction
90 or higher     4.34 or higher    All users very satisfied
80 or higher     4.03 or higher    All users satisfied
70 or higher     3.60 or higher    Some users dissatisfied
60 or higher     3.10 or higher    Many users dissatisfied
50 or higher     2.58 or higher    Nearly all users dissatisfied
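
For readers who want to translate R-values into the MOS scale, the conversion below follows the commonly published ITU-T G.107 E-model formula. It is a minimal sketch for illustration only, not the computation VeriWave's test suite performs.

# Minimal sketch: converting an E-model R-value to an estimated MOS and to the
# user-satisfaction bands in the table above. Illustrative only; not VeriWave's code.

def r_to_mos(r: float) -> float:
    """Standard ITU-T G.107 conversion from R-value (0-100) to MOS (1.0-4.5)."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6

def satisfaction(r: float) -> str:
    """Map an R-value to the satisfaction bands shown in the table."""
    if r >= 90:
        return "All users very satisfied"
    if r >= 80:
        return "All users satisfied"
    if r >= 70:
        return "Some users dissatisfied"
    if r >= 60:
        return "Many users dissatisfied"
    if r >= 50:
        return "Nearly all users dissatisfied"
    return "Below the usable range"

for r in (90, 78, 70, 50):
    print(f"R-value {r}: MOS {r_to_mos(r):.2f}, {satisfaction(r)}")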

We measured voice call quality with up to 14 handsets and an H.323 call server from SpectraLink, a maker of 802.11 handsets. We measured audio quality with up to seven concurrent calls, and in some test cases configured the VeriWave TestPoint boxes to offer background data. For each system, we checked call quality first with QoS disabled, then with it enabled.

Audio quality without QoS

With QoS disabled, we started by routing all calls through one access point. Because all the vendors recommend enabling QoS for voice traffic, this baseline test gave us a "before" picture to demonstrate the need for voice traffic prioritization.

With QoS turned off, all four systems tested did fine with only a single call active, with R-values hovering around 78. That is about as good as it gets with VoIP over wireless. The threshold for near-toll-quality voice is generally considered to be around 75, meaning the systems delivered good audio quality for a single call.

Performance degraded across the board when we placed six or seven calls through a single access point and switch, especially when data traffic was active. Even without background data, we could not test the Colubris system with seven calls active and QoS disabled - all the calls dropped.

When we configured the TestPoints to offer background data (a stream of User Datagram Protocol [UDP] packets at 1M bit/sec), the results were positively awful without QoS. With only six concurrent calls, R-values for all systems (except Aruba) were generally at or below the point where voice signals were unintelligible or calls were dropped. Sound quality through Aruba's system remained high, roughly the same as with no data, even without QoS.
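
The VeriWave TestPoint hardware generated that background load in our tests. For readers who want to approximate a comparable 1M bit/sec UDP stream with ordinary tools, a rough sketch follows; the destination address, port and packet size are hypothetical, and this is not the tool we used.

# Rough sketch of a 1M bit/sec UDP background-traffic generator (illustrative;
# our tests used VeriWave TestPoint hardware, not this script).
import socket
import time

DEST = ("192.168.1.50", 5001)           # hypothetical receiver on the wireless segment
PAYLOAD = b"x" * 1250                   # 1,250-byte payload = 10,000 bits per packet
RATE_BPS = 1_000_000                    # target offered load: 1M bit/sec
INTERVAL = len(PAYLOAD) * 8 / RATE_BPS  # 0.01 sec between packets

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
next_send = time.monotonic()
while True:
    sock.sendto(PAYLOAD, DEST)
    next_send += INTERVAL
    time.sleep(max(0.0, next_send - time.monotonic()))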

The Chantry and Colubris systems could not perform the background data test with seven calls (QoS disabled). All calls failed as soon as the VeriWave box began offering background data.

All vendors recommend enabling QoS for voice traffic, even when no data traffic is present. Our results show why: QoS is a must when handling VoIP traffic over a WLAN.

Adding QoS to the mix

We reran the same five test configurations as in the non-QoS cases: We measured one call with no background data, and six and seven calls with and without the 1M bit/sec background UDP traffic.

We expected much-improved results once we enabled QoS, but only Aruba's system put up consistently excellent results in all the tests with QoS enforcement. Even in the most stressful case (seven calls plus background data), the Aruba system delivered near-toll-quality. With QoS enabled on the Aruba equipment, there was little difference between the least and most stressful test scenarios.

Other vendors' QoS mechanisms did little to protect call quality when background data was present. On the plus side, QoS mechanisms generally did an excellent job when only voice traffic was present.

Audio quality improved for all systems in cases where we used only voice traffic. In tests with six and seven calls (no background data), all systems delivered near-toll-quality results with QoS enabled.

That changed when we added the background data. With six calls and data active, R-values fell below 70 for the Colubris CN1250, meaning that "some users [would be] dissatisfied" according to the ITU R-value specification. The R-value was about 60 for the Chantry switch ("many users dissatisfied").

Beyond the objective R-value scores, we did some subjective spot-checking of call quality when data was present. Sure enough, we heard echoes, dropouts and generally poor voice quality whenever the TestPoints offered a datastream.

Things got worse for Chantry, Cisco and Colubris when we tried seven calls plus data. Chantry's BeaconMaster couldn't handle this test case; all seven calls failed when we added data. Cisco's WLSM posted an R-value of about 50, the bare minimum level at which calls are intelligible, and three of the seven calls dropped during the test. The Colubris CN1250 completed the test, but didn't forward enough voice frames for the test equipment to compute an R-value score. R-value scores for this test were computed only for calls that remained active during the 30-second run, so Cisco's score reflects four calls rather than seven.

SpectraLink generally recommends a maximum of six concurrent calls per access point, not the seven we used in our tests. Thus, vendors might complain that our seven-call scenario was an overload test case. That is valid, but only up to a point. First, the Chantry and Colubris systems had trouble even with the recommended maximum of six calls with data. Second, Aruba's system could handle the seven calls with data scenario. Third, our most stressful test came nowhere near overloading the wireless medium. We offered 3M bit/sec of traffic or less in all tests, including voice and data. That's not even near the amount needed to saturate the wireless channel (see story on wireless architecture remaining diverse).
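
For the curious, here is the back-of-envelope arithmetic behind that figure. The codec assumption (G.711 with 20-millisec packets) is ours; per-call bandwidth varies with codec and packetization, and this ignores 802.11 MAC overhead.

# Back-of-envelope offered load for the most stressful test case (seven calls plus
# background data). G.711 at 20-ms packetization is an assumption, not a test detail.
payload_bps = 64_000                            # G.711 voice payload, one direction
header_bps = 40 * 8 * 50                        # RTP/UDP/IP headers: 40 bytes x 50 packets/sec
per_call_bps = 2 * (payload_bps + header_bps)   # both directions of one call
voice_bps = 7 * per_call_bps                    # seven concurrent calls
total_bps = voice_bps + 1_000_000               # plus the 1M bit/sec UDP background stream
print(f"Approximate offered load: {total_bps / 1e6:.1f}M bit/sec")  # roughly 2.1M bit/sec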

It's possible to run each access point with seven calls and data, provided the system is designed for it. But doing so requires careful attention to timing (see story).

Delay and jitter measurements

Delay and jitter are critical metrics for any application, but they are especially important for voice or video. When delay or jitter rises into the 50-to-70-millisec range, voice quality starts to degrade (see graphic). With six calls and background data, average delay measured below 50 millisec for all vendors, but maximum delay and jitter shot up to much higher levels, topping out at more than 250 millisec in tests of Cisco (six calls) and Colubris gear (seven calls).

Delay and jitter with QoS

An analysis of the logs produced by the TestPoints found several reasons for the voice-quality degradation. Anytime jitter exceeded 60 millisec, audio quality began to suffer. As maximum delay and jitter rose, R-values fell - and that's just for the calls that survived the 30-second test. When delay and jitter rose too high, the calls simply dropped.
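
The TestPoints report jitter directly, but for reference, the interarrival-jitter estimator defined in RFC 3550 is the usual way jitter is computed for RTP voice streams. The sketch below illustrates the calculation on made-up transit times.

# Minimal sketch of the RFC 3550 interarrival-jitter estimator (illustrative only;
# the VeriWave gear reports its own measurements).
def update_jitter(jitter: float, transit_prev: float, transit_curr: float) -> float:
    """Smooth the absolute change in one-way transit time (values in seconds)."""
    d = abs(transit_curr - transit_prev)
    return jitter + (d - jitter) / 16.0

# Hypothetical transit times (arrival minus send timestamp) for successive packets,
# with a spike in the middle.
transits = [0.020, 0.024, 0.021, 0.080, 0.022]
jitter = 0.0
for prev, curr in zip(transits, transits[1:]):
    jitter = update_jitter(jitter, prev, curr)
print(f"Estimated jitter: {jitter * 1000:.1f} millisec")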

Doubling the access points

So would throwing more access points at the problem help? We re-ran all our tests on two access points, with half the phones associated to each access point.

Voice quality through two access points

With two access points, the R-values were generally much higher. That wasn't surprising, considering each access point did half the work it did in the first set of tests. Average delay also increased, which we expected given the additional component in the traffic path.

While this suggests performance can improve with more access points, it also raises several concerns. First, cost goes up, even with thin access points. Second, wireless spectrum is limited, and depending on placement, too many access points will interfere with one another. Third, performance still was not perfect with two access points; we had some dropped calls in the presence of background data.

With a mobile workforce you can't predict how many users will try to associate with a given access point at a given time. Every access point has a saturation point, and our results suggest that point is relatively low when voice is added.

Roam if you dare

Mobility for voice is a major driver for a WLAN deployment. Just as cellular phone users move from one coverage area to another, so too will WLAN handset users.

We measured the time needed for a call to migrate from one access point to another, with both access points attached to the same switch. We also tracked R-value, delay and jitter.
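
A simple way to derive roam time from a packet trace is to take the gap between the last voice frame relayed by the old access point and the first relayed by the new one. The sketch below illustrates the idea; the VeriWave suite reports roam delay directly, and the trace data here is hypothetical.

# Hypothetical sketch: roam time as the gap between the last voice frame seen via
# the old access point and the first seen via the new one.
def roam_time(frames):
    """frames: list of (timestamp_sec, bssid) tuples for one handset's voice frames."""
    old_bssid = frames[0][1]
    last_on_old = None
    for ts, bssid in frames:
        if bssid == old_bssid:
            last_on_old = ts
        elif last_on_old is not None:
            return ts - last_on_old           # first frame relayed by the new access point
    return None                               # the handset never roamed (or the call dropped)

trace = [(0.00, "ap1"), (0.02, "ap1"), (0.04, "ap1"), (0.84, "ap2"), (0.86, "ap2")]
print(f"Roam time: {roam_time(trace):.2f} sec")   # 0.80 sec in this example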

To force the handsets to roam, we powered off the first access point. This drew objections from two vendors - Chantry and Colubris. Chantry says its roaming capabilities are designed for the case where a user physically moves from one location to another, not when there's a power loss to a given access point.

While it would have been preferable to test roaming by physically moving handsets between coverage areas, we could not put enough space between access points in our 1,200-square-foot lab to make that practical. We considered using the VeriWave TestPoint as a noise generator, but rejected that option because it was no more representative of physical mobility than the power-off test. Also, loss of power is a real (if uncommon) occurrence; if an access point goes away for whatever reason, a WLAN system needs to migrate associated users seamlessly to a nearby alternative.

The Colubris CN1250 could not be tested by turning it off. The vendor handles mobility through Mobile IP, which requires a home agent - the station where a client first learns its IP credentials - to remain active. If the access point that hosts the home agent goes down, so does the ability to roam. Cisco also supports Mobile IP, but did not use that technology in our tests.

Instead of pulling the plug on the CN1250, we tested roaming by disabling the radio on the first access point. This had the same effect of forcing the clients to roam.

Colubris also requires a third access point to function as a foreign agent, which relays traffic for roaming clients back to the home agent. For this purpose, we used a third Colubris CN1250 with its antennas removed; there is no requirement that the foreign agent have wireless connectivity.
