
Results of Cisco vs. Aruba and Meru in 802.11n access point performance test

By Brad Reese on Sat, 03/08/08 - 2:43am.

Aruba Networks Test

Aruba Networks released the results of a test of its products vs. those from Cisco and Meru, and this is a summary of those results:

Only Aruba infrastructure consistently yielded greater than 100 Mbps throughput, roughly 5 times the performance of earlier generation 802.11a/b/g compatible wireless systems, in single and multiple client tests using both PC and Apple Macintosh laptop clients.
Aruba’s AP-125 802.11n Access Point delivered the highest performance (>160 Mbps for a single client and >150 Mbps for multiple clients) and equitably shared the channel among multiple clients.
Cisco’s lightweight AP-1252 was capable of >125 Mbps but not with all clients. The mixed-client testing showed remarkable uniformity among clients but with lower aggregate throughput.
Meru’s AP-320 delivered poor performance with all but one of the clients, and exhibited inequitable channel sharing when used with multiple clients.
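The "roughly 5 times" claim in the summary can be sanity-checked against typical legacy numbers. The ~22 Mbps figure below is a common rule-of-thumb value for real-world 802.11a/g TCP throughput (54 Mbps PHY rate), not a figure from the report:

```python
# Sanity check of the "roughly 5x" claim.
# legacy_tcp_mbps is an assumed typical 802.11a/g value, not from the report.
legacy_tcp_mbps = 22.0   # rule-of-thumb real-world 802.11a/g TCP throughput
aruba_11n_mbps = 110.0   # ">100 Mbps" per the summary above

speedup = aruba_11n_mbps / legacy_tcp_mbps
print(f"Speedup vs. 802.11a/g: {speedup:.1f}x")  # prints "Speedup vs. 802.11a/g: 5.0x"
```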

Products Tested:

Although all of the APs were dual-band 2.4/5 GHz devices, the 5 GHz mode was used for the purpose of these tests in order to realize the maximum benefits of 802.11n.

All of the APs used a 3x3 MIMO antenna configuration. Additional pertinent set-up details are shown below.

ArubaOS (MODEL: Aruba6000-US), Version
AP Model: AP125
Mobility Controller Model: MMC-6000 w/ M3
Built-in antennas

Cisco IOS
Product/Model Number: AIR-AP1252AG-A-K9
System Software Version: 12.4(10b)JA
AP Model: AP1252AG
External patch antenna - AIR-ANT-5140V-R

Cisco LWAPP
System Software Filename:
AP Model: AIR-LAP-1252AG
WLAN Controller Model: WLC4402-12
External patch antenna - AIR-ANT-5140V-R

Meru
System Software Filename: 3.4SR3-112
AP Model: AP320
WLAN Controller Model: MC3100
External antennas

Test Network Topology

[Figure: Aruba test network topology diagram]

The WLAN infrastructure from all three vendors was deployed in accordance with published specifications.

The latest released software/firmware was loaded on all equipment, antennas were oriented vertically for optimum transmission/reception, and all clients were placed on a table 1 meter high.

All testing was conducted indoors on a clean channel with no other nearby interfering APs.
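The report does not name the traffic generator used for the TCP throughput tests. As an illustrative sketch only (not the test harness Aruba used), the metric itself can be reproduced with a minimal sender/receiver pair, here run over loopback:

```python
# Minimal TCP throughput measurement over loopback, illustrating the metric
# reported above. This is a sketch, not the harness used in the actual test.
import socket
import threading
import time

PAYLOAD = b"x" * 65536          # send in 64 KiB chunks
TOTAL_BYTES = 64 * 1024 * 1024  # 64 MiB total

def server(listener):
    conn, _ = listener.accept()
    received = 0
    while received < TOTAL_BYTES:
        data = conn.recv(1 << 20)
        if not data:
            break
        received += len(data)
    conn.close()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))  # pick any free port
listener.listen(1)
port = listener.getsockname()[1]
t = threading.Thread(target=server, args=(listener,))
t.start()

client = socket.create_connection(("127.0.0.1", port))
start = time.perf_counter()
sent = 0
while sent < TOTAL_BYTES:
    client.sendall(PAYLOAD)
    sent += len(PAYLOAD)
client.close()
t.join()
listener.close()
elapsed = time.perf_counter() - start

mbps = sent * 8 / elapsed / 1e6  # throughput in megabits per second
print(f"TCP throughput: {mbps:.0f} Mbps")
```

In the actual test the sender and receiver would sit on opposite sides of the wireless link, so the measured figure reflects the AP and client rather than the host's loopback path.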

Test Results: Aggregate Throughput

[Chart: 802.11n TCP throughput]

Test Results: Single Client and Mixed Client Aggregate Results

[Chart: single-client and mixed-client aggregate results]

Key Findings: Single Client Tests

The primary purpose of the single-client throughput tests was to determine which client laptop delivered the highest throughput on each of the three WLAN infrastructures.


Aruba:

Throughput for the single-client tests was consistently >125 Mbps for all client types.
Top performance of 169 Mbps was observed with the MacBook Pro (Atheros chipset).
Mac OS clients exhibited higher throughput than Windows XP / Vista clients regardless of vendor.

Cisco IOS:

Demonstrated good compatibility with all clients during single client tests, but with lower than expected throughput of 85 Mbps for all client types.

Cisco LWAPP:

Observed throughput was greater than the Cisco IOS AP.
Single client throughput >100 Mbps for both MacBooks and 3x3 Intel clients.
Best performance observed with MacBook Atheros client.


Meru:

Very poor throughput with the HP/Compaq (3.1 Mbps) and Apple MacBook Air (2.0 Mbps) clients, which use Broadcom chipsets.
Best performance with Atheros (>135 Mbps), lower throughput with Intel chipsets.

Key Findings: Mixed Client Tests


Aruba:

Good ability to scale, as demonstrated by aggregate throughput in the mixed-client tests almost equaling the average throughput of individual clients.
Fair distribution of air-time and throughput across all clients and client combinations in the mixed-client tests.
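"Fair distribution of air-time and throughput" can be quantified with Jain's fairness index, a standard metric for this purpose (the report does not say which fairness metric, if any, it computed). The per-client throughput figures below are illustrative, not taken from the report:

```python
# Jain's fairness index: (sum x)^2 / (n * sum x^2).
# 1.0 means a perfectly even split; 1/n means one client takes everything.
def jain_index(throughputs):
    n = len(throughputs)
    total = sum(throughputs)
    return total * total / (n * sum(x * x for x in throughputs))

# Illustrative per-client throughputs in Mbps (assumed, not from the report):
even_split = [38.0, 37.0, 39.0, 36.0]  # roughly equitable channel sharing
starved    = [135.0, 3.1, 2.0, 5.0]    # one client dominating airtime

print(f"even:    {jain_index(even_split):.3f}")  # close to 1.0
print(f"starved: {jain_index(starved):.3f}")     # well below 1.0
```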

Cisco IOS:

Poor performance in mixed-client tests that included clients with Broadcom chipsets.
Total throughput for mixed-client tests was half of the individual client numbers.

Cisco LWAPP:

Multiple-client split was relatively uniform but total throughput was lower than expected.


Meru:

Aggregate throughput for mixed-client tests was about half that of the individual-client tests, suggesting scalability problems in mixed-client environments.
One or two clients dominated airtime and thereby starved the other clients.

Aruba Test Conclusion:

The type of Wi-Fi Alliance 802.11n Draft 2.0 certified client used in a network should not affect network performance, and indeed that was the case for Aruba’s 802.11n infrastructure.

In stark contrast, however, client type and client mix profoundly affected the throughput performance for other WLAN infrastructure vendors.

The issue is not the clients, since they exhibited high throughput on the Aruba WLAN, but instead a design or implementation problem in the other WLAN infrastructure.

The Aruba WLAN exhibited aggregate, multiple client throughput that approached the maximum throughput observed for any individual client.

The Aruba WLAN also demonstrated airtime fairness across all clients which resulted in very even throughput distribution across all clients.

Neither Cisco nor Meru delivered consistently high throughput or universal airtime fairness.

The Cisco APs exhibited poor throughput performance that was typically <100 Mbps.

Meru delivered high throughput only with the Apple/Atheros client, and caused client starvation and/or poor throughput for all other clients.

There is every reason to believe that end users will experience similar client performance, or lack thereof, in actual WLAN deployments.

The best means to avoid trouble is for integrators and/or end users to run tests with a representative set of clients that are expected to be used in the final deployment.

Only in so doing are they likely to observe potential client starvation and throughput problems.

The tests demonstrate that Aruba’s 802.11n solution has much to offer enterprise WLAN users in terms of 802.11n AP throughput, scalability, multi-client environment performance, and client airtime fairness across a diverse range of clients – all key criteria for a successful 802.11n WLAN deployment.

View the entire 17-page Test Document.

Do YOU agree with Aruba's conclusion of their test results?

Did Aruba skew the test results, or perhaps only test areas where Aruba knew it would get good grades?

Were there any particular aspects that Aruba failed to test, and if so, why?

Is it possible that Aruba did test those elements but didn't get good results and that's why they're not in the results?

Hopefully, readers will help Network World look beyond what's written in the test results. Give us your feedback!

View 55 more Cisco vs. Competitor Lab Test Results.

Contact Brad Reese

