
Testing against oneself – the component wars

Mar 01, 2004

Anyone who’s followed The Tolly Group for more than a few months knows that IT vendors often ask us to conduct competitive tests. Late last year, however, we received a request of a kind we’d never had before: a technology provider asked us to run a test, made public last December, of Linksys vs. Linksys.

Strange? Was it Linksys wanting to show that its new gear is better than its old gear? No. The test was sponsored by a vendor of the network processor that went into one of the Linksys boxes – but not the other.

So who won? Well, Linksys, of course. The tests showed that the Linksys box containing the sponsor’s “core” achieved almost 95M bit/sec of throughput, while the Linksys box containing “Brand X” hovered just above 20M bit/sec.

More and more, box vendors pick and choose hardware and software components from multiple suppliers. In our example, two boxes that retail for roughly the same price, from the same vendor, have throughput ceilings that differ by more than a factor of four.

As a network manager, I breathe a sigh – and not a sigh of relief. I sigh because I realize I can’t just pick a brand and stick with it. I can’t just expect that the newer model will perform better than the older one. With the possibility of the raw components for each successive model being sourced differently, I don’t know what to expect.

Bill of materials (BOM) cost differences of a few dollars can determine which components go into the next generation of gear. After all, many of these items, such as access points and broadband routers, will be manufactured in huge volumes. Then, even small BOM cost differences add up.
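The volume arithmetic above can be sketched quickly. The figures below are purely hypothetical assumptions for illustration, not numbers from the Linksys tests:

```python
# Hypothetical illustration: a small per-unit BOM saving becomes
# significant at consumer-gear volumes. All figures are assumed.

def bom_savings(per_unit_delta: float, units: int) -> float:
    """Total savings from a per-unit BOM cost difference."""
    return per_unit_delta * units

# Assume a network processor that costs $2 less, across a
# 500,000-unit production run of broadband routers.
savings = bom_savings(2.00, 500_000)
print(f"${savings:,.0f}")  # → $1,000,000
```

At that scale, a component swap that shaves a couple of dollars per unit is worth a million dollars to the box vendor, which is why sourcing can change between product generations.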

Being unable to rely on brand name alone, what is a network manager to do? Even if one had the time and interest to learn whose network processors and software stacks went into each device, finding that out isn’t easy.

I don’t remember seeing a single datasheet (yet) that referenced the underlying network processor, let alone the various software stacks involved.

For our tests, the only way we could know for sure which network processor was being used was to pry open the box (which often voids whatever warranty existed) and scan the markings on the chips. Finding out the genesis of the software components is often difficult, if not impossible.

And don’t expect the box vendors to help. It is well known in the industry that most vendors of low-end gear build little or none of it themselves (there are exceptions). Vendors, though, have no interest in users getting the impression that their product line is nothing but a mish-mash of components of varying quality from a frequently changing list of suppliers.

End users like to feel that a given box brand gives them consistency. Our “Linksys vs. Linksys” testing illustrates that, even with leading brands, that is not the case.

So long as users ignore this situation, the box vendors will be happy. It is interesting, though, that while the box vendors do very little competitive testing, their component vendors are determined to make stark performance differences public: if not directly to the end user, then indirectly to the box vendors.