"You can fool all of the people some of the time, and some of the people all of the time. But you can't fool all of the people all of the time." - Abe Lincoln

"You can fool some of the people all of the time, and those are the ones to do business with." - The occasional storage vendor

Within the last two weeks, I have seen some impressive performance claims from a number of vendors. In some cases the numbers were well documented and showed that the vendor had run the tests on a level playing field, which is to say an industry-standard testing environment. In other instances, it was apparent that some vendors subscribe to the philosophy that "storage makes you stupid": they had created their own environment for testing, succeeded masterfully within it, and were taking the numbers public. After all, if the numbers were good, who cares how they got them?

The answer, of course, is that you do.

These days there is less and less excuse for vendors to create optimally tuned test beds in which to run their equipment against non-standard tests. To be sure, in some situations generalized tests will be less useful than testing against specific configurations running specific applications or handling specific loads. But in a generalized situation, testing against a standard is best.

One place to go is the Standard Performance Evaluation Corporation (SPEC), an independent benchmarking organization that provides a number of standardized, objective benchmarks. SPEC comes out of the engineering design and high-performance computing world, and it offers many tests that are clearly aimed at technical sites. It also, however, offers a set of storage file-system tests that are appropriate for evaluating commercial systems.

These tests and their results are all available for public review. For example, SPEC SFS97_R1 (sometimes appearing as "SPECsfs" or "SFS 3.0") is the latest version of a test suite that benchmarks NFS file servers.
If you want to see which boxes are performing well in the SPECsfs tests, go to the SPEC SFS97_R1 Results page, where the tests are listed on a quarter-by-quarter basis. Pick a set of tests to review. The first thing you will notice is the throughput number: the number of operations per second. Don't just look at the numbers, however. Scroll down the results page to see each configuration that was run. Then make sure you look at the notes at the bottom. These will sometimes be extensive, and will give you a complete view of what was being tested, helping you avoid comparing the results of a low-end device with those of more expensive machines.

A nice touch is that, unlike some other benchmarks, these tests are not very costly to acquire. In fact, the price is so low that even the smallest startup can afford to get in the game. For example, SPEC SFS97_R1, the NAS test, costs a vendor only $900 to buy. Perhaps the best news is that the results are available, free, to anyone who cares to wander over to the SPEC Web site.

During most of my career, I worked for vendors, and I frequently heard the term "specsmanship" used to describe the way data could be influenced to make a product appear better than it really was. The level playing field of industry-standard benchmarks will go a long way toward making the term less prevalent, and will bring IT users a step closer to more informed decision-making.