In the previous post I wrote about the difficulty – quite common among networking and IT engineers – of expressing concepts that we understand in quantifiable terms that have meaning to the CFO or anyone else who must fund our projects.

The problem is that we engineers work in meterable, precise quantities. If you are asked about the bandwidth utilization of a certain link or the number of packets dropped at a certain interface over a given time period, you can go get an exact number to answer the question. But when we cannot come up with a precise number for something, we tend to get tripped up. Sometimes we dismiss as immeasurable – *intangible* – anything to which we cannot assign a firm numeric value.

CFOs, on the other hand, regularly work with estimates and projections and so are more comfortable with them than we are. They make business decisions by analyzing the data available to them and then determining the probabilities that a decision will give them the results they want.

There’s a disconnect because engineers and business people think differently about information and use it differently. So when you, as an engineer or CTO, propose a new networking project and say it will reduce risk in the network, and the CFO asks “by how much?”, you might be tempted to dismiss or avoid the question because, you think, you can’t put a concrete number to “risk” with absolute certainty. Risk appears to be an intangible.

The CFO, on the other hand, wants to hear an answer such as, “We are 85% certain that this project will reduce the network risk by 30%, resulting in an annual savings of $175,000 over the next five years.”
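Taken at face value, a statement like that translates directly into an expected-value figure the CFO can act on. A back-of-the-envelope sketch, using the hypothetical numbers from the sentence above:

```python
# Hypothetical figures from the example statement above
confidence = 0.85         # probability the project delivers the risk reduction
annual_savings = 175_000  # projected savings per year if it does
years = 5

# Expected savings = probability of success x payoff if successful
expected_savings = confidence * annual_savings * years
print(f"${expected_savings:,.0f}")  # $743,750
```

Even though no single number in that calculation is certain, the result is something a business decision can be weighed against.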

Again, you’re used to measuring something and deriving a definite number in which you are 100% confident. You can’t put a definite number around “risk reduction.”

Douglas Hubbard, in his wonderful little book *How to Measure Anything: Finding the Value of Intangibles in Business*, describes a question he asks his seminar attendees: How would you measure the number of fish in a lake?

Someone usually suggests draining the lake. A thorough team might load each fish into a truck as it is counted to avoid double-counting, and perhaps make multiple sweeps of the drained lakebed to ensure that no fish is missed.

If such a project were to be executed, it would yield an almost exact fish count. The problem is that the project is self-defeating: In the end all the fish are dead and the lake is gone.

A marine biologist would have a different approach. He might suggest catching 1,000 fish, tagging them, and releasing them. After giving the fish time to mingle back into the general fish population, he again catches 1,000 fish and counts how many have been tagged. Let’s say 25 of the 1,000 re-caught fish have tags: That’s 2.5% of the re-caught fish. He knows there are 1,000 tagged fish in the lake, so those 1,000 fish represent 2.5% of the total fish population. Therefore there are approximately 40,000 fish in the lake.
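The biologist's tag-and-recapture method is the classic Lincoln-Petersen capture-recapture estimate from ecology. A minimal sketch of the arithmetic (the function name is just for illustration):

```python
def estimate_population(tagged, second_catch, tagged_in_second_catch):
    """Lincoln-Petersen capture-recapture estimate.

    tagged: fish tagged and released in the first pass
    second_catch: size of the second catch
    tagged_in_second_catch: tagged fish found in the second catch
    """
    # Assume the tagged fraction of the second catch approximates the
    # tagged fraction of the whole lake:
    #   tagged / N  ≈  tagged_in_second_catch / second_catch
    return tagged * second_catch / tagged_in_second_catch

# 25 of 1,000 re-caught fish carry tags -> 2.5% -> ~40,000 fish
print(estimate_population(1000, 1000, 25))  # 40000.0
```

The estimate rests on assumptions (tagged fish mix evenly, every fish is equally catchable), which is exactly why the answer is a measurement rather than a count.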

The point Hubbard is making with that anecdote is that people often confuse *measure* with *count*. Where a count gives you an exact number with 100% confidence in its accuracy, *a measurement is a reduction in uncertainty*.

The marine biologist knows that there are not exactly 40,000 fish in the lake, but his measurement is close enough to the actual count for his purposes. A carpenter might measure a length of molding to be cut at 5 feet, 11/32 inches; he knows that the measurement is not exact down to the micrometer or even millimeter range, but it’s close enough to make a good joint.

There are usually points at which continued reduction of uncertainty no longer adds significant value and can even reduce the value of the information. The carpenter could buy expensive scientific instrumentation that would make his measurements accurate down to the micrometer range, but the improvement to the fit of his molding cuts would be barely noticeable over what he can get with his measuring tape. Draining the lake to get an exact fish count is an enormously expensive project with unacceptable ecological impacts.

Turning back to networking projects:

- How do we measure things that we might think of as intangibles, such as risk and security?

- How do we decide when a measure is “close enough?”

- How do we determine the value of the information we are deriving from the measurement?

In the next post we will look at some of the strategies for turning concepts into things that can be measured.

A quick note: I’ve been thinking for some time about doing a series on QoS. But while I’ve been thinking about it, Dennis Hartmann has actually been doing it. If you haven’t already seen his series, have a look. Great coverage of a complicated and misunderstood topic!