# Law of Averages Overturned

**by Tony Kontzer**

All those business school lessons about using historical averages to predict probable outcomes? Apparently, that may have been precisely the wrong approach.

In his new book, The Flaw of Averages (2009, John Wiley & Sons, Inc.), Sam Savage, a consulting professor of management science and engineering at Stanford University, posits that averages are a hopelessly inadequate statistic for assessing risk, and that it's time for business decisions to be made using more accurate predictors.

One possible answer, Savage claims, is the emerging field of "probability management," which draws on a type of data Savage has helped define, known as "distribution strings." Unlike single numbers, distribution strings represent the unknown directly, making it possible to take uncertainty into account when determining a probability. So instead of a spreadsheet cell containing one number, it contains a distribution string of thousands of numbers--called Monte Carlo trials--that are constantly updated any time another cell in the spreadsheet changes. The idea is that risk isn't a fixed number--it's an ever-changing collection of interrelated averages.
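To make the idea concrete, here is a minimal sketch of what a distribution-string-style spreadsheet cell might look like. This is not Savage's or Frontline's actual implementation; it simply assumes a "cell" is an array of Monte Carlo trials, so arithmetic on cells propagates uncertainty trial by trial instead of collapsing everything to one average.

```python
import random

random.seed(0)     # fixed seed so the sketch is reproducible
TRIALS = 10_000    # number of Monte Carlo trials per "cell"

def uncertain(low, high):
    """A hypothetical cell: one simulated value per trial, not a single average."""
    return [random.uniform(low, high) for _ in range(TRIALS)]

# Two uncertain inputs (illustrative numbers only).
cost = uncertain(80, 120)
revenue = uncertain(90, 150)

# Arithmetic is performed per trial, so the result is itself a distribution.
profit = [r - c for r, c in zip(revenue, cost)]

avg_profit = sum(profit) / TRIALS
# Unlike an average alone, the distribution also answers risk questions:
p_loss = sum(1 for p in profit if p < 0) / TRIALS  # chance of losing money

print(f"average profit: {avg_profit:.1f}")
print(f"probability of a loss: {p_loss:.3f}")
```

The point of the design is the last line: the average of `profit` matches what a single-number spreadsheet would show, but the chance of a loss is information the average alone can never give you.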

"The flaw of averages happens when people plug a single number into a cell to represent a probability," Savage said during a recent interview. "Think of taking a spreadsheet and adding a third dimension to it. Any cell in your spreadsheet should be able to provide you with an average."

It's a big concept to get one's brain around, so, to illustrate the problem, consider this crude example of how Savage believes averages doom many IT projects: Take a software development project in which 10 separate teams are each working on a particular sub-routine, with no interdependence between them at all. The project manager isn't sure how long each sub-routine will take, but he knows the average is 3 months, so that's what he relays to the boss when pressed.

Unfortunately, according to Savage, there is only one chance in a thousand that the project will be done in 3 months--the same odds as flipping a quarter and having it come up heads 10 straight times. The reason: each team has roughly a coin-flip's chance of beating its own average, and the project finishes on time only if all 10 do. In the end, the boss is unhappy, the project manager is held responsible, and stress levels for the next development project rise. Ultimately, companies find themselves resigned to accepting that most software projects will take longer than expected, when in fact the problem is that the practice of using historical averages to compute probable completion dates is fundamentally flawed.
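You can check Savage's coin-flip arithmetic with a quick simulation. The sketch below assumes each team's duration varies symmetrically around the 3-month average (here, uniformly between 1 and 5 months--an illustrative assumption, not from the book); the project is done only when the slowest team is done.

```python
import random

random.seed(42)
TRIALS = 100_000
TEAMS = 10
AVG_MONTHS = 3.0

done_on_time = 0
for _ in range(TRIALS):
    # Assumed duration model: symmetric around the 3-month average,
    # so each team has a 50% chance of beating it.
    durations = [random.uniform(1.0, 5.0) for _ in range(TEAMS)]
    # The project finishes only when the slowest sub-routine finishes.
    if max(durations) <= AVG_MONTHS:
        done_on_time += 1

probability = done_on_time / TRIALS
print(f"chance all 10 teams finish by the average: {probability:.4f}")
```

The simulated probability comes out near (1/2)^10 ≈ 0.001--about one in a thousand, just as Savage says--even though every individual team's *average* duration is exactly 3 months.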

The way Savage sees it, this dependence on averages has had some pretty grave consequences, which the recession highlights. "We ignored this, and we flew into the side of a cliff," says Savage. He expects that risk analysis will be handled quite differently going forward. "There will be a tremendous move, after this recent meltdown, of people trying to understand risk and uncertainty better."

For IT, in addition to rethinking how project timelines are estimated, this is likely to mean deploying and supporting tools that can handle this ramped-up focus on risk and uncertainty. New applications will be needed for processing distribution string data, and then for analyzing that data to determine probabilities.

In the meantime, if you want to give probability management a test drive, you have two options. You can purchase Frontline Systems' RiskSolver application, which is the brainchild of company President Dan Fylstra, who was the original distributor of VisiCalc back when the Apple II made its debut in 1979. Or, you can try downloading Savage's less powerful--and less expensive--XLSim.

And you may want to brush up on your knowledge of probability management, distribution strings and Monte Carlo trials--something tells me your ability to sufficiently analyze risk and uncertainty will soon depend on it.

Related: Scott Rosenberg on the difficulties of software development.