Nate Silver is my favorite statistician precisely because he understands the fundamental importance of uncertainty:
The Weather Service has struggled over the years with how much to let the public in on what it doesn’t exactly know. In April 1997, Grand Forks, N.D., was threatened by the flooding Red River, which bisects the city. Snowfall had been especially heavy in the Great Plains that winter, and the service, anticipating runoff as the snow melted, predicted that the Red would crest to 49 feet, close to the record. Because the levees in Grand Forks were built to handle a flood of 52 feet, a small miss in the forecast could prove catastrophic. The margin of error on the Weather Service’s forecast — based on how well its flood forecasts had done in the past — implied about a 35 percent chance of the levees’ being topped.
The waters, in fact, crested to 54 feet. It was well within the forecast’s margin of error, but enough to overcome the levees and spill more than two miles into the city. Cleanup costs ran into the billions of dollars, and more than 75 percent of the city’s homes were damaged or destroyed. Unlike a hurricane or an earthquake, the Grand Forks flood may have been preventable. The city’s flood walls could have been reinforced using sandbags. It might also have been possible to divert the overflow into depopulated areas. But the Weather Service had explicitly avoided communicating the uncertainty in its forecast to the public, emphasizing only the 49-foot prediction. The forecasters later told researchers that they were afraid the public might lose confidence in the forecast if they had conveyed any uncertainty.
Since then, the National Weather Service has come to recognize the importance of communicating the uncertainty in its forecasts as completely as possible. “Uncertainty is the fundamental component of weather prediction,” said Max Mayfield, an Air Force veteran who ran the National Hurricane Center when Katrina hit. “No forecast is complete without some description of that uncertainty.” Under Mayfield’s guidance, the National Hurricane Center began to pay much more attention to how it presents its forecasts. Instead of just showing a single track line for a hurricane’s predicted path, its charts prominently feature a cone of uncertainty, which many in the business call “the cone of chaos.”
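Silver's 35 percent figure can be reconstructed with a quick sketch. If we assume roughly normal forecast errors (an assumption on my part; the passage doesn't specify the error distribution), a 35 percent chance of topping the 52-foot levees from a 49-foot forecast implies a margin of error of about eight feet, and the actual 54-foot crest lands well inside one standard deviation:

```python
from statistics import NormalDist

forecast, levee, actual = 49.0, 52.0, 54.0

# Back out the implied standard deviation from the 35% exceedance
# probability, assuming normally distributed forecast errors.
z = NormalDist().inv_cdf(1 - 0.35)   # z-score for a 35% upper tail
sigma = (levee - forecast) / z       # implied margin of error, in feet

print(f"implied margin of error: about ±{sigma:.1f} ft")
print(f"actual crest: {(actual - forecast) / sigma:.2f} sigma above forecast")
```

With these numbers the implied sigma is roughly ±8 feet, so the 54-foot crest was only about 0.6 standard deviations above the point forecast, which is what Silver means by "well within the forecast's margin of error."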
Silver also understands how uncertainty can be undermined:
Unfortunately, this cautious message can be undercut by private-sector forecasters. Catering to the demands of viewers can mean intentionally running the risk of making forecasts less accurate. For many years, the Weather Channel avoided forecasting an exact 50 percent chance of rain, which might seem wishy-washy to consumers. Instead, it rounded up to 60 or down to 40. In what may be the worst-kept secret in the business, numerous commercial weather forecasts are also biased toward forecasting more precipitation than will actually occur. (In the business, this is known as the wet bias.) For years, when the Weather Channel said there was a 20 percent chance of rain, it actually rained only about 5 percent of the time.
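The wet bias is a failure of calibration: over many forecasts, a stated probability should match the observed frequency. A toy calibration check makes the idea concrete (the records below are made-up numbers echoing Silver's example, not real Weather Channel data):

```python
from collections import defaultdict

# Hypothetical (stated probability, did it rain?) records. The 20% bucket
# verifies at 5%, mirroring the wet bias Silver describes; the 50% bucket
# is well calibrated.
records = ([(0.2, True)] + [(0.2, False)] * 19
           + [(0.5, True)] * 5 + [(0.5, False)] * 5)

rained = defaultdict(int)
total = defaultdict(int)
for stated, wet in records:
    total[stated] += 1
    rained[stated] += wet  # True counts as 1

for stated in sorted(total):
    observed = rained[stated] / total[stated]
    verdict = "wet bias" if stated > observed else "calibrated"
    print(f"stated {stated:.0%} -> observed {observed:.0%} ({verdict})")
```

A forecaster who rounds 50 percent up to 60, or who systematically overstates rain chances, will show exactly this kind of gap between stated and observed frequencies.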
The same is true in the discipline of economics, where generations of modernist practitioners have attempted to domesticate and contain uncertainty (by reducing it to probabilistic certainty) instead of allowing it to flourish (and grappling with the resulting consequences for their concepts and methods).
As Silver explains, “It’s much easier to hawk overconfidence, no matter if it’s any good.”