The Butterfly Bug
Edward Lorenz died this past Thursday at age 90. This is as good an excuse as I will ever have to recount the story of his discovery (which I first learned from the book Chaos by James Gleick). It is a tale of math and the proper use of computers.
Edward Lorenz had majored in math, but after working as an Air Corps weather forecaster during WWII he found himself working as a meteorologist. In 1961 he was one of the first to seriously apply computer modeling to weather prediction. He ran simulations that took hours to compute and printed their output every so often. One day he wanted to continue an interrupted run; to save time, rather than start the simulation over from the beginning, he started it by typing in the numbers from one of the printouts near the end of the previous run.
He didn't start from the FINAL printout because, of course, there was a danger of typing in a value incorrectly, and he wanted to check against the next couple of printouts in case he'd made a typo. Curiously, the numbers on the subsequent printouts were close, but not exactly right! A normal person would have shrugged their shoulders and ignored it, but Lorenz investigated further.
After some experimentation, he discovered that the discrepancy was due to rounding: he was typing in 0.506 when the computer was storing 0.506127. The result was that the output diverged, slowly at first but eventually giving a completely different result.
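The effect is easy to reproduce yourself. Here is a minimal sketch in Python, using the logistic map (a simple chaotic recurrence, not Lorenz's actual weather model, which we don't have) and the two values from his story:

```python
# Iterate the logistic map x -> r*x*(1-x), a simple chaotic recurrence,
# as a stand-in for Lorenz's weather model.
def iterate(x, steps, r=3.9):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

full = 0.506127     # the value the computer actually stored
rounded = 0.506     # the value typed in from the printout

# Early on the two runs agree closely; after enough steps they
# bear no resemblance to each other.
for n in (5, 20, 50):
    print(n, iterate(full, n), iterate(rounded, n))
```

The two trajectories track each other for a while, then separate completely, just as Lorenz's printouts did.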
At this point a normal person would have shrugged and simply started over from the beginning, but Lorenz realized there was something deeply wrong here. The weather measurements he was working from were not accurate to six decimal places, and if accuracy at that level mattered, then perhaps all the research he was doing was meaningless! Further investigation showed that he could reproduce the problem and that it wasn't due to an error in his program: if there was a bug, it lay not in his code but in the mathematics behind it!
He tried simplifying his equations, seeking the simplest case that still exhibited the behavior (a standard debugging technique used by programmers everywhere). Eventually he arrived at a system of just three equations that exhibited this property, dubbed "Sensitive Dependence on Initial Conditions". Under certain circumstances (which apply to weather prediction), tiny variations in input values grow over time until they completely alter the results. This discovery was popularized as the "butterfly effect": a butterfly choosing whether or not to flap its wings could alter weather patterns months later, half the world away. This is why computers have gotten hundreds of times faster and bigger, yet weather forecasters still only predict about a week in advance.
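Those three equations are now famous as the Lorenz system. A minimal sketch, using a plain Euler integration step (not Lorenz's original numerical method) and his classic parameters sigma=10, rho=28, beta=8/3, shows two runs that differ only in the sixth decimal place drifting completely apart:

```python
# One Euler step of the Lorenz system:
#   dx/dt = sigma*(y - x)
#   dy/dt = x*(rho - z) - y
#   dz/dt = x*y - beta*z
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def run(x0, steps=3000):
    # Integrate from (x0, 1, 1) for `steps` Euler steps.
    state = (x0, 1.0, 1.0)
    for _ in range(steps):
        state = lorenz_step(*state)
    return state

a = run(1.0)
b = run(1.000001)   # perturbed in the sixth decimal place
print(a)
print(b)
```

For a short while the two trajectories are indistinguishable; by the end of the run they are in entirely different parts of the attractor.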
Of course, it goes far beyond the weather. Lorenz's discovery proved to be at the root of an entire field of mathematics (now called "Chaos Theory") which encompasses many oddities, from strange attractors to fractals.
There are several lessons in this story. You have to be willing to question your fundamental assumptions: a tiny difference in the last decimal place turned out to contain a discovery that undermined his entire field of study... and opened up a new one. There's also a lesson about computers: they can be very helpful, but too much trust in them can lead you astray. Invalid input, invalid assumptions, invalid equations, odd mathematical chaos... there are many pitfalls. And finally, an experiment that fails may in the end be even more useful than one that succeeds.