Studying the Failures

In 1943, the statistician Abraham Wald was asked to advise the US Air Force on how to reinforce its planes. Only a limited weight of armor plating was feasible, and the proposal on the table was to reinforce the wings, the center of the fuselage, and the tail. Why? Because bombers were returning from missions riddled with bullet holes in those areas.

Wald explained that this would be a mistake. What the Air Force had actually discovered was that when planes were hit in the wings, the tail, or the central fuselage, they made it home. Where, asked Wald, were the planes that had been hit in other areas? They never returned. Instead, Wald suggested reinforcing the planes wherever the surviving planes had been unscathed.

It’s natural to look at life’s winners – they’re visible, and they’re interesting to look at. But there is an important lesson here: if we don’t look at life’s losers too, we may end up putting our time, money, attention, or even armor plating in entirely the wrong place.

How does this apply to technology?

We usually learn more from failures than we do from successes.

Sometimes the failures are fatal, and it’s important for the survivors to learn from them. A number of early skydivers died because two of the handles on their parachutes were similar in shape, and some spent the last moments of their lives pulling on the wrong handle because they pulled without looking. The same thing happened to some pilots, when important cockpit levers were similar in shape and sat right next to each other. In 1983, a Boeing 767 ran out of fuel mid-flight because the fueling calculation confused pounds with kilograms. The Hubble Space Telescope’s main mirror was flawed for a similar reason: an instrument used to test the mirror’s shape had been assembled incorrectly, and nobody caught the error until the telescope was already in orbit.
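One common software defense against the pounds/kilograms class of mistake is to refuse to treat a bare number as a quantity at all. Here is a minimal sketch of that idea; the class and the fuel figures are hypothetical, not taken from any real avionics or fueling system:

```python
# A minimal sketch of unit-tagged quantities: every mass carries its unit,
# and arithmetic converts explicitly instead of assuming.

class Mass:
    def __init__(self, value, unit):
        if unit not in ("kg", "lb"):
            raise ValueError(f"unknown unit: {unit}")
        self.value = value
        self.unit = unit

    def to_kg(self):
        # 1 lb = 0.45359237 kg exactly
        return self.value * 0.45359237 if self.unit == "lb" else self.value

    def __add__(self, other):
        # Both sides are reduced to kilograms before adding, so a figure
        # written down in pounds can never silently be treated as kilograms.
        return Mass(self.to_kg() + other.to_kg(), "kg")

# Hypothetical fueling example: one figure in kg, one in lb.
fuel_on_board = Mass(7682, "kg")
fuel_added = Mass(4917, "lb")
total = fuel_on_board + fuel_added
print(round(total.to_kg()), total.unit)
```

The point of the design is that the conversion happens in exactly one place, instead of in the head of whoever happens to be doing the arithmetic that day.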

While it’s always nice to pay attention to what we’re doing, in the real world we don’t always manage it. So designers learned to “idiot-proof.” Few of us are actually idiots, but idiotic things happen when we’re under pressure or not paying attention. If you don’t believe me, next time you’re on the highway, count the tailgaters!

Idiot-proofing in technology is usually a good thing. That’s why a Windows computer asks, “Are you sure you want to send this file to the Recycle Bin?” Windows 8 turned that prompt off in its factory settings (probably to reduce user annoyance), but I think shipping without such a warning is a very bad idea.
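The pattern behind that prompt is simple to build into any destructive operation. Here is a sketch in Python; the function and file names are hypothetical:

```python
# A minimal confirm-before-delete guard, in the spirit of the Recycle Bin prompt.

def confirm(prompt, reader=input):
    """Return True only if the user explicitly answers yes."""
    answer = reader(prompt + " [y/N] ").strip().lower()
    return answer in ("y", "yes")

def delete_file(path, reader=input):
    if not confirm(f"Are you sure you want to delete {path}?", reader):
        return "kept"
    return "deleted"

# Defaulting to "No" means a stray Enter keypress does nothing destructive.
print(delete_file("report.docx", reader=lambda _: ""))   # prints "kept"
```

The detail that matters is the default: anything that isn’t an explicit “yes” leaves the file alone.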

The problem with warning messages is that most users see so many of them that they start to ignore them, like the villagers in the story of the boy who cried wolf. Some programs go much further. For instance, if I want to delete a mailing list or a blog post, I have to actually type the word “DELETE” to do it. I like that. It makes it almost impossible to delete something valuable with a stray mouse click.
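That stronger, typed-confirmation pattern can be sketched just as easily. The names below are hypothetical, and real services vary in the exact word they ask for:

```python
# A sketch of the "type the word DELETE" pattern: a click or a half-typed
# word is not enough; only the exact word proceeds.

def confirm_destructive(resource, reader=input):
    """Only proceed if the user types the exact word DELETE."""
    typed = reader(f'Type DELETE to permanently remove "{resource}": ')
    return typed == "DELETE"   # exact match: no lowercase, no extra spaces

def delete_mailing_list(name, reader=input):
    if confirm_destructive(name, reader):
        return f"{name} deleted"
    return f"{name} kept"

print(delete_mailing_list("customers", reader=lambda _: "delete"))  # kept
print(delete_mailing_list("customers", reader=lambda _: "DELETE"))  # deleted
```

Requiring a deliberate act of typing converts a reflexive click into a conscious decision, which is exactly the failure mode the warning dialogs above can’t catch.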

Some crises occur when the two parties are speaking different languages and don’t know it. The pounds/kilograms mix-up on that 767 is a classic example of not being on the same page.

Some disasters are due to a failure of imagination. If nobody imagines that an X-ray machine operator will ignore a cryptic warning message and give a patient a massive radiation overdose, sooner or later it will happen – it did, with the Therac-25 radiation therapy machine in the 1980s. Nobody imagined that airliners could be turned into flying bombs with a fanatic at the controls… And we found out how wrong we were on September 11, 2001.

The problem is that, to an extent, we are always preparing for past disasters. We try to prevent the last hijacking while the enemy is already planning something else. The only way to avert (most) disasters is to recognize how fallible (and sometimes evil) human nature really is, and to imagine worst-case scenarios before they happen. But, sadly, sometimes people have to die before a flaw is discovered.

For more spectacular failures, see:

For even more empowering technology info, read my new book, “Deciphering the 21st Century,” available now!



I’d love to hear your comments!