Denial and Error

Errors have become something of a bad thing, but they need not be that way. Conceptually, an error should be a minor mistake or misjudgement, a simple slip-up, usually nothing too serious.

But this is not the world errors live in, because they live in our world, and in our world, errors become something much more grave. Our world is the world of the human, and if you think about it, all errors really boil down to human error at some point. What should be treated as a common wrinkle to be casually flattened out is instead treated as a glaring issue, something alarming that someone needs to be alerted to. Yellow warning signs, boxes erupting from the screen to rub our noses in our errors, red squiggly underlines pointing out mistaken homophones and finger slips. It gets to the point where errors in the world of humans start to look an awful lot like getting a paper back from a particularly anal high school English teacher. Is it any wonder people are fearful of computers when all they do is evoke tremors of high school nightmares?

This depraved treatment of errors in software should come as no surprise to anyone familiar with current software development. Those who write software are forced to write it for an unforgiving computer, and they are tasked with the grueling edict of coercing every decision into a zero or a one, a yes or a no, a right or a wrong. Is it any wonder the software itself is reflective of the computer it runs on?

Computer program writers are not the sole source of humanity’s maltreatment of errors, but they are amongst its most vicious perpetrators, possibly due to a hypersensitivity to the likelihood of errors. A software developer knows the likelihood of errors is high, and that an error is a commonplace and usually simple issue, and yet ironically little seems to be done to actually fix errors when they happen. Instead, the focus is on preventing them, which is a fool’s errand, because as we all know, a sufficient number of errors happen nonetheless.

Attempting to prevent errors feels natural, but the logic is specious. It feels natural because from a very young age, we’re taught errors are bad. We’re taught it isn’t the errors themselves that need correcting, but the making of them: we shouldn’t make errors in the first place, when what we really should be taught is how to learn from them when they happen. Parents tell their children not to cry over spilled milk, yet making a mess is still a cause for aggravation. A teacher tells students everyone is smart in their own way, and yet those who aren’t smart at passing contrived tests feel bad about their errors. Preventing errors seems natural to us because we’ve had the fear of them driven into us, not because there’s actually anything inherently bad about them.

As Ed Catmull of Pixar said:

The notion that you’re trying to control the process and prevent error screws things up. We all know the saying it’s better to ask for forgiveness than permission. And everyone knows that, but I think there is a corollary: if everyone is trying to prevent error, it screws things up. It’s better to fix problems than to prevent them. And the natural tendency for managers is to try and prevent error and overplan things.

Software developers are notorious time wasters when it comes to attempting to prevent errors. They’ll spend weeks trying to make the software perfect, provably perfect, all in the name of avoiding errors. They’ll throw and catch exceptions in a weak attempt at playing keepaway between an error and the user (a pattern sketched in code after the list below), but inevitably all balls get dropped. They’ll craft programming interfaces so flexible the framework can reach around and scratch the back of its own hand (this is called recursion). Abstract superclasses, class factories, lightweight objects (whatever the hell those are), all in the name of some kind of misplaced mathematical purity never reached in the admittedly begrimed world of software development. These dances of the fingers ultimately come down to attempts at preventing errors in the system itself, but they too are folly, because in the future, one of two things will happen:

  1. The system will change, but the developers couldn’t have predicted in which ways, so all the preparations for preventing this implicit error were incorrect, and need to be fixed anyway.
  2. The system will not change, and so all the preparations were in vain.
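To make the keepaway concrete, here’s a small sketch in Python. The scenario and names (a `prefs.json` settings file, the `load_preferences_*` functions) are invented for illustration; the point is the contrast between catching an error only to lob it at the user, and treating the error as expected and recovering on the spot.

```python
import json

PREFS_PATH = "prefs.json"  # hypothetical settings file for this example

def load_preferences_keepaway():
    """Keepaway: catch the error only to lob it back at the user."""
    try:
        with open(PREFS_PATH) as f:
            return json.load(f)
    except (OSError, json.JSONDecodeError) as e:
        print(f"ERROR: could not read preferences: {e}")  # scold the user
        raise  # ...and drop the ball anyway

def load_preferences_recovering():
    """Recovery: treat the error as expected and fix it on the spot."""
    defaults = {"theme": "light", "font_size": 12}
    try:
        with open(PREFS_PATH) as f:
            return {**defaults, **json.load(f)}
    except FileNotFoundError:
        return defaults  # first run, or the file was deleted: defaults are fine
    except (OSError, json.JSONDecodeError):
        # Corrupt or unreadable file: start over with defaults, and let the
        # details reach the development team rather than the user.
        return defaults

if __name__ == "__main__":
    print(load_preferences_recovering())
```

Neither version predicted the error away; the second one simply has somewhere sensible to land when it happens.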

At first, it seems software developers treat software as though it were still grooves and dots punched into pieces of paper, shipped off to be fed into the mouth of a husky mainframe in another country. Immutable, unmalleable, unchangeable program code, doomed to prevent only the errors its developers could predict. But at least punch cards are flexible. Instead, it seems more like the program code has been chiseled into stone. That’s it. You prevent some errors and punt the rest off to the user, to make them feel bad about it.

We deny these errors. We deny them and pass them off to other systems, computer or person. We treat errors as something shameful to deal with and something shameful to have caused. But errors are no big deal. Errors should be expected, and inherent in the design. At the debugging level, errors should be expected and surfaced to every level of a development team, so they can always be tracked down quickly. At the organizational level, errors should be seen as a chance to infer new information about the organization’s strengths and weaknesses. From the user’s perspective, errors are a chance to explore something off the beaten path.
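To ground that debugging-level point, here’s a minimal Python sketch. It assumes a plain local log file standing in for whatever crash-reporting pipeline a team actually uses; the file name and the user-facing wording are invented.

```python
import logging
import sys
import traceback

# Route every unhandled error into a team-visible log; "errors.log" is a
# stand-in for a real crash reporter or error-tracking service.
logging.basicConfig(filename="errors.log", level=logging.ERROR)

def report_error(exc_type, exc_value, exc_tb):
    # Record the full context so anyone on the team can track it down quickly.
    details = "".join(traceback.format_exception(exc_type, exc_value, exc_tb))
    logging.error("Unhandled error (expected by design):\n%s", details)
    # The user gets a calm note instead of a scolding dialog.
    print("Something went sideways; it has been noted. Carry on.")

# sys.excepthook runs for any exception nobody caught. It can't keep the
# process alive, but it does decide what gets recorded and what the user sees.
sys.excepthook = report_error
```

The error still happened; the difference is that it lands as information for the team instead of shame for the user.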

Errors allow for spontaneity and for exploration. Errors allow for that angular square you went to school with to loosen up and meet some new curves. Errors in DNA created you and me. How can software change if we embrace, instead of deny, errors in our systems?
