Mind Traps That Aren’t Deadly, But Are Disruptive

Psychology Today has an interesting article called “Deadly Mind Traps: Simple cognitive errors can have disastrous consequences—unless you know how to watch out for them” that describes a number of, well, cognitive errors that have caused people to make mistakes that led to their doom.

Although we’re not really working with literal doom on a daily basis in software development (though, increasingly, the errors we introduce, uncover, or fail to discover in software do lead to literal doom), the same mental errors can lead to problems within software development projects and the quality assurance work within them.

So let’s look at the traps listed in Psychology Today:

Redlining

Hall [a mountaineer who summitted Mount Everest but died on the descent] fell victim to a simple but insidious cognitive error common to many types of high-pressure undertakings. I call it “redlining.” Anytime we plan a mission that requires us to set a safety parameter, there’s a risk that in the heat of the moment we’ll be tempted to overstep it. Divers see an interesting wreck or coral formation just beyond the maximum limit of their dive tables. Airplane pilots descend through clouds to their minimum safe altitude, fail to see the runway, and decide to go just a little bit lower.

You see a lot of this sort of behavior on timelines, where, close to milestones or the end of a cycle, someone pushes in changes anyway. Sometimes you see it when someone wants to use some unproven, misunderstood piece of technology to solve a problem, but the learning curve proves too steep to get things done on time, or someone has to kludge a solution at the last minute to meet the need.

You know how to fix this? Understand and adhere to limits.

The Domino Effect

Similar tragedies play out time and again when people try to rescue companions. A teen jumps from a dangerous waterfall and disappears; his buddies follow, one after the other, until they all drown. A firefighter goes into a burning building to rescue a comrade; another goes in after him, then another.

In each case, the domino effect results from a deep-seated emotion: the need to help others. Altruism offers an evolutionary advantage but can compel us to throw our lives away for little purpose. “In stressful situations, you see a failure in the working memory, which is involved in inhibiting impulses,” says Sian Beilock, a psychology professor at the University of Chicago. “People lose the ability to think about the long-term consequences of their actions.”

You see this in software development in many cases: when something’s going badly, management throws more people at it, and the bigger team does the same thing badly, only less efficiently (see Brooks’s Law). Or a company chases features that other companies put into their software, or chases some small thing that competitors or the industry press are following.

You see this, too, when too much effort is put into one channel of testing: a team pours its energy into automated testing and keeps pursuing that goal to the exclusion of other, more relevant exploratory testing.

To keep from being another domino to fall, the article warns against leaping instinctively into the manure machinery (truly, that’s the example it gives); instead, take a step back and consider alternative actions if you see cascading failure before you. The saying attributed to Einstein, that insanity is doing the same thing over and over again and expecting different results, applies. So does your mother’s “If everyone else jumped off the bridge, would you jump off the bridge, too?”

When you see something fail, don’t just try it again. Try it again, differently.

Situational Blindness

As GPS units and satellite navigation apps have flourished over the past few years, there’s been a spate of similar cases, in which travelers follow their devices blindly and wind up getting badly lost. In each case, the underlying mistake is not merely technological but perceptual: the failure to remain aware of one’s environment, what aviation psychologists call situational awareness, or SA. People have always had difficulties maintaining SA, psychologists say, but the proliferation of electronics, and our blind faith that it will keep us safe, has led to an epidemic of absentmindedness.

“A big element in SA is paying attention to cues,” says Jason Kring, president of The Society for Human Performance in Extreme Environments. “If you’re focusing just on that GPS unit, and you see that little icon moving down the road, and say to yourself, OK, I know where I am, technically, that can be a big problem, because you’re not looking at the world passing by your windshield.”

Full situational awareness requires incorporating outside information into a model of your environment, and using that model to predict how the situation might change.

You can lose situational awareness in many ways in software development and testing. You can narrow your focus only to the problems in your niche of the process, your set of features, or what have you. You focus on the trees, and you can’t see the Christmas tree lot.

So to combat this, you need to pay attention to higher-level considerations. You need to know as much as you can about the industry, the customers, and your organization, and to factor that knowledge into your decisions about how to proceed. If you know what your customers (that is, your users) really need and really do over the course of the day, you can weight your testing to cover those needs more than the features your business analyst forced onto the software because your customer’s VP, hired from outside the industry, wanted them.

Double or Nothing

The runaway-balloon problem is a manifestation of our irrational assessment of risks and rewards. As Daniel Kahneman and Amos Tversky first pointed out back in 1979, we tend to avoid risk when contemplating potential gains but seek risk to avoid losses. For instance, if you offer people a choice between a certain loss of $1,000 and a 50-50 chance of losing $2,500, the majority will opt for the riskier option, to avoid a definite financial hit. From the perspective of someone dangling 20 feet in the air, the gamble that they might be able to ride the gondola safely back down to the ground seems preferable to a guaranteed pair of broken legs. But in the moment, they can’t factor in the price they’ll pay if they lose.

We see this in software development when someone takes a flier because he or she thinks there’s nothing to lose. A developer doesn’t like the way the code works or runs into some difficulty with it, so he spends all weekend rewriting several weeks’ worth of several developers’ work before the Monday client meeting instead of just putting a little rouge on the hog and acting on the insight gleaned at the client meeting. Or a company rushes headlong into a new technology or a new cloud strategy over the reluctance of its clients. And so on.

To combat this risky frame of mind, you’ve got to, again, establish limits and procedures to make sure that people know when they can take risks on their own–and when they can’t.

Bending the Map

Such errors of overconfidence are due to a phenomenon psychologists call confirmation bias. “When trying to solve a problem or troubleshoot a problem, we get fixated on a specific option or hypothesis,” explains Kring, “and ignore contradictory evidence and other information that could help us make a better decision.”

Confirmation bias is very common in our field. Technological fanboys think that their favored language or framework is the best and that all evidence points in that direction. Entire sales, marketing, and company-magazine teams exist just to provide this confirmation bias about how well your organization is doing and how everything is going swimmingly.

To combat confirmation bias, you’ve got to learn to distrust yourself. You’re not that smart. The way you think something is going isn’t the way it is going. Well, maybe it is. You’re not even lucky enough to get it wrong 100% of the time. When you encounter a new piece of evidence or information, don’t just try to fit it into your gestalt or dismiss it. You’d better figure out why that little strange thing happened, and what it can mean for your whole endeavor.

(Link seen in Reader’s Digest, where the traps are given in a different order, perhaps to better track Internet content thieves. And, yes, I do read Reader’s Digest. I am an old man.)
