Friday, February 6, 2009

Cognitive Errors: The Availability Heuristic and You

"Cognitive errors" is the umbrella term I use for biases in cognition, biases in memory, and fallacious logic. Cognitive errors are almost always interesting, and often illuminating. It's remarkable how often they are woven into our everyday understanding of the world, from subtly shaping our opinions to outright producing them.

Some of these cognitive slip-ups are harmless, some have horrible consequences, but most fall somewhere in between. There are two main reasons to consider them when discussing ethics (besides the fact that they make stories more interesting). First, cognitive errors affect how we act: if we are acting in order to help others, we want to be sure that what we are doing is actually the most helpful option. This is a cousin of the law of unintended consequences: good intentions do not guarantee good outcomes. The second reason, which overlaps somewhat with the first, is to temper our judgments of others: the intentions behind another person's actions and the effects of those actions can be complete opposites.

Our first cognitive error is the availability heuristic.

The availability heuristic is the error we make when we assume that because something is more mentally available, it is more likely. What makes something available, you might ask? Two major factors, which also influence each other, are how much attention a thing naturally draws (its salience) and how often we are exposed to it.

Consider car and plane crashes. Per mile traveled, a car is far more likely to kill us than a commercial flight, yet many people feel that flying is more dangerous. A plane crash is simply more salient: when we imagine a car crash, we picture a quick accident, but when we imagine a plane crash, we picture a long, uncontrolled fall with people panicking around us. Partly because of this salience, plane crashes also receive far more news coverage than car accidents, further increasing their availability. Since we rarely sit down and compute the actual difference in risk, we are easily fooled by what happens to be available.
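To make that comparison concrete, here is a minimal Python sketch of the base-rate calculation our intuition skips. The per-mile fatality rates below are illustrative placeholders of my own, not real statistics; substitute published figures from a transportation safety agency to run the comparison properly.

    # Illustrative per-billion-mile fatality rates -- placeholder values,
    # NOT real statistics. The structure of the comparison is the point.
    DEATHS_PER_BILLION_MILES = {
        "car": 7.0,
        "plane": 0.07,
    }

    def trip_risk(mode, miles):
        """Expected fatality risk for a single trip of the given length."""
        return DEATHS_PER_BILLION_MILES[mode] * miles / 1e9

    # Compare a 500-mile drive with a 500-mile flight.
    drive, fly = trip_risk("car", 500), trip_risk("plane", 500)
    print(f"drive: {drive:.1e}  fly: {fly:.1e}  ratio: {drive / fly:.0f}x")

With these placeholder rates the drive comes out a hundred times riskier. The particular numbers don't matter; what matters is that the ranking comes from arithmetic, not from whichever image is more vivid.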

There are many ways this relates to living a good life. If you wish to make a difference in the world, you may be fooled into over-concentrating effort in one direction when you could do far more good in another. A (fairly) recent example is the 2004 tsunami in southern Asia. It was covered widely in the media, it was an unusually salient event, and there was an enormous outpouring of support for disaster relief. Meanwhile, less salient and less-covered crises, such as the one in Darfur, needed more help and did not get it. The availability heuristic helps us fail when we try to help. (To their credit, relief groups like Doctors Without Borders capped what they would spend on tsunami relief so that donations could be directed where they were needed more.)

The heuristic also shapes how we decide what to do. Our naive sense of an action's outcomes is colored by whatever comes to mind most easily, so a few salient negative effects of a policy or decision can obscure much larger, but less obvious, positive effects.

The availability heuristic is always something to try to think around when making decisions or trying to understand the world.
