
A rock becomes a bear, and a bear becomes a rock

[Photo: Black bear, Alaska]

Linda Lieberman is a U.S. National Park Service ranger with years of experience interpreting nature for National Park visitors. This summer, we met her in Glacier Bay National Park, Alaska, and she led us on a glacier cruise where we saw lots of wildlife, including a grizzly bear and her cubs foraging on the rocky beaches for mussels and other delicacies.

Linda told us that she’s learned from park visitors over the years that the rock you are staring at can become a bear, and perhaps just as often, the bear you’ve spotted becomes a rock.

Humans are superbly evolved for pattern recognition, but we mix into that skill a tendency to see what we expect to see, and sometimes what we want to see. When my family and I are park tourists, we want to see bears—we have years of experience with rocks becoming bears and bears becoming rocks.
Jerome Groopman is a physician and writer who has made a study of how doctors make diagnoses and decisions, exploring his own clinical experience and the literature of heuristics. Heuristics is a complicated word for how we figure things out, including the patterns and "rules of thumb" we use.
In his book How Doctors Think (2007), Groopman examined our pattern mis-recognition tendencies in the context of doctors making diagnoses. In their frenetic world, doctors have to make quick diagnoses, and they often get it wrong.
Here are the main ways Groopman says doctors go wrong, each paired with the parallel errors we can make in exploring the future:

Representativeness—Letting what is most typically true influence what you see and don’t see, e.g. assuming the symptoms commonly associated with a problem will always be there

In futures:

  • Assuming a different culture, group, organization, or person will do what we would in the same circumstances
  • Missing discontinuities: clues to potential sudden shifts and breaks in patterns, hidden from us by what seems like the much clearer evidence of familiar patterns
  • Assuming a change you’ve observed is representative of what’s going on, because it’s part of your life or experience, e.g. “everybody is composting household food waste” 

Availability—Being influenced by the patients recently seen, e.g. the last five cases had the flu, so this one must have it too

In futures:

  • Assuming that history repeats itself
  • Using a narrow worldview to interpret the unknown—“people won’t like that”
  • “We tried that already”—something I think is similar to this idea didn’t work before, so it won’t work this time
  • Straight-line trend extrapolation—the trend will continue because of the past record, with no sharp breaks, changes of direction, and so on
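
The last bullet can be made concrete with a toy sketch (my illustration, not from the post): fit a line to a smooth historical series and project it forward, then compare that projection against what happens when the pattern breaks. The data, the break at year 10, and the `linear_fit` helper are all hypothetical.

```python
# Toy illustration of straight-line trend extrapolation: the fitted line
# faithfully describes the past, yet says nothing about discontinuities.

def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Ten years of steady growth in some hypothetical indicator...
history = [(year, 100 + 5 * year) for year in range(10)]
a, b = linear_fit([x for x, _ in history], [y for _, y in history])

# The extrapolation says year 15 looks like more of the same.
projected = a + b * 15

# ...but a discontinuity (say, a market break at year 10) halves the level.
actual = (100 + 5 * 10) / 2

print(projected, actual)  # the projection badly overshoots
```

The point is not the arithmetic but the habit: the fit is perfect on the historical record, and the record is exactly what cannot warn you about the break.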

Affective error—Making decisions based on what the doctor wishes were true, often colored by an emotional response, e.g. this patient reminds me of me, or this patient is a young, healthy guy and can’t be all that sick

In futures:

  • The thing I know and care about (e.g., a change in my own lifestyle) is typical and represents the bigger-picture future
  • Wishful thinking: everything will work out OK
  • Our product/technology is the future—because we love it, it must be the best fit for the situation and the best choice for the future
Jerome Groopman's How Doctors Think
Doctors, at least in U.S. clinical settings, and more legitimately on the battlefield and in emergency rooms anywhere, have to make decisions very, very quickly. People engaged in foresight have, and ought to take, the luxury of time to really think about things, catch themselves in these thinking errors, and, if possible, get closer to the truth. I wrote about our speed of decision-making in a closely parallel post, Don’t blink.
In exploring the future, we’re prone to each of these sources of error. Perhaps no one’s life is immediately at stake, but the cost of error can still be high, especially in missed opportunities, and in mis-timing an action that counts on a change in the marketplace.
We need to keep reminding ourselves of our human capacity for misinterpreting what we’re seeing, assuming continuity, finding what we’re looking for, and, especially, for wishful thinking.
  • Adam Gordon September 9, 2008, 5:11 am

    Yes indeed. Once again a bull’s-eye, thanks. Methinks foresight work has lots more work to do in this area (or in bringing in work done in other fields). So far futures thinkers have mostly left it at the nebulous level of “paradigm” or “assumptions.” But in fact the entire spread of perceptual-framing factors (biases) is relevant to getting to better futures thinking. I address a list of these in “Future Savvy” [Amacom Press, 2008] in the context of how to assess and evaluate forecasts. See http://www.futuresavvy.net Btw, Jerome Groopman’s (and others’?) struggles with medical perception/diagnosis have been a regular theme in The New Yorker – at least that’s where I’ve been following it. -Adam
