
13 mistakes you make when exploring the future

I have been exploring the future, and helping others explore the future, since the late 1980s. Over that time I have made a direct, though unscholarly, study of the dos and don'ts of foresight, and I like to share insights from that study in this blog. I thought it was time to offer a one-stop place for the key thoughts on the pitfalls of foresight, so below I offer an "unlucky 13" list of mistakes you can make when exploring the future. If you'd like to explore this more deeply, below the unlucky 13 I offer a few links on cognitive bias, the basis of many of these mistakes and a science in itself. And please let me know what you think.

1. Presentism—interpreting the future, or future possibilities, with present-day attitudes and values. Imagine a view of 2010 from fifty or a hundred years ago. Could an African American be president? Certainly not: back then, people would have been sure that most Americans would never support that. So a view of today, from back then, probably wouldn't include an African American president. In fact, any one change happens along with all sorts of other change. As we think about a particular change, we have to remember that values, attitudes, and lots of other conditions in the future will also be different. Related post.

2. Over-optimism—Assuming that change will happen faster than it likely can. Usually this means a technological change that will, in fact, be held up by social forces, though the reverse is also possible. Personal or organizational interest in the expected change often drives over-optimism: you like it, so you assume everyone will like it and that it will happen. Technology enthusiasts are especially prone to this. They may look forward to a big technology advance and know why it is technically possible, but they are less likely to remember that we have to be ready as a society for the change too. Related post.

3. Over-pessimism—Assuming change will happen slowly or never. Our sense that "that will never happen" is rooted in experience: it is often hard to make a desired change. So we may assume that change won't happen, or will take a long time. But there are breakthroughs in technology and society all the time. We get slow change, punctuated by periods of sudden or rapid change. We can't let our pessimism and past experience blind us to the possibility that things could happen quickly.

4. Extremism—Mistaking a visionary idea for a likely one. An example is the Singularity, the assumption of a quantum change in society brought on by a rapid coming together of the effects of technology. Extreme and sudden change is rare, though not impossible. Even the big waves of change we've experienced, World War II, the Sixties, the Internet, the post-9/11 world, have unfolded in a society marked mainly by continuity. In other words, even significant change is part of the evolution of society. And just as the spring Paris fashions finally arrive at Macy's in muted form (somewhat bigger shoulder pads, rather than enormous ones), we should recognize that visions of what is possible will usually bring us less-pronounced change.

5. Underinterpretation. It's easy to come up short in assessing what a change might mean. Usually there is a conventional wisdom, or an obvious implication of a change; with a technological change, it is whatever the innovators first develop the technology to do. What's so much more interesting, and often has much more impact, is what else we use the innovation for. For example, the mobile phone was not invented as an alternative to cash, but in more and more places it has become the way shoppers pay. While we watched the spread of cellular telephones, we thought we were seeing a revolution in personal communication: people talking to people. It has meant much more than that.

6. Superlativitis. Use of "always" and, worse, "never." It's tempting to decide something is impossible, that it will never happen. It's also tempting to decide something is eternal, that it will always be true. Those absolute words are often clues to an overreach. I have a favorite quote from one of my favorite writers: "No one is ever going to sit down and read a novel on a twitchy little screen, ever." –E. Annie Proulx (1994). I can highly recommend her fiction writing, but I cannot recommend her futures thinking. Though she said this in the mid-1990s, well before the arrival of the Kindle, the Nook, and the iPad, it was clear even then that the nature of reading, and of the novel, was changing, and that we were likely to arrive at new ways of enjoying literature. Happily for ebook readers, despite her misgivings, Annie Proulx's work is available in ebook format.

7. Looking for an answer—Expecting specific predictions about the future. It would be great if we could simply predict the future, but we can't. At best we can narrow the range of things we have to consider possible and identify a set of alternative possibilities. We can even identify which parts of the unfolding future we might shape toward outcomes we want. But people crave answers, and the temptation to draw overly specific conclusions about the future, or to demand an overly specific answer, is powerful.

8. The tyranny of sunk investment—We have trouble, especially inside organizations, in seeing past the current systems we worked hard to put in place and pay for. It’s tough to let go of the thing you’ve put so much money or effort into. While often this is literally a sunk investment of money, it can also be a psychic investment—something you’ve given so much effort and attention or “love” to, it’s hard to let go. While we cling to such things, the world tends to change around us. This post offers some thoughts that relate to this sort of bias.

9. Parochialism—We instinctively use our own point of view to understand others. We live our lives inside our country, our culture, and our community. We bring one person's view to most things we do, and it's hard even to take the point of view of our own friend, child, or spouse, let alone understand someone on the other side of the world. But more and more, it is valuable, even essential, to try to understand the perspectives of people whose lives are far different from ours. See related posts here and here.

10. Force-fitting to categories. Exploring change, such as through environmental scanning, requires you to establish some "buckets": topics or categories you can use to collect information that suggests important forces and trends. A good practice is to use a set of open, broad categories such as STEEP (society, technology, economy, environment, politics). What we tend to do, however, is discover a few things of a more specific nature, and then continue to find things that reinforce those. For example, once you have "discovered" the importance of nanotechnology, you may decide to collect things on that topic, and you are sure to spot more about it as you read and research, and add to the category. It will gain in apparent importance, sometimes at the expense of another topic. You will see what you are set up to see, and could miss other information and insights. Related post.

11. Accepting the given framing: Peanut butter goes with jelly, and the way those two are linked makes it hard for us to think of peanut butter and something else. The way something is framed and understood can block our thinking about it in a new way. Particular stakeholders on an issue are the most vocal and most published, and in how they write and talk about a topic they set the issue and discussion agendas. Their framing can become the way everyone thinks and talks about the topic. The most politically clever know how to set and shape the issues agenda on purpose, to control the discussion. When this is happening, it is particularly important to step back from the discussion as it is being framed and argued, and look at the bigger picture. That may mean totally reframing how you look at it. That is where (1) breakthrough thoughts are likely to come from, and (2) you can move the discussion off its partisan and politicized basis. Related posts here on framing, here on the emotional aspects of framing, and here suggesting that we need to allow ourselves to separate, in our thinking, things that have long been connected: food from season, sex from reproduction, and so on.

12. Single-trend thinking—This can include jumping on a fad bandwagon, or being too interested in one specific force or trend in society. Our society and our world are complexes of forces, trends, shaping factors, players, and more. No change has a single cause. No one trend defines our future. To understand what's happening and get a sense of the directions of change, you have to look at a broad range of things, and avoid looking at everything through the single lens of one big factor or trend. For example, in too many businesses a commodity or product price is the factor that drowns out other factors that may ultimately matter more.

13. Asking the wrong question. It is common to look at change from a routine point of view, asking how the usual factors are changing. This is a way of monitoring conditions and regularly asking "how are we doing?" We all do it. A company that asks "how can we grow our market share in the next three quarters?" is asking the routine question it probably always asks. That question assumes conditions will remain the same, that no great change will unfold in the game space in which the company plays. But what if there is a big change, or a challenge to the core of the business? A disruptive technology? Imagine the owners of Tower Records planning their next few years of strategy while ignoring the new player in the music game: downloadable music sales and iTunes. Tower Records went bankrupt in 2006. Related post.

Lest you think I am claiming to have this all figured out, please understand that I know about each of these pitfalls because I have made these mistakes too. I continue to try to learn about what mistakes can happen and why, and welcome your thoughts on this.

There is much to be learned from a couple of rich areas of inquiry: heuristics and cognitive bias.

Cognitive bias

Most of the pitfalls I've described are simple foresight versions of cognitive biases. For a little more on that, and for some fun with a purpose, I recommend this cognitive bias song, created by a high school teacher for his AP Psychology students. The biases he explains in the song are nearly all potential pitfalls in the work of exploring the future. So, despite the outward silliness of this little ditty, you can learn from it.

Then, in a much more somber way, the Royal Society of Account Planning catalogs cognitive biases in this slideshow. It summarizes critical cognitive biases that affect interpretation and decision-making, and it offers a number of parallels to the pitfalls described above.

Finally, a book by Jerome Groopman, How Doctors Think (2007), taught me some clear lessons, from the context of how doctors assess the patients in front of them, that fit closely with our experience of interpreting the future. An earlier post here, A rock becomes a bear and a bear becomes a rock, explains a little about how Groopman's thoughts on medical heuristics relate to foresight.

Broken glasses image: functoruser, via Flickr, cc license
