- Superforecasting: The Art & Science of Prediction
Transcript
Welcome to the latest episode of Book Insights from Mind Tools. I'm Frank Bonacquisti.
In today's podcast, lasting around 15 minutes, we're looking at "Superforecasting: The Art & Science of Prediction," by Philip Tetlock and Dan Gardner.
Forecasting is an everyday part of life – at home, at work and in society at large. In business, we try to predict market trends, client behavior or our competitors' next steps. In politics, governments try to predict the impact of their policies. And in our personal lives, we try to forecast house prices, the success of a future marriage, or how long it'll take to drive across town.
Companies make forecasts about us, too. Amazon and Google predict what we might want to buy or read, based on our shopping habits or web-browsing history. Then there's the weather. If forecasters didn't advise us when to take an umbrella, we'd get wet a lot more often.
Some predictions involve incredibly high stakes. Good forecasting in business could make the difference between financial success and bankruptcy. In foreign policy, it could be the deciding factor between war and peace. At home, bad forecasting could put our savings at risk, while a sound prediction could lead to a windfall. And in meteorology, accurate forecasting can help people prepare for typhoons and tornadoes, while poor forecasts can put lives in danger.
So since forecasting is so vital and the stakes are so high, is there a way we can get better at it? Can we improve our ability to predict the future, so we come up with the right strategy, launch our product at the right time, buy or sell the fastest-moving shares, and make better decisions at a political and a personal level?
This book says we can. "Superforecasting" explores what it takes to make more accurate predictions and become better decision makers. It draws on decades of research and on a large forecasting competition funded by the U.S. government to find out how top forecasters make their predictions. It shows how we can learn from them and apply their methods in our own lives.
So who's this book for? "Superforecasting" is for anyone who wants to get better at predicting the future – so that's a large audience. We think those involved in politics or international affairs, and leaders of national and multinational businesses, will get the most from this book. But it's relevant to anyone who has an interest in forecasting and the decision-making process. It's also for anyone who has an interest in politics, history or global affairs, and the decisions that have shaped our world.
But you do need to be comfortable with math, statistics and probability to enjoy this book. There are lots of numbers and acronyms to digest, as well as some graphs and charts. That said, the authors do their best to make what can be complex theories accessible to a wide audience, throwing in plenty of fascinating anecdotes and case studies, from the Cuban missile crisis and the search for Osama bin Laden to Scotland's referendum on independence.
Philip Tetlock is a social and political scientist and the co-leader of a multi-year forecasting study called the Good Judgment Project. He's the Annenberg University Professor at the University of Pennsylvania and works in its psychology and political science departments, and at the Wharton School of Business. He's also the author of "Expert Political Judgment" and co-author of "Counterfactual Thought Experiments in World Politics."
Dan Gardner is a journalist and the author of "Risk: The Science and Politics of Fear" and "Future Babble: Why Pundits Are Hedgehogs and Foxes Know Best."
So keep listening to hear how to become a better forecaster – by dividing big problems into smaller ones, by updating your predictions when new information comes in, and by tapping into the collective wisdom of teams.
As you heard earlier, "Superforecasting" draws on decades of research into forecasting and the qualities and approaches shared by those who are skilled at predicting the future. The book starts with an explanation of this research, which is a necessary, if slow, beginning. We'll take a brief look at this first.
Tetlock has been involved in two large research projects. The first, called Expert Political Judgment, ran from 1984 to 2004. It asked a group of academics and pundits to make thousands of predictions about current affairs, from wars to elections to share prices.
The second began in 2011, when Tetlock and his wife, Barbara Mellers, asked volunteers from all walks of life to forecast the future, as part of what they called the Good Judgment Project. As a result, thousands of laypeople – from filmmakers to retired ballroom dancers – wrestled with complex global questions, like whether the price of gold would plummet or whether war would break out on the Korean peninsula.
The Good Judgment Project participants formed one of five teams taking part in a four-year forecasting tournament, set up by IARPA, the research agency of the U.S. intelligence community, to improve the quality of intelligence-based forecasting and understand what works and what doesn't. The Good Judgment Project outperformed all the other teams, including professional intelligence analysts.
How did Tetlock's group do it? Were the team members more intelligent than average or were they whizzes at probability and math? Well, they were intelligent and they were comfortable with numbers, but there were other reasons these "superforecasters" did so well. They used particular thinking styles and approaches to problems that enabled them to more accurately predict the future, and it's these techniques that are shared in this book.
One is an ability to break down a big question into lots of smaller problems. So let's take a look at this. Dividing a problem into many sub-problems was a skill practiced and taught by Enrico Fermi, an Italian American physicist who was a central figure in the invention of the atomic bomb.
Fermi once asked his students to estimate how many piano tuners there were in Chicago. They had two choices: they could make a very rough guess, or they could break the question into smaller parts, estimate each one, and combine them into a more accurate answer.
Fermi suggested asking what other information might be needed to answer the question. Knowing four facts would help: the number of pianos in Chicago, how often pianos are tuned each year, how long it takes to tune a piano, and how many hours the average piano tuner works each year.
But what if we don't have access to all that information? Well, we can split those four questions into what we know and what we don't know. We may have to guess at some of the answers, but we may be able to find the correct information for others. This approach also allows us to examine our guesswork more closely and challenge our assumptions, leading to more accurate estimates than a mere shot in the dark.
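To make that concrete, here's a minimal sketch of the piano-tuner estimate in Python. Every number in it is our own illustrative guess, not a figure from the book – the point is the decomposition, not the inputs.

```python
# Fermi estimate: how many piano tuners are there in Chicago?
# All inputs below are illustrative assumptions, not data.

population = 2_500_000               # rough population of Chicago
people_per_household = 2.5
households_with_piano = 1 / 20       # guess: 1 in 20 households owns a piano
tunings_per_piano_per_year = 1
hours_per_tuning = 2                 # including travel time
hours_per_tuner_per_year = 40 * 50   # a full-time working year

pianos = population / people_per_household * households_with_piano
tuning_hours_needed = pianos * tunings_per_piano_per_year * hours_per_tuning
tuners = tuning_hours_needed / hours_per_tuner_per_year

print(round(tuners))  # ~50 – the decomposition matters more than the digits
```

Notice that each input can now be questioned and refined on its own: if we later learn the true share of piano-owning households, we swap in one number and the whole estimate improves.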
"Superforecasters" divide problems into small questions as part of their natural way of thinking. They break issues down, look at things from different angles, research all the options, and challenge their assumptions. And this is one of the reasons they prove so successful in their predictions, the authors say.
We like this approach and can see how it could be useful in business and in our personal lives, helping us take a more systematic approach to questions, rather than jumping to conclusions or making rough guesses. This method also means huge, intractable problems appear less overwhelming, making them easier to tackle.
Let's now look at how "superforecasters" use new information to update their predictions frequently and increase their accuracy.
Most "superforecasters" stay on top of the news. They set up Google alerts for the questions they're working on – for example, "Syrian refugees" or "price of oil." They digest new information as it comes in, and they update their forecasts accordingly.
But these days, the news doesn't stop, and we're constantly bombarded by headlines. So the real skill lies in knowing what's relevant and what's not. It's about striking the right balance between "under-reacting" and "over-reacting" to a change in the landscape.
Why might we under-react? Sometimes, we're so invested in a certain answer or outcome that we're reluctant to change tack or admit our prediction might be wrong. If we've committed to a belief publicly, it's even harder to change. So the best forecasters are those who are not wedded to any idea or agenda, and who are humble enough to change their minds when new data proves them wrong.
On the other hand, we might over-react if we allow ourselves to be swayed by irrelevant information. So we need to find the middle way between over-committing to a belief and being so uncommitted to it that we're easily thrown off track or misled.
The key is to assess the worth of new information, without losing sight of the value of the existing data. The best way to do this is to update often and bit by bit, while also being prepared to shift our beliefs more dramatically when there's a reason to, the authors say.
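The formal counterpart of this bit-by-bit updating is Bayes' theorem, which the book discusses. Here's a minimal sketch, using a hypothetical election question and made-up probabilities, showing how one piece of evidence nudges a forecast without overturning it.

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of a hypothesis after one piece of evidence."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Hypothetical question: "Will the incumbent win?" We start at 60%.
p = 0.60

# A favorable poll arrives – evidence we'd expect to see 70% of the time
# if the incumbent wins, and 40% of the time if they lose (invented numbers).
p = bayes_update(p, likelihood_if_true=0.70, likelihood_if_false=0.40)

print(f"{p:.2f}")  # ~0.72 – updated, but not thrown off track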
Savvy forecasters also have a knack for picking up subtle clues or small shifts in the landscape, rather than waiting for the headlines on the evening news. They're detectives who are skilled at reading between the lines and interpreting nuances.
We like this idea of assessing the importance of new information and finding the right balance between under- and over-reaction, and we think it's a skill all business leaders and managers would want to cultivate.
The authors also make a great point about the importance of conducting post-mortems. "Superforecasters" always assess how well they do, so they can do better next time. They make a prediction, measure their results, review, and revise. That's another good lesson for managers.
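Keeping score can itself be made concrete. The Good Judgment Project graded forecasts with Brier scores, which average the squared gap between the probabilities you stated and what actually happened. Here's a minimal sketch using the common single-outcome form, where 0 is perfect and always answering 50 percent earns 0.25; the track record below is invented for illustration.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between forecast probabilities and outcomes.
    0 is perfect; always saying 50% scores 0.25 in this single-outcome form."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical track record: three yes/no questions, probability of "yes".
forecasts = [0.80, 0.30, 0.95]
outcomes = [1, 0, 1]  # what actually happened: yes, no, yes

print(round(brier_score(forecasts, outcomes), 3))  # 0.044 – lower is better
```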
Let's now look at what we can learn about team dynamics from the Good Judgment Project.
Teams can make terrible mistakes. And the authors look at the dangers of groupthink, in which a desire for harmony or conformity leads to bad decisions. As an example, they point to the botched invasion of the Bay of Pigs on the south coast of Cuba, ordered by the administration of President John F. Kennedy in 1961.
But if teams can avoid groupthink, they can be incredibly effective. The Good Judgment Project found that teams make more accurate predictions than individuals, provided those teams have diverse members and meet certain conditions. Teams need a shared purpose, a good culture of information sharing, a supportive atmosphere in which people can admit ignorance and ask for help, and clear ground rules that allow members to challenge one another's assertions.
Tetlock's team of "superforecasters" created this through frank discussion. At the beginning, they were dancing around one another, too scared to speak their minds. But after a while, everybody agreed to do away with excessive politeness and to welcome constructive criticism from teammates.
Team members also met face-to-face a few times, even though most of their interaction was online. This helped give them a greater sense of belonging and commitment to the group, which in turn meant they were willing to work harder and dig deeper in their forecasts. They helped one another raise their game, boosting the performance of the team as a whole.
Winning teams also have members who contribute more to others than they get in return – in other words, more givers than takers.
We like what Tetlock and Gardner say about team dynamics, and how they illustrate the pros and cons of working in groups with interesting case studies from politics and history.
The authors are clearly passionate about the potential impact of forecasting on politics, the economy, security, and global affairs, as well as on business and our individual lives, and the book ends with a call to arms. They want their work to contribute to a forecasting movement that bases its predictions on clear evidence, keeps score, analyzes its results, and learns from its mistakes. In this way, we could get better at avoiding economic crises, averting conflict, or preparing for terrorist attacks. And we could be more effective at fighting poverty or coming up with policies that help our communities.
Tetlock and Gardner do a great job of showing how forecasting has worked well, and not so well, across a variety of fields and throughout history. But we'd have liked more examples of how we can apply forecasting to our work and personal lives. We think this would make the book more accessible.
Another criticism is that "Superforecasting" gets off to a very slow start. We're about 100 pages into the book before we hear about the characteristics of "superforecasters," and the methods they use to make accurate predictions.
Some readers will find the authors' preamble interesting, but others will be itching for them to get to the point. This slow start means the book feels longer than it should be. And while some readers will enjoy all the numbers, statistics and acronyms, others might find them off-putting.
That said, the authors write with authority and include insightful interviews with high-level decision makers – from David Petraeus, former director of the CIA, to former U.S. Treasury Secretary Robert Rubin. They also do a good job of showing us that we don't need powerful computers, or the mind of a genius, to make good predictions. We just need to gather evidence from a wide variety of sources, test it, share it with others, keep track of our performance, and admit our mistakes.
We may not manage to become "superforecasters" like Tetlock's teams, but this book can help us make better, evidence-based decisions and predict more accurately what comes next.
"Superforecasting: The Art & Science of Prediction," by Philip Tetlock and Dan Gardner, is published by Crown Publishers, an imprint of the Crown Publishing Group, part of Penguin Random House.
That's the end of this episode of Book Insights. Thanks for listening.