Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts
Transcript
Welcome to the latest episode of Book Insights from Mind Tools. I'm Cathy Faulkner.
In today's podcast, lasting around 15 minutes, we're looking at "Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts," by Carol Tavris and Elliot Aronson.
To err is human, so the saying goes. We all make mistakes. We get things wrong at home, at work, and in our relationships. We sometimes make poor choices and bad decisions. But the key question is, do we own up to our mistakes and face the consequences? Or do we "pull the wool over our own eyes" and convince ourselves we were right all along? Do we justify our words and actions, even if they cause harm, so that we feel better about ourselves?
It's natural to want to believe that we're smart, that we get things right, and that we're morally sound. We're wired that way. But this craving to feel and look good can have us heading down the wrong path, ignoring any evidence that tells us to switch course or turn back.
If you're struggling to recognize yourself in this description, perhaps it's easier to think of prominent political and business leaders throughout history who have made mistakes or done harm because they were too blind or stubborn to accept the facts – they found ways to justify their actions and to evade responsibility.
We're all prone to self-justification, whether our actions hit the headlines or go unnoticed, whether they cause mere discomfort or a lot of problems. But why is this, and how can we avoid falling into this trap? "Mistakes Were Made (But Not by Me)" has the answers.
This book provides a fascinating insight into human nature, explaining why people in all walks of life avoid owning up to their mistakes. It explores why our memory fails us, why we dig in our heels and refuse to change tack even when it makes sense, and why admitting we got it wrong is so hard. It offers strategies to help us overcome our blind spots and prejudices, increase self-awareness, and reduce the temptation to self-justify. The goal is to help us make better decisions, change course quickly when we get it wrong, and admit our mistakes so that we can preserve our relationships.
So, who's this book for? It's for anyone who wants to understand their behavior better, so they can make good choices and have better relationships, at work and at home. Those in leadership and management positions will especially benefit from this book, as their mistakes are more likely to affect others. It's packed with theory – there are reams of studies and research experiments – so it's not a light read. But this body of evidence gives it credibility. And there are plenty of anecdotes to liven it up, ranging from the entertaining to the shocking.
Authors Carol Tavris and Elliot Aronson are both experienced psychologists. Tavris works as a writer, teacher, and lecturer, educating the public and professionals about psychological science. Aronson is an award-winning professor whose research focuses on social influence. He tries to improve people's lives by prompting them to change their attitudes and behaviors. He's Professor Emeritus at the University of California, Santa Cruz.
So, keep listening to hear what's behind our tendency to self-justify, why we can't always trust our memory, and why it pays to admit it when we've got something wrong.
"Mistakes Were Made (But Not by Me)" first came out in 2007. This second edition includes new research, fresh anecdotes from professions such as dentistry and nutrition, and examples of sectors that are taking steps to correct mistakes or end the working practices that were causing harm. These include criminal justice, health, and science.
There's also a new final chapter that explores the opposite of self-justification – when people can't turn a blind eye to their mistakes, and suffer remorse, guilt, and sleepless nights for years. This conclusion looks for a middle ground between blind self-justification and endless self-flagellation.
Before we go any further, let's take a closer look at this idea of self-justification. The authors pick out U.S. President George W. Bush as a prime example of a leader who chose a course of action based on a false assumption, ignored the evidence that proved him wrong, and then continued to justify his actions, refusing to admit his mistakes. They're talking about the invasion of Iraq in 2003, which was initially based on the claim that Saddam Hussein had weapons of mass destruction. None were ever found.
Bush also made other questionable claims during the Iraq war: he said the conflict would be over quickly, and that Saddam was linked to al-Qaeda. These assertions were discredited, but instead of admitting he might have got it wrong, he came up with new reasons for going to war. He argued he was getting rid of a "bad guy," fighting terrorists, and making America safer.
Now, your opinion on Bush and the Iraq war may differ depending on your politics, but the authors' point is that some leaders hold fast to a viewpoint despite mounting evidence to the contrary, often with damaging consequences. And they offer plenty of similar examples from history.
It's easy to point the finger at people in the public eye or in positions of power, whose mistakes are writ large. But what about ourselves? Some of us stay in relationships long past their sell-by dates, just because we've spent so much time trying to make them work. We remain in jobs that make us unhappy, focusing on our pension or job security in a way that blinds us to the possible benefits of moving on. Or we throw good money after bad, too proud to admit we made an unwise investment in the first place.
Self-justification is different from lying – it allows us to convince ourselves that what we did was the best thing we could have done. According to the authors, we justify ourselves because of "cognitive dissonance," so let's take a closer look at this process.
When we hold two ideas, beliefs, or opinions that are inconsistent or contradictory, we end up in a state of tension that's unsettling. Social psychologist Leon Festinger coined a term for this – cognitive dissonance – more than 50 years ago. He infiltrated a group of people who believed the world would end in December 1954. What would happen when the prophecy didn't come true? Would they lose their faith in their leader, Marian Keech?
Festinger predicted that Keech's most fervent followers – those who'd sold their homes and given away their possessions – would increase their faith in Keech even if the world didn't end. He was right. When her prophecy failed to materialize, Keech claimed God had spared the world because of her followers' impressive faith.
They believed her more than ever, and went out to evangelize about her incredible powers. They'd found a way to justify their decision to trust her, despite the fact she'd got it wrong. This was more palatable than admitting they'd been foolish to believe her in the first place.
Most of us can't rest easy until we find a way to resolve the discomfort – or cognitive dissonance – we feel when our self-image is out of alignment with our actions or reality. The authors use a simple example of a smoker who holds two contradictory attitudes. One is: "Smoking is foolish because it can kill me." And the other is: "I smoke two packs of cigarettes a day." The easiest way for a smoker to reduce the dissonance is to quit smoking.
But what if they tried to quit and failed? Now the smoker must convince themselves that smoking isn't that harmful, that it aids relaxation, helps prevent weight gain, and so on. Many smokers find ingenious ways to reduce dissonance and ease their discomfort, but really, they're deluding themselves.
Cognitive dissonance affects many areas of our lives, especially how we process information. If we come across new data that confirms our attitudes or beliefs, we welcome it and completely trust its accuracy. But if it goes against what we think, we'll shun it and find fault with it. This tendency is known as "confirmation bias."
Tavris and Aronson include a number of psychological experiments that show how cognitive dissonance impacts people's lives and decisions, as well as case studies from history and politics. Their goal is to raise awareness of how our innate desire to be right and consistent can lead us to justify courses of action we should have abandoned long ago – and to turn a blind eye to the facts.
The authors do a great job of explaining the perils of wanting to close the gap between who we think we are and what we've done. We imagine you'll come away from this chapter pondering some of your own decisions, and in the future you might be more willing to accept you've got it wrong.
Let's now look at the tricks memory can play on us.
Our memory can be incredibly powerful, detailed, and accurate. But it can also be distorted, and it can fail us entirely when it's convenient. When we want to be right, increase our self-esteem, rationalize bad decisions, or explain actions that aren't consistent with our self-image, our memory can help us out. We can confuse an event that happened to someone else with something that happened to us, or we can believe that we remember something that never actually happened.
Tavris and Aronson offer a catalog of anecdotes from people who say they suffered horrific or outlandish experiences – surviving the Holocaust, being abducted by aliens, enduring sexual trauma. But these experiences never happened to them. Why would someone make up something so harrowing?
People do this because it helps them make sense of their lives. It also enables them to avoid responsibility for their current circumstances. It allows them to resolve the uncomfortable dissonance between the belief that they're clever and capable, and the truth that their lives haven't quite turned out as they'd hoped. It lets them explain away problems and blame someone or something else.
These stories of distorted memories are fascinating and, in some cases, saddening and shocking. But what can we take from them? One lesson is to be aware that our memories are prone to distortion, and to check that we're remembering things correctly. Another is to be mindful of using potentially unreliable memories to avoid taking responsibility for our lives.
Let's now look at the benefits of admitting we got something wrong, even if it feels uncomfortable to do so.
Can you think of a time when someone you know confessed to making a mistake – perhaps a boss, colleague, friend, or someone in the public eye? How did you feel toward them? Did their admission of fallibility dent or increase your respect for them?
When we admit we messed up, rather than try to cover things up with excuses, we're often rewarded for our humility. There might be some people who are upset at first, but generally, others will like us more. We've all heard politicians or business leaders give half-hearted apologies that, in essence, pass the buck, and we've all longed for them to tell the whole truth.
It's also personally liberating to admit our mistakes, because hiding them can cause stress. By owning up, we're released from the fear of being found out, or the worry that one small mistake will snowball into something much bigger if we cover it up.
The authors suggest creating a culture, at work or in our home lives, where mistakes are accepted, and the truth is welcomed. If we're in leadership roles, they suggest welcoming criticism and even surrounding ourselves with naysayers – people who can challenge our prejudices and shine a light on our blind spots.
"Mistakes Were Made (But Not by Me)" is a warning to us all to be aware of our natural tendency to justify our errors to preserve our self-image. If we can take its lessons on board, we can make better, more conscious choices and avoid knee-jerk decisions. We can also be open to contradictory evidence and be willing to change our minds and admit we got it wrong before it's too late. In today's risk-averse culture, where saving face can trump all else, it's a valuable message.
We like this central argument, but the book does have its downsides. The authors take more than 300 pages to explore what is – in essence – one big idea. Self-justification is a fascinating topic, but the writing could have been more succinct.
Some readers will love the huge volume of studies, experiments, and stories, but we think some of them could have been left out. The book also focuses on American culture, history, and current affairs. More anecdotes from other parts of the world would have made for a more rounded read.
That said, the writers cover an enormous range of professions and life circumstances – from criminal justice and the law, to the medical profession, psychotherapy, conflict, and war. There's also a chapter on self-justification in marriage, which has useful tips for all personal relationships.
On balance, "Mistakes Were Made (But Not by Me)" is a real eye-opener. It encourages us to become aware of the tricks our mind and memory can play on us, and it challenges us to get honest with ourselves and with others. It shows us how to run our businesses and our lives mindfully, and with humility. We especially recommend it to politicians, business leaders, and managers, whose decisions have a big impact on others.
"Mistakes Were Made (But Not by Me)" by Carol Tavris and Elliot Aronson is published by Mariner Books and Pinter & Martin Ltd.
That's the end of this episode of Book Insights. Thanks for listening.