Transcript
Rachel Salaman: Hello, I'm Rachel Salaman. We live in a golden age of information and expertise. But the wealth of data and opinion at our fingertips can be overwhelming, to the point where we let it take over. We can end up doing things that someone else – or even a computer – suggested, rather than choosing our own considered path. Is that a good idea?
Well, that's what we're talking about today with Harvard Professor Vikram Mansharamani, author of a new book called, "Think for Yourself: Restoring Common Sense in an Age of Experts and Artificial Intelligence."
Vikram joins me on the line from Lexington, Massachusetts. Hello, Vikram.
Vikram Mansharamani: Hello, Rachel. Thanks for having me.
Rachel Salaman: Thanks so much for joining us today. So, what prompted you to write this book?
Vikram Mansharamani: Well, it really started from a short piece that I wrote for the Harvard Business Review called "All Hail the Generalist." And what I heard, in terms of feedback from that piece, was that, in this land of specialists, we had found that the idea of "broad thinking" was being lost. And several of the readers of this piece came back to me and said, "Vikram, I really appreciate that you had suggested something different in this piece because we feel intellectually bullied by some of the experts that we interact with. We don't feel empowered to think for ourselves."
So that was one catalyst that inspired me to pursue writing this book. The other stems from a very unusual fact pattern that started with my first book, which was about financial bubbles.
And you might think that financial bubbles are very distinct and different, but the book I had written about them indicated that using multiple lenses to navigate the uncertainty of financial chaos would prove useful – that it was necessary to give the economist his place, but also to look to regulators, understand psychology, etc.
And what I found, which was really stunning, was I was giving a speech about financial bubbles and my framework for thinking about them at a large conference, and afterwards a person came up and said, "I'd love to keep in touch." I said, "Sure, here's my information." Six months later, that same person emailed me and said, "Vikram, I was diagnosed with having prostate cancer. And I used your framework for thinking about financial bubbles to help me navigate the medical uncertainties I faced. It was amazingly useful, empowering, it helped me think in a time where I was overwhelmed by expert and other opinions. Thank you so much."
And that was really empowering. So, those two data points came at me in a short period of time and led me to believe that there was something worth sharing in my way of thinking about the world and how to push back – or, more importantly, manage – expert involvement in one's life.
Rachel Salaman: Yes, it does sound like this topic needs a fair amount of nuance applied to it, because surely it can't be the case that all experts are wrong. Some people might be thinking, "Why would it be smart, rather than stupid, to trust our own reasoning when there are experts out there who can tell us what to do?"
Vikram Mansharamani: Sure. Yes, I think Rachel you're getting at the critical message that I am trying to convey here, which is really: it is equally problematic to dismiss experts entirely as it is to blindly outsource your thinking to them. You need a more nuanced, more thoughtful approach to how we manage experts and technologies in our decision-making processes.
And what it really comes down to is, experts live within silos and have better information within those silos than we likely ever will. However, by living in those silos and having that depth of knowledge, they miss the context or the breadth of perspective that is really critical to long-term success in decision making. And it's our responsibility to maintain knowledge and awareness of the context in which that decision is being made.
And so, I don't think experts are malicious or ill-intentioned. I think that they are structurally narrow and deep within their silos, and the nature of problems requires us to have an appreciation for the broad. So, what you said is exactly right: it requires a more nuanced approach. The way I describe it in the book is, we need to keep experts on tap, but not on top.
Rachel Salaman: Yes, that's a really neat way of putting it. Do you think this topic is more important now compared to say five or ten years ago?
Vikram Mansharamani: I do. But I don't think it has anything to do with the nature of experts, I think it has to do with the nature of the problems and the context in which we find ourselves, the world we're in. And that has to do with a bunch of different things.
Just think about the number of choices at your fingertips for virtually every decision you face. We have a plethora of options that have exploded onto our "decision-making plate," if you will. And so, I think the world just has a lot more data, a lot more information, a lot more choice, and as a result many of us seek to optimize our decisions.
There's the promise of perfection. And that leaves us with a constant, low-grade anxiety that we could make better choices. This fear of missing out on the perfect choice often sends us into the arms of experts and technologies – those who filter for us, those who manage our focus, those who can help us optimize our decisions toward that perfect choice.
Rachel Salaman: So, if we were to resist that lure of experts, is there anything we can do about all the choice that we face every day? How can we filter it?
Vikram Mansharamani: I think the first thing to remember is that filtering is the same thing as ignoring some options. So what we're really doing is controlling the spotlight – we shine a spotlight on specific spots when we focus or filter, and we just need to be cognizant of the fact that there's a lot of shadow around that spotlight. And sometimes the best choices may exist in the shadows.
But I think what you're getting at, Rachel, is that some decisions might be better served not by optimization strategies but by something where we "satisfice" – a term coined by Herbert Simon, the Nobel Prize-winning economist – meaning we just choose an option that's good enough.
Imagine selecting a movie. Do we really need a spreadsheet that looks at every actor and actress, their ratings, the Internet Movie Database, how it compares, how critics reviewed it? Or do we just jump on and watch something fun and frivolous, and if it wasn't perfect for our mood, well, so be it? Some choices don't need to be optimized.
So I think when we're drowning in information, one thing to ask is: what are the stakes of the decision? Does it really have long-term consequences that justify spending the energy, effort and cost, in the form of time and/or money, to get assistance in optimizing, or getting closer to an optimal decision?
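Herbert Simon's contrast between optimizing and satisficing can be sketched in a few lines of code. The movie list, ratings, and "good enough" threshold below are invented purely for illustration:

```python
# Optimizing scores every option before choosing; satisficing takes the
# first option that clears a "good enough" bar, trading a possibly better
# pick for a much cheaper search. Titles and ratings are made up.

movies = [("Slow Drama", 6.1), ("Fun Caper", 7.2),
          ("Epic Saga", 8.9), ("Quiet Gem", 8.4)]

def optimize(options):
    # Examine every option and pick the highest-rated one.
    return max(options, key=lambda m: m[1])

def satisfice(options, good_enough=7.0):
    # Stop at the first option that clears the threshold.
    for title, rating in options:
        if rating >= good_enough:
            return (title, rating)
    return None  # nothing met the bar

print(optimize(movies))   # ('Epic Saga', 8.9)
print(satisfice(movies))  # ('Fun Caper', 7.2)
```

The satisficer settles for "Fun Caper" without ever looking at the rest of the list, which is exactly the point when the stakes are low.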
Rachel Salaman: You mentioned earlier that in some circumstances, for example brain surgery, we absolutely should rely on experts. How can we know when we should seek out expert advice and when we should rely on our own common sense?
Vikram Mansharamani: There are two things I'll say here, Rachel. The first is: it really has to do with whether you know the problem, or you're just facing massive uncertainty. And there are different dynamics that can describe that.
I'll give you a personal example that might help illustrate this. I had some personal health issues, probably ten years ago, where I didn't know what was wrong with me, and so I went to a doctor. The doctor didn't know what was wrong with me. This was a general practitioner and so we did some tests and eventually I was told, "Go home and drink fluids." And, in fact as I say in the book, I don't think I've ever been to see a doctor and not been told to drink more fluids! But that's a separate point, it seems like a common refrain.
But eventually after several iterations, the doctor who admitted to not knowing said, "Vikram, let me send you to a specialist." And the alarm bells went off. And the alarm bells went off because I think that is exactly when we do not want to go towards an expert or a specialist. And that is when we don't know the problem – when we're still in the exploration stage of figuring out what we're trying to understand, then it seems inappropriate to go to a particular expert because the expert lives in a silo. And they will optimize a decision within that silo, but it may not be the right silo. So, in this particular case, I took a step back and I went and re-evaluated my approach to medicine and found a doctor that took a different approach, and we navigated that uncertainty successfully.
So, the point of that story is that we need to know whether we have a problem that we understand, or whether we're still seeking to understand what type of problem we have. Those are two different dynamics.
The other thing that this relates to is something that I talk about in the book called The Cynefin Framework, which is a framework developed by a couple of thoughtful consultants to describe the environment in which we are making decisions. And there are really four stages, or four environmental conditions, that they describe.
The first is a "simple environment." A simple environment is one in which there are obvious causes and obvious effects – you do this, that happens. Think about a credit card balance: it just needs some multiplication – your average daily balance times your interest rate, added to the balance, etc. These simple domains can easily be automated by technology and rules.
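The "simple domain" arithmetic described here can be sketched in a few lines. The function, balances, and 18 percent rate below are illustrative assumptions, not figures from the book:

```python
# Minimal sketch of the credit card calculation: interest is the average
# daily balance times a periodic rate, added back onto the balance. The
# monthly-rate simplification (APR / 12) is an assumption for illustration.

def monthly_interest(daily_balances, annual_rate):
    """Interest charge for one billing cycle (illustrative only)."""
    average_daily_balance = sum(daily_balances) / len(daily_balances)
    monthly_rate = annual_rate / 12
    return average_daily_balance * monthly_rate

balances = [1000.0] * 30                   # $1,000 owed all month
charge = monthly_interest(balances, 0.18)  # 18% APR
new_balance = balances[-1] + charge
print(round(charge, 2))       # 15.0
print(round(new_balance, 2))  # 1015.0
```

Because cause (balance, rate) and effect (charge) are fixed and obvious, this is exactly the kind of decision that rules and software handle well.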
But then you can go very quickly from simple to something they call "complicated." Complicated is a dynamic where the cause and effect are real and identifiable, but may not be obvious to people. For instance, most of medicine involves complicated environments – yes, there is something wrong, we just don't know what. Or imagine your car breaks down, so you turn to a specialist, a mechanic, who can diagnose the problem through multiple layers of knowledge to get to a root cause-and-effect relationship that is stable. So that's another domain where experts thrive.
But what happens when you go to the "complex"? And the complex is defined as an environment in which causes and effects are not stable, and [they're] constantly shifting and interacting. And here's an environment where seeing the big picture matters – you have to have a systems-level view of the problem in order to understand how the various parts interconnect, and how they affect each other.
So, what's a complex phenomenon? A financial bubble. That's one that I've spent a lot of time thinking about: there's psychology of lots of different people at work, there's regulators interacting based upon what happens from the people, investors act differently if there are large institutional investors where they're worried about client money versus individuals who might just be speculating. And so you have a whole bunch of interacting parts, and no one will be able to disentangle all of those into clear cause and effect relationships because they're all interacting simultaneously.
So that's a complex environment. And when we talk about a complex environment, that's where I think relying blindly on a single expert can mislead us. That's where what we may prefer to do is rely on multiple experts to triangulate, to use multiple lenses and multiple perspectives, because there is no correct answer, and we get probabilistic insight from each and every expert. So using multiple experts can help us there.
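Triangulating across multiple experts can be sketched as pooling their probabilistic judgments. The experts, the numbers, and the equal-weight average below are my illustrative assumptions, not a method from the book:

```python
# In a complex domain no single lens is "correct", so instead of trusting
# one expert we pool probabilistic estimates from several. Here each
# (hypothetical) expert gives a probability that a bubble will burst,
# and we combine them with a simple equal-weight average.

estimates = {
    "economist": 0.30,
    "psychologist": 0.55,
    "regulator": 0.40,
}

def triangulate(expert_probabilities):
    """Equal-weight average of expert probability estimates."""
    values = list(expert_probabilities.values())
    return sum(values) / len(values)

pooled = triangulate(estimates)
print(round(pooled, 2))  # 0.42
```

Equal weighting is the simplest pooling rule; in practice one might weight experts by track record, which is itself a judgment about the boundaries of their credibility.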
Rachel Salaman: How do we know how many points of view to seek out?
Vikram Mansharamani: That's a simple dynamic where I would suggest the answer is the more the better, but there is a cost to consulting each and every expert. I could go see a psychologist to talk about financial conditions, I could see a financial advisor, I could talk to a regulator, a congressman, my friends, my parents, younger students of mine, etc. There's a cost to seeking each perspective, so while I would suggest the more the better, obviously you're constrained by how much time you can spend on this and, depending upon the stakes of the decision, you want to cut it off at some point.
Rachel Salaman: Should we assume that some points of view deserve more weight in the decision making than others, or should we see them all as equal?
Vikram Mansharamani: No. Here is one thing that I find really interesting, and it happens to be part of this managing experts problem. We often put certain folks up on a pedestal. And I'm not giving you specific names, but I'll use an example here.
So, I was watching one of the Sunday morning news shows that take place here in the United States – these tend to feature a famous news host interviewing world-renowned experts on a host of topics, generally current affairs. One of them recently had a Nobel Prize-winning economist on, talking about the economic impact of the shutdown. But, of course, the conversation drifted to when we should open up, whether testing for the coronavirus is the key, whether antibody testing will matter, whether we need a vaccine, what the likelihood is, what the timeline is, etc.
Now, I would suggest to you that a Nobel Prize-winning economist is a very informed, likely deep and narrow expert. They probably have other interests and other knowledge bases, I'm not suggesting economics is all they know, but surely this person doesn't deserve the credibility on matters of public health that a doctor might. And so, I think oftentimes we confound the credibility due to a person in one domain, with the credibility we give them in other domains.
You're listening to Mind Tools Expert Interviews from Emerald Works.
Rachel Salaman: You mentioned "focus" earlier, and it plays quite a large part in your book. I wondered if you could elaborate a bit more on the perils of focus, which is an original way to look at it?
Vikram Mansharamani: There's both promise and peril when it comes to focus. Focus is, in theory, our means of overcoming distraction. In fact, Microsoft Word, the word-processing software, now has a focus mode to prevent you from seeing other icons on your computer screen, etc. So, focus is thought to be a really good thing.
However, we need to go back to my analogy of spotlights and shadows. Focus is where the spotlight has been shone, that's where the spotlight is shining. And yet, because of it, we are putting a lot in the shadows. Focus is equivalent to filtering, and filtering is equivalent to ignoring. And so, when we focus, we ignore. And we need to be mindful and aware of what we're ignoring.
So, my only point around focus is we sometimes give up our focus inadvertently, to experts and technologies. We outsource our thinking, we outsource our focus, and as a result we're not as mindful of where that spotlight is being shined. And when we're not aware where the spotlight is shining, we're not aware of the shadows that may contain our answers.
Rachel Salaman: A lot of this is about being engaged. So it's not just about not outsourcing our decision making and our thinking, but also not outsourcing our engagement with our own issues. And sometimes that takes effort – we just have to put more effort into these things.
Vikram Mansharamani: That's right. I think some of our problems here are that we become intellectually lazy, and we disengage. In fact, one of the chapters of the book is called "Mindfully Managed Focus," and it's intentionally designed to convey that explicit message: you need to take control of the thinking process. You have to manage it, it requires effort, so you have to be engaged.
Rachel Salaman: The trouble is it's not always easy to take charge in a meeting with an expert, just because of the way society has taught us to behave with experts. What are some comfortable ways to do that?
Vikram Mansharamani: One of the easiest ways I find is, at the beginning of a relationship with an expert, to clearly articulate what it is you're seeking from that relationship.
Now there may be times where you're in an existing relationship with an expert – your doctor, your financial advisor, or what have you – and it becomes more difficult because there is a power dynamic. This person, your lawyer, has the impressive diplomas on the walls, it's a big fancy office, you are made to wait for a while because their time is presumably more important than yours, etc. So, the whole cultural and physical experience lends itself to the belief that they are very important, and we are to be respectful patrons of their expertise.
And so, when we go at it that way, what I sometimes find helpful is to just terminate relationships. At the very beginning of my preface, I actually describe a process I've used, which is "fire, aim, and then re-hire." And I say maybe re-hire because sometimes, once we start thinking about these issues, we find we may not need an expert at all.
I'll give you an example. I was working with my accountant to file my very complicated taxes, and at one point I asked if they would run a scenario one way rather than the other, because I wanted to see what it meant. And they said, "Well, accelerated depreciation always makes sense." And I said, "Maybe, and maybe not. I'm thinking I might sell that next year, or this or that, I don't know. Can you please run this scenario for me?"
And there was some pushback. They said, "Look, we know what we're doing here. We're telling you what we need to get done, and so this is what we need to do. We're very pressured for time, we don't have time to do that." I said, "That's fine." And the next year I fired that accountant, and I went and found a new one. And when I found the new one, I started off the relationship saying, "Listen, I am going to require from my accountant some help on understanding various scenarios before we file our taxes, this is something that I am going to want." Having re-set that relationship at the get-go, it was far more productive for both of us, and I managed that expert relationship far more efficiently, far more effectively than I would have had I continued to battle with someone that thought they had more control over me.
So, sometimes it takes a total re-set. However, most of us have a doctor that we're working with, we have an accountant that we're working with, and the question then becomes how can one nudge that relationship towards your thinking process?
One of the most innocuous and most helpful ways to nudge is just to ask, "What if I didn't do that? What would happen?" So, you want me to take this statin – what would happen if I didn't? You want me to put some of my portfolio into bonds – what if I didn't? What are the risks? Help me understand the possible scenarios.
That is one way. The other way is to present a handful of scenarios to that expert in conversation and say, "Let's say instead of taking this statin that I decided I was going to go train for a marathon and get healthy, I'm going to work out a lot more. Would you still then recommend I take a statin? Or let's say I visited a nutritionist, and I was going to change my diet very dramatically, and I'm going to adhere to it, and I would lose a lot of weight because I changed my diet. Would that change your recommendation?" So, it's also just presenting a couple of scenarios because the expert likely is assuming that you're going to continue on your normal path, and therefore is trying to optimize on certain things going on in your life that may or may not be true. Again, they don't see the context.
Rachel Salaman: In fact, you focus on self-reliance in the last part of your book, and how education is really important in training people to value a broad perspective. Can you talk about that?
Vikram Mansharamani: Sure. What I'm getting at here is a focus on the context over the silo. And so, what I found is that a large portion of at least higher education – and, by the way, in some parts of the world even earlier education – tends to channel children towards specific domains.
This person is showing some propensity for science, so they're going to be a physicist or a doctor. That person shows that they don't particularly like that, so we're going to channel them into a different profession, etc. I find that very problematic, because as citizens of a complex world – citizens who face lots of different issues that we want our governments to engage with – it's important to be informed a little bit about all of them: to be able to understand which expert we should listen to and which we shouldn't, and to know where to tap into deep knowledge and where breadth of knowledge matters instead.
Because as we move forward in this world of complexity, there are experts generating lots and lots of dots. What becomes really valuable is understanding how to connect those dots, and most of the grand challenges – the big problems that we face in the world today – they cross these silos, they cross borders, they aren't defined by single disciplines. It requires more dot connecting, and that's why I think education should emphasize breadth of perspective, as well as developing some depth.
Rachel Salaman: And yet in this part of the book you also suggest that we celebrate ignorance. So what's the thinking behind that advice?
Vikram Mansharamani: What I'm suggesting there is oftentimes the way we learn, the way we make progress, is by adopting a beginner's perspective. You can argue ignorance is one way to describe it.
I have a couple of examples where I talk about someone who had no experience in a particular domain being really successful in that domain. And it has to do with the fact that a beginner doesn't come with the baggage of long tradition and focused effort in a domain, where people stop asking questions about basic assumptions. A beginner may ask very naïve questions; a beginner may ask about things that experts just take for granted.
Rachel Salaman: Confidence seems to be central to building self-reliance, especially in areas where experts dominate like law and medicine, the white-collar professions that we talked about earlier. What are some practical ways to build confidence in our own ability to weigh data and advice and then forge our own path?
Vikram Mansharamani: Yes, it's interesting even how you phrased the question, Rachel, because I do think confidence is a critical variable for us to pay attention to here. And one of the reasons why I like people to be broad, and to be a generalist in logic and approach, is because a generalist is less likely to be overconfident in a particular domain. Why is that?
When I'm a generalist, and I know a little bit about economics, a little bit about regulatory policy, a little bit about medical policy, a little bit about etc. – I know that there are others who know more than I do in those domains.
So, a generalist is one who is unlikely to dismiss new ideas that differ from their own. Someone with broad experience understands the contours of what they don't know. So that is important – not to be overconfident, so that you learn to tap into expertise when needed.
But the flip side is equally problematic: do you have too little confidence in interacting with experts? I think that is closer to what you were getting at with your question – practical ways to build confidence in our abilities, so that we can contextualize experts' insights and advice.
And here I think what's important is to understand the boundaries – the conditions under which the expert does have credibility and the conditions under which the expert might not be as well-grounded. And so, even just having a framework for interacting and managing that expert, is really critical.
It's not unreasonable for anyone to ask, "How do you know that? Why do you believe what you do?" In fact, I would argue that an expert who gets defensive when questions like these are asked is probably not an expert you want to work with. A true, thoughtful, contextualized expert giving advice should be able to answer some of these questions: how they came to the conclusions they have, why they think what they think, and what the constraints are. What does this not tell us? How can we know where this doesn't apply? Why do we think this is going to work for me?
So, I think asking lots of questions is one way to build your confidence and manage experts. But also, it's important to understand the boundaries in which that expertise could be useful, and where perhaps it's not as valid.
Rachel Salaman: Looking ahead now, how hopeful are you that people will become more willing and better able to make autonomous decisions and think for themselves?
Vikram Mansharamani: I am actually pretty optimistic in the grand scheme of things, and I think part of that optimism comes from times when there are real challenges, real crises – and, for better or worse, facing a global pandemic has elevated this topic.
Crises are great times of uncertainty, and uncertainty forces us to question who we're listening to, why, when, and what we do with their advice. Ultimately, it's times of crisis that lead us to think for ourselves.
So, I think I am optimistic that we'll re-think our relationship with experts, that we'll be better equipped and more mindful about our managing of experts, that we'll be more aware of where we allocate our scarce amount of attention and focus. And, hopefully, we'll be able to triangulate through multiple experts to make better decisions in the face of uncertainty.
Rachel Salaman: Vikram Mansharamani, thanks very much for joining us today.
Vikram Mansharamani: Thanks for having me!
Rachel Salaman: The name of Vikram's book again is, "Think for Yourself: Restoring Common Sense in an Age of Experts and Artificial Intelligence."
I'll be back in a few weeks with another Mind Tools Expert Interview from Emerald Works. Until then, goodbye.