Author Nuala Walsh, CEO and founder of MindEquity, a business and behavioural science consultancy in the UK, discusses the challenges of being a thoughtful leader in “the daily din of distraction and disinformation” – and why decision friction is a necessary skill.
Walsh’s book, Tune In: How to Make Smarter Decisions in a Noisy World, suggests numerous strategies that leaders can use to analyse information effectively and holistically, as well as methods to recognise biases and misjudgements that could hold them back.
What you’ll learn from this episode:
- What it means to tune in and tune out.
- Some reasons we fall into misinformation traps.
- Strategies we can use to recognise our biases.
- How decision friction can help us more accurately decode the information we hear.
- The benefits of using emotional intelligence to interpret clouded judgements.
Play the episode below or read the edited transcript:
To comment on this episode or to suggest an idea for another episode, contact Steph Brown at Stephanie.Brown@aicpa-cima.com.
Transcript
Steph Brown: Welcome to the FM podcast. I’m Steph Brown. On today’s episode, I’m joined by Nuala Walsh, founder and CEO of MindEquity and author of Tune In: How to Make Smarter Decisions in a Noisy World.
We’re going to discuss some of the strategies introduced in the book. We’ll also explore how bias gets in the way for leaders, the difference between making emotional decisions and using emotional intelligence in decisions, and how to recognise our own biases. Here’s the conversation with Nuala Walsh.
Just to start off, what does it mean to tune in?
Nuala Walsh: Tuning in, Stephanie, is in fact the solution to the problem of tuning out, which is triggered by the challenge of deciding in a modern, noisy world. Maybe let’s just start there. The most important determinant of judgement is the context in which you decide. That’s a combination of your internal mindset and your external environment. Back in 1957, a psychologist called Herbert Simon introduced the idea of “bounded rationality”. He noted that our perspective is bounded by our experience, our background, and our social circles. That’s probably not a surprise. But today, the noisy world we live in is bounding our rationality even further and making it much harder for us to hear the information that really matters, because of data overload, disinformation, and distraction.
Then if you add speed and inattention, you have a really toxic mix when it comes to making decisions. Because of this decision ecosystem, even the smartest leaders and professionals are at greater risk of tuning out the important voices and rushing to misjudgement. How does this show up? Well, tuning out occurs when we take things at face value and misinterpret information rather than reinterpret it. Of course, it’s amplified under pressure and uncertainty. It explains a number of things: why so many really clever professionals get duped, why regulators miss Ponzi schemes, and why boards continue to fail. So, as you say, tuning in is the solution to tuning out, and it is the first step to hearing what really matters and making better decisions.
Typically, we tend to tune in to the voices with which we feel in tune, and we tune out the voices with which we feel out of tune. When we truly tune in, we decode rather than merely receive information, and we’re more likely to reflect on and reappraise the messages, the information, the data, the presentations – the facts that we hear. Of course, the challenge is that smart people think they decide really well; they think they’re great decision-makers. In actual fact, most people are not. There’s enough evidence to show that, whether it’s scams, school shootings, rising crime, failed M&A, or failed business ventures. There’s a whole list. Unfortunately, the evidence suggests that people are tuning out and that we aren’t as tuned in as we could or should be.
Brown: That’s great. Thank you. Another word I want to ask you about from the book is “perimeters”. What are those, and how do they affect our ability to assess data and make decisions?
Walsh: Well, in the book, as you rightly suggest, I identify 10 intangible factors that can bias our perspective. They form the mnemonic, which is PERIMETERS. I’ll very briefly just go through each one of those because they do make a difference in terms of how we make decisions.
The first one is P for power; the misjudgement trap occurs when we disproportionately elevate the voice of authority, an expert, or an idol, and sublimate our own. The next one is E for ego; that occurs when you overvalue or are overcommitted to your own ideas above anybody else’s.
R is for risk: when we recklessly pursue sensation-seeking or are intolerant of doubt and uncertainty. I is for identity: when we indulge the insatiable craving people have to impress groups, particularly their ingroups, with a curated self-image. M is for memory: when we overtrust our memory and our ability to accurately recall data, conversations, and events. E is for ethics, which is self-explanatory; the misjudgement trap is when our capacity to resist temptation and do the right thing is compromised. T is for time: the misjudgement trap there occurs when you tend to overweight the past, the present, or the future, which distorts the best course of action you can take. E is for emotion: when we mismanage our aptitude to stay calm and regulate impulse, instinct, or excess. R is for relationships, and that applies to the crowd: when we’re predisposed to follow the crowd or delegate decisions rather than think objectively, second-guess, and re-explore the information we’re hearing. The final perimeter misjudgement trap is S for stories. Again, that’s probably self-explanatory. That trap occurs when we readily accept any social, political, or organisational narrative as fact.
Although these feel really obvious, the fact is that we succumb to them more than we may think. I chose the word “perimeters” on purpose because it reflects our tendency to think in a limited or bounded manner. Each of the traps contains a number of different biases and fallacies – 75 in total.
The evidence is there. To give you a recent example, Liz Truss, the former British prime minister, tuned out the voice of the people, her party, the chancellor, and even the IMF. She lasted 44 days in office. That was a combination of a number of these different biases, because there’s a force multiplier effect: in any given situation some of these traps may apply, but usually it’s a combination. In that case, it could have been power, ego, identity, or emotion, for example.
You can take any example and several of these traps will play out in everyday workplace situations, too, whether you’re hiring someone, negotiating, or investing. They’re all explained by very well-established theories from scholars, scientists, and psychologists. The good news is that if you are aware of these factors and actively manage them, each can be a source of advantage. If you don’t, each becomes a potential bias-activating judgement trap, because each is a predictable source of misinformation and can lead us astray.
As I mentioned earlier, the evidence is there that this is happening more and more. People are being tuned out. People feel unheard. People are responding, whether through activism, polarised groups, or binary thinking. We see it even now with the students in the US protesting because they feel they are not being heard. They feel tuned out, and this is a real issue right across the piece in all societies – at an individual level, an organisational level, and a societal level. It is a real threat not only to business sustainability but to democracy and people’s overall wellbeing.
My overall argument is really that people feel unheard – less heard than ever. But equally, we do not hear as much as we have in the past, because of this noisy environment, and that is an underestimated source of misinformation. The World Economic Forum ranks misinformation as the number one global risk. What we tend not to do is think about the softer, psychological factors, which are just as lethal a source of misinformation.
Brown: That’s great. Thank you. Just [from] listening to that, do you think that we are predisposed to falling into conditioned narratives, even when they go against our own values and our own perspectives on things? If so, how can we build an awareness that we are conditioned to latch onto these narratives even when they don’t serve us?
Walsh: You are absolutely right. We are completely conditioned to accept narratives. We accept the story we want to hear, and we turn a deaf ear to the rest. In that respect, yes, we very much tune out what really matters, the most important voices, and the stories that we should be listening to rather than the ones that we just automatically hear.
Again, that does go back to the fact that it is a noisy, overwhelming, deafening world in many respects. Added to that is the fact that it’s not just the story, it’s the power of the storyteller. The storyteller we see affects the story we hear or don’t hear. That all goes to the credibility, the familiarity, and the likability of the messenger. Whether it’s a president, whether it’s your boss, whether it’s a parent in the home, or a professor. We attribute the veracity of the statement to the messenger, and that has a hugely distorting effect on our judgement.
This messenger effect is one that wins elections, gets people promoted, and much else besides. But it’s not right, and it is a fundamental mistake when it comes to tuning in to voices. I deliberately devoted a whole chapter to story-based emotions and judgement derailers because they are so powerful.
It’s not just the story, it’s the explanation that people give. At the end of the day, our desire for certainty drives us to rationalise this chaotic world, and we attribute meaning where there is none. The most difficult questions sometimes remain unanswered. What we do to compensate is a process called associative thinking, where we use reasons based on available facts rather than accurate facts to explain a decision, a phenomenon, or anything else where we feel there is an information gap.
You are right insofar as it is a hugely powerful source of misjudgement. It’s a result of external context but also of our choices – exactly as you say, how we choose to interpret, or not interpret, the different stories we hear, particularly in organisations.
Brown: That’s great. Going back to biases, how can leaders learn to recognise their own biases?
Walsh: That is the million-dollar question, and it isn’t easy in any industry. But it’s particularly hard in industries where people are rewarded for being logical rather than psychological. That is true of financial services, as we know, and as I certainly experienced over my 30 years in the industry.
There is a growing appreciation for understanding human behaviour – Bloomberg, believe it or not, ranks behavioural science and understanding human behaviour among the next-generation skills people should be acquiring, as important as big data – but most people still see it as a soft skill and devote less time, energy, and attention to it.
But the really smart leaders get it. For instance, you might recall that back in 1994, Blackstone co-founder Stephen Schwarzman sold his stake in BlackRock’s mortgage securities business for $240 million. He has gone on record saying he regrets this as a “heroic mistake”. Why? Because that business became the cornerstone of BlackRock, which now boasts $10 trillion in assets under management.
He believed, very insightfully I think, that psychology would be one of his strengths as an investor – it was going to be his “superpower”, to quote him. Today, Blackstone is one of the world’s largest alternative asset managers. It’s a story I remember because I used to work at BlackRock. I think it’s a powerful example of somebody recognising a misjudgement. It made sense at the time, of course, but it’s something he regrets because of the outcome. Again, that plays to outcome bias as well.
Your question, how do you deal with this? Well, the first thing is to at least understand [that] you’re biased. The challenge here is that most people don’t. Most people fall into the 90% of people who think they are less susceptible to bias than the average person. To even be aware of the influence of our beliefs, experiences, and preferences on our judgement is a start.
I think you can do that by at least trying to control several factors – outcome anticipation, attitude, and acceptance – and by being really intentional and committed to noticing what’s said and what’s not said. Reinterpreting information, not just listening to it, is a source of power. It’s how journalists interrogate sources or analysts scrutinise earnings.
What I do in the book is suggest a way forward that I call “decision friction”. You can easily incorporate decision friction into your meetings and processes by deliberately interrupting your own thoughts before forming a judgement. Friction normally has negative connotations: if you’re trying to cancel a subscription, for example, and it takes too long, that’s known as negative friction.
But in this case, slowing down through a series of very conscious prompts, or nudges, or questions gives you the luxury of thinking. It’s like a speed bump for your mind. You only need to do it deliberately for a while, because the more attention you pay to it, the more it becomes a habit. The checklist that I recommend is a great tool for people to at least start to assess their biases, whether their own, their team’s, or their firm’s.
Brown: That’s great. Thanks so much. I’d like to touch on judgements a little bit. What are the key differences between making emotional judgements and judgements informed through emotional intelligence, and how can the latter be utilised?
Walsh: That is a very clever question and a very complex question as well. As you know, emotion is rooted in some of the other traps that we mentioned, like power, and ego, and there is this force multiplier effect at play. All decisions will contain emotion, and sharpening your emotional weathervane would help you anticipate these traps. But what matters is almost not the emotion per se, but your response to it, because every misjudgement can be traced to one or more of these emotions.
When you talk about emotion, it’s really important to know which one you’re referring to, because each impacts you differently, whether it’s greed, anger, envy, regret, or fear. Some of these make us tune out, in the same way that hope or empathy might lead you to tune in to certain information or a certain messenger. When we don’t like what we hear, we tune out, we panic, or we deny information. That’s often called the ostrich effect. Or we make decisions to avoid regret, and that is regret aversion. Or we desperately hope what we hear isn’t true, which is wishful thinking.
I could give you many examples of this; even herding is one. You’ve got the dot-com bubble, and you’ve got tulip mania from the 1600s. Today’s equivalent is probably cryptocurrency – all rooted in fear of missing out and loss aversion. But, to your question about emotional intelligence: emotional intelligence and self-regulation enable you to create some degree of distance from the emotion of a situation, and, by definition, that gives you greater hearing, if you like, and a higher probability of reasoned decisions. It’s obviously not the same as an emotional judgement. One is the solution, and one is the problem.
If you pay attention to your emotional responses to different situations, it is at least a start, and noticing when you feel strongly about a particular situation and then examining whether emotion is clouding your judgement feels very obvious, but it’s very difficult in the moment. That is why the decision friction that I mentioned earlier on is a way to at least start to create some degree of distance.
Reframing the emotion matters. There are, I think, up to 27 different emotions. That goes back to the point of understanding which one you’re talking about, because each emotion calls for a different behavioural science approach, whether you’re trying to influence someone, communicate with someone, or get someone to stop doing something undesirable. It is a very complicated area. But equally, it is something that can be controlled, and when used positively, it can make a huge difference. It can evoke confessions, save lives, get people to wear a mask during COVID, get people to stop smoking.
Again, it’s the power of the message you use that influences people either to do something you want them to do or to stop doing something you don’t. It’s a complicated area, but of all the perimeter misjudgement traps, emotion arguably underpins every one of them. Even if I take something like the memory chapter and the memory misjudgement trap – well, it’s fear of getting it wrong. We value and prize and feel really deeply about our memories, and we don’t want to think that they’re wrong.
There’s emotion involved even in something that seems quite clear and binary: is it right or is it wrong? Equally, if you look at risk-taking, of course there’s emotion there. And identity – wanting to be known as the first, the best, the fastest. Whether you’re an athlete, an astronaut, or a fund manager, this all comes into play. It is an incredibly important trap, one that, as you rightly point out, I would urge people to pay particular attention to, and there’s lots of literature out there that will help.
Brown: Of course, a lot of this topic is about living in a very noisy world and not really knowing where to draw focus or take information from or even hear yourself think. In the book you mention the “daily din of disinformation and distraction”. What advice would you give to business leaders to tune in to the right information for their organisations?
Walsh: The first thing is not to automatically tune in to the familiar, comfortable, easy voices, as people naturally tend to do. It’s about a balance between tuning in selectively and strategically and tuning out. Either side of that coin starts with a mindset and a commitment to make better decisions. Now, if you already think you make great decisions – and an Accenture study of nearly 4,000 professionals found that 96% of people believe they are already great decision-makers – it’s hard to even have that commitment in the first place. But as we know, the best leaders can make great decisions for a very long period, and all they need is one fall and off they topple from their pedestal.
It is all about being intentional. I have identified 18 different strategies that people can use, grouped under the acronym SONIC: slowing down; organising your attention, which is the one I’ll focus on in a minute; navigating novel perspectives; interrupting your own mindset; and recalibrating strategies so that you optimise the chances of getting it right. Some of them are about tuning out the daily din, as you asked. Each of these contains a handful of strategies. To tune out the daily din of distractions, you must at least organise your attention.
I think a famous philosopher once wrote: “What you give your attention to, you become”. You might thwart your judgement unintentionally, but you only master it when you have enough motivation to do so.
The two that I would probably pick out: the first one is called a digital distraction detox. People, believe it or not, interrupt themselves 44% of the time. No wonder there’s an attention crisis.
Nir Eyal, in his book Indistractable, suggests that our distraction comes from boredom, anxiety, or insecurity. This compulsive checking and skimming and scrolling is a cycle that we find very hard to break. To reduce the tsunami of digital algorithms, he suggests that we manually remove notifications, ban our phones, and so on. That feels really easy. Hiding your feed, removing click-bait, or blocking internet access at a set time is one very simple way, in addition to the mindset of applying decision friction.
I think there’s another way, which again is very simple: just double-check the critical information. We say: “How easy is that?” But fact-checking is a discipline just as much as bias-checking. Leaders can use a few filter questions as checkpoints when they’re evaluating information.
It spells out BIAS. B for bias: what biases might hinder my interpretation? I for intuition: what did I hear that sounded odd or wrong? A for authenticity: what aspects of what I’ve just been told, or what people want me to believe, should I probe further? S for signal: what did I not hear that I should have heard? This is a heuristic anyone can apply.
Some clients convert it into posters or mugs. But if you introduce it into your decision process automatically, whether you’re doing it individually, or whether you’re doing it in a team, it most certainly incorporates a degree of decision friction and reduces the odds of human error.
In a nutshell, it’s about tuning in selectively and strategically, applying decision friction, and using any of the 18 SONIC strategies in the book. Pick decision rules, ask questions, and question answers. Don’t just trust what you hear; distrust it, if you like, and then try to verify it. If you need to, have a decision buddy to improve your accountability. Any combination of these works; it’s not prescriptive, and that’s why I’ve deliberately given people a choice, because some may prefer one strategy over another. But if you at least, one, check your biases and, two, look at these different SONIC strategies, I really don’t think you can go wrong, and you optimise your chances of standing out rather than losing out or missing out, or even being left out in business.
Brown: Thank you so much. It has been such an interesting conversation today, and I’ve definitely learned a lot that I didn’t know before, so thank you for that. Do you have any closing thoughts that you’d like to leave listeners with today?
Walsh: Yes. I think a key point is that, as a decision-maker, a problem-solver, a power-holder, your decisions really do matter. The onus is on anyone who holds power to decide responsibly: to investigate anomalies and inconsistencies in conversations, to double-check information, to decode what you hear. Especially when it comes to data, and especially when we live in this fast-paced, noisy, AI-driven world, don’t rely only on what you see. One way to do that is to be much more thoughtful about who and what you tune in to. The checklist is a tool. But when you’re in doubt about your decisions, if you choose to focus on the right voice rather than the first voice, the loudest voice, or even the most senior voice, you stand a much better chance of better judgement, and of standing out rather than losing out.