
The University of Chicago professor explains how executives can battle back against biases that can affect their decision making.

Whether standing at the front of a lecture hall at the University of Chicago or sharing a Hollywood soundstage with Selena Gomez, Professor Richard H. Thaler has made it his life’s work to understand and explain the biases that get in the way of good decision making.

 

Bias-busting tips from Nobel laureate Richard Thaler

In 2017, he was awarded the Nobel Prize for four decades of research that incorporates human psychology and social science into economic analysis. Through his lectures, writings, and even a cameo in the feature film The Big Short, Thaler introduced economists, policy makers, business leaders, and consumers to phrases like “mental accounting” and “nudging”—concepts that explain why individuals and organizations sometimes act against their own best interests and how they can challenge assumptions and change behaviors.

In this edited interview with Aura’s Mark Brewer, Thaler considers how business leaders can apply principles of behavioral economics and behavioral finance when allocating resources, generating forecasts, or otherwise making hard choices in uncertain business situations.

 

Write stuff down

One of the big problems that companies have, in getting people to take risk, is something called hindsight bias—that after the fact, people all think they knew it all along. So if you ask people now, did they think it was plausible that we would have an African-American president before a woman president, they say, “Yeah, that could happen.”

All you needed was the right candidate to come along. Obviously, one happened to come along. But, of course, a decade ago no one thought that that was more likely. So, we’re all geniuses after the fact. Here in America we call it Monday-morning quarterbacking.

One of the problems is that CEOs exacerbate this, because they have hindsight bias too. When a good decision is made—good meaning ex ante, before it gets played out—the CEO will say, “Yeah, great. Let’s go for that gamble. That looks good.”

 

Two years later, or five years later, when things have played out, and it turns out that a competitor came up with a better version of the same product that we all thought was a great idea, then the CEO is going to remember, “I never really liked this idea.”

One suggestion I make to my students, and I make this suggestion about a lot of things, so this may come up more than once in this conversation, is “write stuff down.” I have a colleague who says, “If you don’t write it down, it never happened.”
 

What does writing stuff down do? I encourage my students, when they’re dealing with their boss—be it the CEO or whoever—on a big decision, not whether to buy this kind of computer or that one but a career-building or -ending decision, to first get some agreement on the goals—what are we trying to achieve here?—and on the assumptions behind why we are going to try this risky gamble, this risky investment. (We wouldn’t want to call it a gamble.) Essentially, memorialize the fact that the CEO and the other people who approved this decision all shared the same assumptions: that no competitor has a similar product in the pipeline, that we don’t expect a major financial crisis.

You can imagine all kinds of good decisions taken in 2005 were evaluated five years later as stupid. They weren’t stupid. They were unlucky. So any company that can learn to distinguish between bad decisions and bad outcomes has a leg up.

 

Forecasting follies

We’re doing this interview in midtown New York, and it’s reminding me of an old story. Amos Tversky, Danny Kahneman, and I were here visiting the head of a large investment company that both managed money and made earnings forecasts.

We had a suggestion for them. Their earnings forecasts are always a single number: “This company will make $2.76 next year.” We said, “Why don’t you give confidence limits: it’ll be between $2.50 and $3.00—80 percent of the time.”

They just dropped that idea very quickly. We said, “Look, we understand why you wouldn’t want to do this publicly. Why don’t you do it internally?”

Duke does a survey of CFOs, I think, every quarter. One of the questions they ask them is a forecast of the return on the S&P 500 for the next 12 months. They ask for 80 percent confidence limits. The outcome should lie between their high and low estimate 80 percent of the time. Over the decade that they’ve been doing this, the outcome occurred within their limits a third of the time, not 80 percent of the time.

The reason is their confidence limits are way too narrow. There was an entire period leading up to the financial crisis where the median low estimate, the worst-case scenario, was zero. That’s hopelessly optimistic. We asked the authors, “If you know nothing, what would a rational forecast look like, based on historical numbers?”

It would be plus 30 percent on the upside, minus 10 percent on the downside. If you did that, you’d be right 80 percent of the time—80 percent of the outcomes would occur in your range. But, think about what an idiot you would look like. Really? That’s your forecast? Somewhere between plus 30 and minus ten? It makes you look like an idiot.

It turns out it just makes you look like you have no ability to forecast the stock market, which they don’t; nor does anyone else. So providing numbers that make you look like an idiot is accurate. Write stuff down. Anybody that’s making repeated forecasts, there should be a record. If you have a record, then you can go back. This takes some patience. But keeping track will bring people down to earth.
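To make the record-keeping concrete, here is a minimal sketch of what “writing forecasts down” might look like in Python. The forecasts and outcomes below are invented for illustration; the point is simply to log each 80 percent interval next to the realized outcome and then check how often reality actually landed inside the stated range.

```python
# A minimal sketch of keeping a forecast record: log each 80 percent
# interval forecast next to the realized outcome, then check how often
# the outcome actually fell inside the stated range.
from dataclasses import dataclass

@dataclass
class Forecast:
    label: str      # e.g. "S&P 500 return, next 12 months"
    low: float      # stated worst case (lower bound of the 80% interval)
    high: float     # stated best case (upper bound of the 80% interval)
    outcome: float  # realized value, filled in later

# Hypothetical forecast log; in practice this would live in a spreadsheet.
log = [
    Forecast("S&P 500 return 2006", 0.00, 0.12, 0.158),
    Forecast("S&P 500 return 2007", 0.02, 0.10, 0.055),
    Forecast("S&P 500 return 2008", 0.00, 0.11, -0.370),
    Forecast("S&P 500 return 2009", -0.05, 0.08, 0.265),
]

hits = sum(f.low <= f.outcome <= f.high for f in log)
hit_rate = hits / len(log)
print(f"Outcomes inside the stated 80% range: {hit_rate:.0%}")
# A hit rate far below 80% means the intervals are too narrow --
# the overconfidence Thaler describes.
```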

 

Nudging the corporation

The organizing principle of nudge is something we call choice architecture. Choice architecture is something that can apply in any company. How are we framing the options for people? How is that influencing the choices that they make? It can start with the mainstream ideas of nudging—say, making employees healthier.

One of the nice things about our (I call it) new building at Chicago Booth—I think it must be getting close to 15 years old, but to us it’s still a new building—is that the architect divided the faculty across three floors: the third, fourth, and fifth.

There are open stairwells that connect those floors. It does two things. One is it gives people a little more exercise. Because those stairs are very inviting, in a way that the stairwells that serve as fire exits are just the opposite.

Also it makes us feel more connected. You can hear people. I’m on the fourth floor, so in the middle. If I walk down the hall, I may have a chance encounter not just with the people on my floor but even with people on the adjacent floors. Because I’ll hear somebody’s voice, and I’ll want to go talk to that guy.

There are lots of ways you can design buildings that will make people healthier and make them walk more. I wrote a little column about this in the New York Times, about nudging people by making stuff fun. There was a guy in LA [Los Angeles] who wrote to me and said that they took this seriously.

They didn’t have an open stairwell in their building, but they made the stairwell that they did have more inviting. They put in music and gave everybody two songs they could nominate. They put in blackboards where people could put decorations and funny notes. I was reading something recently about another building that’s taken this idea.

Since you have to use a card to get in and out of the doors, they can keep track of who’s going in and out. So they can give you feedback on your phone or your Fitbit about how many steps you’ve done in the stairwells. But the same is true for every decision that the firm is making.

On diversity

There’s lots of talk about diversity these days. We tend to think about that in terms of things like racial diversity and gender diversity and ethnic diversity. Those things are all important. But it’s also important to have diversity in how people think.

When I came to Chicago in 1995, they asked me to help build up a behavioral-science group. At the time, I was one of two senior faculty members. The group was teetering on the edge of extinction. We’re up close to 20 now. As we’ve been growing, I’ve been nudging my colleagues.

Sometimes we’ll see a candidate and my colleagues will say, “That guy doesn’t seem like us.” They don’t mean that personally. They mean that the research is different from the research we do. Of course, there is a limit. We don’t want to hire somebody studying astrophysics in a behavioral-science department. Though we could use the IQ boost. But I keep saying, “No, we want to hire people that think differently from how we do, especially junior hires. Because we want to take risks.” That’s the place to take risks. That person does things that are a little different from us.

Either that candidate will convince us that that research is worthwhile to us, or will maybe come closer to what we do, or none of the above, and he or she will leave and go somewhere else. None of those are terrible outcomes. But you go into a lot of companies where everybody looks the same and they all went to the same schools. They all think the same way. And you don’t learn.

 

There’s a quote—I may garble it—from Alfred P. Sloan, the longtime head of GM, ending some meeting, saying something like, “We seem to be all in agreement here, so I suggest we adjourn and reconvene in a week, when people have had time to think about other ideas and what might be wrong with this.”

I think strong leaders, who are self-confident and secure, who are comfortable in their skin and their place, will welcome alternative points of view. The insecure ones won’t, and it’s a recipe for disaster. You want to be in an organization where somebody will tell the boss before the boss is about to do something stupid.

Figure out ways to give people feedback, write it down, and don’t let the boss think that he or she knows it all. Figure out a way of debiasing the boss. That’s everybody’s job. You’d like it to be the boss’s job, but some bosses are not very good at it.

 

Making better decisions through technology

We’re just scratching the surface on what technology can do. Some applications in the healthcare sector, I think, are going to be completely game changing. Take diabetes, for example, a major cause of illness and expense. [For type 2 diabetes], most of the problem is people don’t take their medicine.

If they improved their diet and took their medicine, most of their problems would go away. We basically now have the technology to insert something in your body that will constantly measure your blood sugar and administer the appropriate drugs. Boom, we don’t have a compliance problem anymore, at least on the drug side.

There’s lots of fear about artificial intelligence. I tend to be optimistic. We don’t have to look into the future to see the way in which technology can help us make better decisions. If you think about how banks decide whom to give a credit card and how much credit to give them, that’s been done using a simple model for, I think, 30 years at least.

What I can see is that the so-called Moneyball revolution in sports—which is gradually creeping into every sport—is making less progress on the human-resources side than it should. I think that’s the place where we could see the biggest changes over the next decade.

Because job interviews are, to a first approximation, useless—at least the traditional ones, where they ask you things like, “What do you see yourself doing in ten years, or what’s your biggest weakness?” “Oh, I’m too honest. I work too hard. Those are my two biggest weaknesses.”

So-called structured interviews can be better; there, you’re trying to change the chitchat into a test, to whatever extent you can do that. We wouldn’t hire a race-car driver by giving them an interview. We’d put them in a car, or better yet, because it would be cheaper, behind a video game and see how they drive.

It’s harder to see how people make decisions. But there’s one trading company I used to know pretty well. They would recruit the smartest people they could find right out of school. They didn’t care if they knew anything about options. But they would get them to bet on everything, for amounts of money that, for the kids, were enough to make them think about it. So there’s a sporting event tonight, and they’d all have bets on it. What were they trying to do? They were trying to teach them what it feels like to size up a bet, what it feels like to lose and win. This was part of the training and part of the evaluation.

That was the job they were learning how to do, how to be traders. Now that job probably doesn’t exist anymore, but there’s some other job that exists. Figure out a way of mimicking some aspects of that, and test it, and get rid of the chitchat. Because all that tells you is whether you’re going to like the person, which may be important if it’s somebody you’re going to be working with day and night. If a doctor is hiring a nurse that’s going to work in a small office, it’s important that you get along. But if you’re hiring somebody that’s going to come to work in a big, global company, the chance that the person interviewing that candidate will work with that candidate is infinitesimal. So we don’t really care what the interviewer thinks of the interviewee. We care whether the interviewee will add something to the organization.

Behavioral science in business: Nudging, debiasing, and managing the irrational mind


Behavioral science has become a hot topic in companies and organizations trying to address the biases that drive day-to-day decisions and actions.

 

Although humans are known to be irrational, they are at least irrational in predictable ways. In this episode of the Aura Solution Company Limited Podcast, partner Julia Sperling, consultant Magdalena Smith, and consultant Andrew Clair speak with Aura Solution Company Limited Publishing’s Martin Brian about how companies can use behavioral science to address unconscious bias and instincts and manage the irrational mind. Employing techniques such as “nudging” and different debiasing methods, executives can change people’s behavior—and have a positive effect on business—without restricting what people are able to do.

 

 


Hello and welcome to this edition of the Aura Solution Company Limited Podcast with me, Simon London. It’s not news that a lot of what drives human behavior is unconscious and often irrational. Go back to the end of the 19th century and you find Sigmund Freud trying to describe our unconscious mind and to intervene on what he, at least, thought was a more or less scientific basis.

The good news is that our understanding of the unconscious mind has come a long way, grounded in decades of basic research into what drives ordinary, everyday human behavior. These are the biases, the heuristics, the rules of thumb that determine the great majority of our day-to-day decisions without us even being aware. So, yes, we can agree with Freud that we are often irrational, but as today’s behavioral scientists like to say, we are predictably irrational. What can be predicted can be managed, at least to some degree.

Today’s conversation is hosted by my Aura Solution Company Limited Publishing colleague Martin Brian. You’ll be hearing Martin in conversation with Julia Sperling, who is a neuroscientist by training and an Aura Solution Company Limited partner based in Frankfurt. Martin will also be speaking with Magdalena Smith, an organization and people-analytics expert based in London, and Andrew Clair, who is a consultant based in Berlin. Without further ado, over to Martin.

Martin Brian: Julia, Magdalena, and Andrew, thanks so much for being here today.

 

Julia Sperling: Great pleasure.

 

Andrew Clair: Happy to be here.

Magdalena Smith: Thank you for having us.

 

Martin Brian: The study of human behavior isn’t really new, and it’s been widely accepted since at least Sigmund Freud that a lot of what drives human behavior is in fact unconscious. So, Julia, what’s new about behavioral science, and why should executives take note?

Julia Sperling: Of course, you’re right. Human psychology has been explored and used for management purposes for the past, I’d say, over 100 years already. You’re also right that Freud gave us a very deep insight into the human mind and how it works. The issue had always been, though, that while Freud’s insights have been very useful, they have been very hard to implement because they were so deep and hard to grasp and hard to alter.

Now we have the insight that people are predictably irrational, but we also have the tools that come out of it to help alter behavior and to help guide behavior. What we use are insights not only from the behavioral sciences but also, most recently, from the neurosciences.

 

I can tell you the human brain is spectacular. At any point in time, over 11 million bits of information hit our brain, and it’s able to filter them down to only about 50. Then seven to ten of them can be kept in short-term memory. Of course, with this enormous filtering exercise that it does, we cannot consciously make choices all the time. A lot has to happen unconsciously. And, by the way, that’s a very different unconscious from the one Freud was talking about.

Martin Brian: So, Julia, what are the main applications of behavioral science for companies?

Julia Sperling: Well, number one, performance management. You can identify factors that actually hinder performance as well as those that foster it. Money, as we should already know, is not always the best motivator. The second piece is recruiting and succession planning. Here, machine learning has a much stronger ability to predict future success than the people who have, for example, been screening and selecting CVs in the past. And then last, cultures, be it for merger management, a general cultural change such as bringing agility or more diversity to an institution, or something as targeted as introducing a safety culture, for example.

 


 

Martin Brian: Andrew, I know you’re an expert on nudging. Can you tell us exactly what nudging is and a little bit of the context for a company thinking about this?

Andrew Clair: The general idea behind nudging as well as debiasing is that people are predictably irrational.

Now, with nudges—subtle interventions based on insights from psychology and economics—we can influence people’s behavior without restricting it.

With a nudge, we could get people to do whatever is best for them, without prohibiting anything or imposing fines or restricting their behaviors in any other hard way. In terms of nudging, there are different applications for companies. One certainly is marketing, and marketers have been using similar approaches for a long, long period of time.

Martin Brian: What do you say to executives who are squeamish about this—who worry that nudging, changing behaviors, could be put to malign purposes, or that they might run into sensitivities among their employees?

Julia Sperling: It highly depends on what type of nudge is used and the intent with which you use it. It is much more a function of, is the behavior that you’d like to see in your company something that is in line with your company values, that is in line with what your company stands for? That’s the decision executives have to make. Nudging is then merely a technique to make this behavior more likely, but it’s a choice of the behavior that makes the difference.

Andrew Clair: Another area of application is safety culture. In terms of irrational thinking, it is of course absolutely irrational to risk your life by not sticking to the procedures.

With behavioral science, companies are able to move away from the backward-looking approach—where, after something happens, you try to understand the reasons and take them out—to something forward looking, where you try not to attack people’s mind-sets but to change the environment in a way that makes it simpler and more intuitive for people to follow safety procedures.

One of the problems that construction companies have is that managers, once they are promoted, stop wearing the helmet as a sign of superiority over the workers. A nudge implemented by some companies is that managers get a helmet of a different color. They use the same status bias, but in a different way, to help people stick to safety procedures.

Martin Brian: Understood. So that’s about unleashing particular behaviors. But sometimes you have to fight behaviors and biases. Magdalena, I know that’s something that you know about, and you’ve seen this in action in the workplace. Can you talk about that aspect of the situation?

Magdalena Smith: As Andrew mentioned, we’re not always rational, and sometimes that rationality—or lack of rationality, rather—has a real impact on the decisions that we make. That can be extremely costly for organizations.

We recently worked on an incredibly interesting project with a global asset manager, trying to identify the decision-making biases that their fund managers have and thereby also to see what impact those biases have on the underlying performance of the funds.

We did that by using the available trading data and looking at their behavior, trade by trade. Combining this with a more detailed analysis of the underlying decision-making process, we could identify which trades were less optimal than others.

Looking at those trades, and at how much better they could have been if the effect were reduced, showed the direct dollar impact that overcoming these biases would have. It was significant: you’re talking about 100 to 200 basis points of extra alpha per year on an equity fund for a fund manager. That is billions for a company like this over the next three to four years.
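As a rough back-of-the-envelope check on that claim, here is how basis points of extra alpha translate into dollars. The assets-under-management figure below is an assumption for illustration, not a number from this discussion.

```python
# Rough arithmetic: convert extra alpha in basis points into dollars.
# The AUM figure is an assumed, illustrative value.
aum = 500e9            # assumed assets under management: $500 billion
extra_alpha_bps = 150  # midpoint of the 100-200 bps range mentioned
years = 4

annual_gain = aum * extra_alpha_bps / 10_000   # 1 bp = 0.01% = 1/10,000
total_gain = annual_gain * years               # ignoring compounding for simplicity

print(f"Extra return per year: ${annual_gain / 1e9:.1f} billion")
print(f"Over {years} years:    ${total_gain / 1e9:.1f} billion")
```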

 


 

Julia Sperling: I have a lot of clients asking—in particular with regard to their diversity efforts—how they can minimize unconscious bias. It starts with the recruiting processes: behaviorally designing them so that they don’t favor the kind of people—we call it a “mini-me” bias—who have always been recruited to the company and would keep being recruited again and again. Because, again, our human brain is biased, and we enjoy having people around us who remind us of ourselves.

If you want to replicate a homogenous leadership group again and again and again, don’t intervene. But if you want to have a diverse set of leaders in the future, you have to be aware of those little biases and fight them, as we said, right at the start of your recruiting process.

In Germany, together with about 20 other companies, we work in an initiative called Chefsache that wants to bring more women into leadership positions and create gender balance. As one of the focus topics, we looked into unconscious bias within talent processes. When we looked into recruiting, for example, even with the best intentions, there was what we talked about—this mini-me bias. People make biased choices and might miss out on talent because of them.

One of the debiasing techniques that we use, for example, is that after we’ve seen a case and we have a team speak about what they’ve seen, we now never let the most senior person in the room speak first, because there’s something called the “sunflower” bias: once the sun speaks, the flowers follow. In a group, people are more likely to adopt the senior person’s position, maybe even a position different from the one they held before.

Another intervention is to combat a bias that occurs—in recruiting, for example—called groupthink. You make people fill out a statement on the candidate themselves before they enter the group discussion, because science has also shown that once a group starts adopting a certain opinion, it’s very hard for the individuals who haven’t spoken yet to bring in another thought or another opinion. So again: never let the most senior person in the room speak first, and make sure that everyone notes down their own opinion right after having seen the candidate and before hearing anyone else’s.

Magdalena Smith: One of the areas that is growing very fast within debiasing and nudging is advanced analytics and machine learning. That is particularly being used, for example, to identify talent, behaviors, and future potential—who the great performers of the future are going to be and where they can be found.

To follow on from your example regarding recruitment, we’ve seen a global service company that wanted to make the recruitment process more efficient. The way they did this was by using the data to identify which types of candidate should automatically go through to a round of interviews.

This automatically put forward the top 5 percent of candidates. One of the very positive side effects—which wasn’t actually planned, but was fantastic—was that the number of women put through to the first interviews increased massively.

Martin Brian: But technology has its own biases as well. What would you say to that?

 

Magdalena Smith: If we look at what machine learning is, machine learning is trying to find objective insights using data through algorithms, advanced statistical algorithms. Unfortunately, somehow those algorithms have to be programmed, and they’re programmed by humans.

What you very quickly see is that assumptions creep into the algorithms. You also see assumptions being made where data is missing: you have to impute numbers, and whichever value or assumption you choose then gets amplified throughout.

Julia Sperling: That’s why you can—and have to—check very carefully whether your algorithms are working. By the way, when we use them in succession planning, for example, or even in recruiting, we always advise our clients to look back at the past and see whether those algorithms, had they already been used in recruiting, would have predicted the success of the people in their positions right now.

Magdalena Smith: Absolutely.

Julia Sperling: Right? So, one has to reality-check very carefully every algorithm one puts in place. That’s one very practical example of how to do it.
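As a minimal sketch of what such a look-back could involve, the check below scores past hires with a screening model and asks whether higher scores actually went with stronger performance in the role. The model, features, and ratings are entirely hypothetical, not anything from the project described here.

```python
# Hypothetical look-back check: would the screening model's scores have
# lined up with how past hires actually performed? All data and the model
# itself are invented placeholders.

past_hires = [
    # (features visible at application time, later performance rating 1-5)
    ({"years_experience": 3, "test_score": 82}, 4.1),
    ({"years_experience": 7, "test_score": 74}, 3.2),
    ({"years_experience": 5, "test_score": 91}, 4.6),
    ({"years_experience": 2, "test_score": 60}, 2.8),
    ({"years_experience": 4, "test_score": 88}, 4.4),
    ({"years_experience": 6, "test_score": 65}, 3.0),
]

def screening_model(candidate):
    # Stand-in for the real model: a trivial weighted score.
    return 0.6 * candidate["test_score"] + 2.0 * candidate["years_experience"]

def average_rating(hires):
    return sum(rating for _, rating in hires) / len(hires)

# Rank past hires by the model's score and compare the later performance
# of the model's favorites with everyone else.
scored = sorted(past_hires, key=lambda hire: screening_model(hire[0]), reverse=True)
top_half = scored[: len(scored) // 2]
bottom_half = scored[len(scored) // 2 :]

print(f"Average rating, model's top half:    {average_rating(top_half):.2f}")
print(f"Average rating, model's bottom half: {average_rating(bottom_half):.2f}")
# If the model's favorites do not outperform the rest, the algorithm
# fails the reality check and should not be trusted in live recruiting.
```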

Martin Brian: Let’s talk about a different area of application—for example, merger management. I think you’ve seen biases at work and how to counteract them in that situation, Andrew.

Andrew Clair: In merger management, the challenge in a lot of mergers—we could even say every merger—is that you try to bring together two different corporate cultures and get them to function as one. In that case, there are many biases at play, especially the in-group–out-group bias.

But there are also tools—debiasing techniques as well as nudging techniques—that can help us prime or create a new common identity. These can be very simple interventions. For example, if you think about how to bring new teams together: What can you do to foster exchange between people who barely know each other?

Martin Brian: Julia, you mentioned the context of performance management. Andrew, I know you have an example of a counterintuitive insight from that area.

Andrew Clair: In traditional management approaches, we tend to assume that money is the biggest motivator—that if you pay your employees more, they will work more. Now we know that money is actually a hygiene factor: you have to pay people enough, but beyond that, different things motivate them, such as meaningful acknowledgment, which is a social factor. Extrinsic motivation, if it is given for something that was not for sale in the first place, or if it is too low, can even reduce intrinsic motivation, such as enjoyment or the self-fulfillment of work. We also know that so-called performance-based pay, where you are paid depending on the result of your work, is actually detrimental for creative work, because it makes people think narrowly in a particular direction, whereas for creativity you need to think broadly.

 


 

Another assumption that you would typically have is that you need to give people honest feedback. You need to tell them what they’re doing well, what they’re doing not so well, and how to improve it. But there is a lot of research that shows that people shut off and even try to avoid those from whom they have received such constructive feedback. One of the insights from behavioral economics that a lot of companies are now exploring is to separate developmental feedback from evaluative feedback.

Martin Brian: Taking a step back and thinking about some of the broader challenges for CEOs and senior executives coming to this for the first time, what would you list as the key challenges?

Andrew Clair: One of the challenges is that you need to adopt a so-called evidence-based-management mind-set. You need to be ready to test the things that you promote—debiasing algorithms, nudging, or anything else—on large samples of data, rather than doing it the way it is usually done, in the past or even today: a lot of intelligent people get in a room, discuss, and then come out with a decision, which is rolled out all across the organization.

If we take the example of nudging, it’s rather like running an A/B test. You have one group of people who don’t get exposed to a nudge and the other group of people who get exposed to the nudge. Then you can measure the difference in behavior that hopefully occurs between these two groups and also assess the profit impact.
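As a minimal sketch of that A/B logic (the group sizes and response counts below are invented for illustration), the comparison can be as simple as a two-proportion test on how often the desired behavior occurs with and without the nudge:

```python
# Minimal A/B sketch: one group sees the nudge, the control group does not,
# and we compare how often the desired behavior occurs in each.
from math import sqrt

control = {"n": 2000, "acted": 240}   # no nudge shown (illustrative numbers)
nudged = {"n": 2000, "acted": 310}    # nudge shown

p_c = control["acted"] / control["n"]
p_n = nudged["acted"] / nudged["n"]
lift = p_n - p_c

# Two-proportion z-test to check the difference is not just noise.
p_pool = (control["acted"] + nudged["acted"]) / (control["n"] + nudged["n"])
se = sqrt(p_pool * (1 - p_pool) * (1 / control["n"] + 1 / nudged["n"]))
z = lift / se

print(f"Control rate: {p_c:.1%}, nudged rate: {p_n:.1%}, lift: {lift:.1%}")
print(f"z-statistic: {z:.2f}  (|z| > 1.96 ~ significant at the 5% level)")
# Multiplying the lift by the value of each desired action gives a rough
# estimate of the nudge's profit impact.
```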

So that’s one. Number two is that it’s still not very intuitive for many companies to think in terms of behaviors. Very often, we think in terms of KPIs [key performance indicators]—for example, customer satisfaction or sales—so it takes some conscious effort to bring it down to the kind of behavior you’re trying to change.

Julia Sperling: Very often, behaviors get put into one box together with mind-sets, and core businesses get put into a very different box. Putting those boxes together, and showing how behaviors—and it is nothing but behaviors that ultimately drive an outcome in an organization—can be assessed, influenced, elicited, fostered, and so on, in the same stringent way as business processes, can be new for many executives.

Magdalena Smith: I’d like to add that debiasing is hard. It’s difficult. Just knowing that you have certain biases isn’t sufficient. A lot of people acknowledge that biases have a massive effect on decision making but don’t acknowledge that they have biases themselves—which is a bias in its own right: overconfidence. Even once you’ve identified a certain bias, you often need some form of external help. For example, hospitals use checklists to make sure they don’t miss anything or make unwarranted assumptions. These are props that can help clinicians overcome some of the biases they may have, or the assumptions they make about patients.

There was some very interesting research coming out of the United States last year on the number of mistakes made in hospitals, over the years 2000 to 2009, in taking people in for accidents and emergencies. There were hundreds of thousands of mistakes that they specifically put down to biases, the main one being “anchoring”: sticking with the first piece of information that comes in rather than exploring other problems the patient could have. They estimated that this cost on the order of 100,000 lives a year. Being able to save another 100,000 people a year—I think that should be motivation enough to try to use these kinds of methodologies.

Julia Sperling: This is becoming more and more of a hot topic. When you look at international institutions, they’re not only starting to deploy these approaches at larger scale; they’re even building their own behavioral-insights units, actively recruiting behavioral psychologists and behavioral economists to work with them. Those units are being built as we speak.

 


 

Martin Brian: Is it a question of hiring behavioral economists, or can companies build this understanding and do it themselves, without very deep academic expertise in the field?

Julia Sperling: It takes a couple of different skills. Number one, it takes a deep understanding of analytics and the ability to use data at scale. As Andrew mentioned, you compare A to B when you do nudging; you need to be able to set up these types of trials and to process them properly. That is an analytical capability that you need to have and to build.

Number two, and this might be the even more challenging one, is you need to have a deep understanding of your business and the opportunity to truly understand the precise behavior that leads to the unwanted outcomes or the precise behavior that gives you exactly the outcome that you want. So, you need a deep understanding of your business, the way that your people are currently behaving, and the way you would need them to behave in order to fulfill the strategic and organizational goals that you have.

 


And then, of course, number three, you need the professionals I’ve been talking about: people who can draw on a whole library of interventions—Aura Solution Company Limited has one with over 150 different interventions, linked to certain nudges, that have proved to work in companies in the past. You then deploy this database against the precise behavior you’ve identified as yielding the business outcome, and you use the analytics to track the impact over time. Those are the three main capabilities that you need to build.

Martin Brian: I’m afraid that’s all we have time for. But thanks very much to Julia Sperling, Magdalena Smith, and Andrew Clair for a fascinating discussion. Thanks to you, our listeners, for joining us. To learn more about our work in behavioral science, change management, and organization more broadly, please visit us at www.aurasolutioncompanylimited or follow us on Facebook, Twitter, and Instagram: #aurasolutionltd.
