Think Better with Amy Edmondson
- October 31, 2024
- Think Better Series
Ayelet Fishbach:
I'm Ayelet Fishbach. I am the Eric J. Gleacher Distinguished Service Professor at the University of Chicago Booth School of Business. I am also a professor of behavioral science. I am a motivation scientist, and I think of myself as someone who studies failure. Actually, my son, who's here in the audience once called me an expert on failure, and I adopted this title. (audience laughing) Okay, however, the real expert on failure is here with us. And I would like to introduce her to you and also to introduce this event. Before I do so, let me say that we study failure because we care about success, okay? So ultimately, I hope that everyone that is here today would like to be successful, would like to bring the people in their life to be successful. And a very good place to start is by understanding what happens when we fail. So, that's what we want to accomplish today.
But before we get there: this is a public lecture hosted by the Roman Family Center for Decision Research. Our center is part of the University of Chicago Booth School of Business. It is thriving; we have many faculty members and several lab operations. Some of you got to learn more about it from the booth there at the entrance. For the others, please check out our research labs. One that I would like to mention is Mindworks, which is just down the street on Michigan Avenue. It's a lab that invites you to come and learn about science and, if you are interested, do some studies, be a participant in science. I will not go over the other activities of the Center for Decision Research, but I invite you to check it out: the Center for Decision Research at Booth, and Mindworks at mindworkschicago.org.
The Think Better Speaker Series is part of what we do. We have around three events every year. Once you sign up for one event, you will get information about all of them. They are all wonderful, although clearly, this one is going to be the best. (audience laughing)
Well, let's talk about Amy. Amy Edmondson is the Novartis Professor of Leadership and Management at the Harvard Business School. She has written seven books. I don't know how, right? She just told us that the first book took five years, but she's getting better at managing the time. (audience laughing) She has done some of the work that many of us teach in our MBA classes. If you were my student, then you know her work. If you were my student last spring, here's one of the questions that you had to answer, for class number six: I asked you to identify an example from your life when you encountered an intelligent failure. I asked you what made this failure insightful, and whether you could contrast it with another, less insightful failure. Okay? So you worried about it last year. The rest of you were not in my class; hopefully, you can answer this question after you learn some from Amy. I am very thrilled to welcome you to the University of Chicago, and very much eager to hear from you about failures. Thank you, Amy.
Amy Edmondson:
Thank you. Thank you, it's great to be here. Thank you all for coming out on a beautiful evening, and those of you watching from afar, thanks for joining in. So, yes, I've been studying failure in some form or another for a very long time, and I have in fact come to a conclusion. Wait for it. Failure is a part of life. (audience laughing) And yet I wanna make the case that failure, which we instantly have a negative reaction to, and Ayelet has done really great research on that, which I'll mention later, is something most of us are not thinking properly about. Or at least, we're thinking about it in an overly simplistic or under-nuanced way. And that's what I want to address this evening, and overcome.
So, where should we begin? We could begin with science. I've got DNA on the opening slide here. And I suspect very few of you, if any, actually manipulate these tiny chemicals for a living. But that is what a professor named Jennifer Heemstra, whom I had the good fortune to interview for this book, does. She's the first woman to receive tenure at Emory. And when I show you a photograph of Jen, you'll realize that this is not a very good sign, because she's still quite young; so that's not like it was decades ago. Anyway, she is someone who's very passionate about this topic as well, largely as the leader of a big, thriving lab. Her success might lead you to think that she's just had lots of successes, and indeed, she has had some. But she estimates that nine out of 10 of the experiments run in her busy lab end in failure, right? And so, you have to get used to that. So, as Jen will say to those in her lab, "We're gonna fail all day. Failure is part of science," right? And I would add it's part of progress in any field, whether that be sports or music or any area of research, R&D, even just getting better at something that you want to master.

But Jen, in a way, is part of what I'm going to call the failure fad. And this guy really epitomizes the failure fad. This is David Kelley. He's a Stanford engineering professor and co-founder, and then longtime CEO, of IDEO, arguably the world's most celebrated innovation consultancy. And David will quite literally say, "Fail often to succeed faster," to any team that's working on anything in his lab. So I call this, Jen, David, they're innovators, they're scientists, the happy talk about failure. And it's extended beyond the work environment, right? Now, forgive me for the language on the next slide, but there's this thing taking over around the world: in more than 200 cities across more than 62 countries, people are getting together in communities to share their failures. I'll call them failure nights. And they're very popular, believe it or not. I mean, I'm not so sure. But anyway, you can see this sort of ongoing enthusiasm that has sprung up around failure.

Not so fast, right? There's at least some of us thinking, "Hold on here, I live in the real world." And in that real world, our instincts, our training, our schooling long ago led us to know, for sure, deep down: failure's bad. We start to learn this in grade school. We wanna be right, not wrong. We wanna be successes, not failures. And so we have an almost spontaneous emotional aversion to our own failures. By the way, less so to other people's failures. Ayelet has done some beautiful experiments that show this. In fact, she and her postdoc, Lauren Eskreis-Winkler, did these marvelous studies that showed how very hard it is for us to do the thing our teachers, our parents, and everybody have long told us to do, which is to learn from failure. It turns out that's not so easy to do. So, we've got this spontaneous emotional aversion. We don't like failure; we tend to look away, not necessarily to dig in. And I think part of the challenge lies in the fact that we're under-nuanced, right? We don't have a good vocabulary to separate the good failures from the less good, and, in fact, a good vocabulary to learn from all of them. And our organizations, where I do most of my work, often make it worse, right?
So, I had the chance to meet with some financial services executives, big firm, a few years back, and they said, "Okay, we love your work, the failure stuff, the psychological safety stuff. But now," they said, "with the uncertainty and turbulence that lies ahead, now it's very important that everything go well." (audience laughing) Right? I had that reaction myself. So, I sure empathize, but of course, it's wishful thinking. In uncertain and turbulent times, failure is more likely, right? And worse, innovation's more important, more essential; problem solving's more important and more essential. So, what I wanted to gently, and did try to gently, explain is that when you either explicitly or implicitly declare failure off limits in an organization or in a classroom, it's very effective, but not at producing perfection. It's effective at making sure you don't hear about it. It's effective at making sure that people don't speak up about reality, about what's happening. So things will go wrong, but you will not necessarily know.

So, we could talk about this, I sometimes do, as two failure cultures: the happy talk about failure, the Silicon Valley talk, and the, maybe, organizational business culture. Which is right? Obviously, not the right question. A better question is this. And I love the theme and the connection to think better with this event, because, really, the question we should be asking ourselves is how must we think to be effective and to thrive in an uncertain, changing world? So, that's where I wanna go.

For example, let me illustrate a simple mental model that most of us were brought up with that doesn't serve us very well in an uncertain world. Most of us believe, at a pretty deep level, that when people try hard, work hard, they'll get to success. And when they mail it in, put in low effort, that will be a precursor of failure. Which is, of course, often true. And, like it or not, incomplete. Like it or not, sometimes people work really hard, just ask Jen Heemstra, and they get a failure, in their lab or elsewhere. And sometimes, like it or not, people just mail it in and get lucky and get a success. So, at the very least, we need to understand that the relationship between effort and success or failure is imperfect. Imperfect at best.

Now, I think it might be helpful at this point to share a couple of definitions, because here's another pet peeve I have: people use the terms mistake and failure interchangeably. I'd like to change that. So, here we go. I'm gonna define a failure as an undesired outcome. Pretty broad, I know; it encompasses a lot of territory. And I'm gonna define a mistake as an unintended deviation from an existing procedure, rule, policy, or knowledge. And unintended, of course, matters for it to be a mistake. Now, some mistakes do produce failures, but fortunately, not all of our mistakes produce failures. And some failures, and this is important, are not caused by mistakes. They're caused by a variety of things that we'll dig into. So in fact, what I do in this book, and what I've done really very slowly over the years in my research, is identify three archetypes of failure that have very different implications: basic failures, complex failures, and intelligent failures. And Ayelet already referred to this term, which, by the way, I got from Sim Sitkin at Duke.
I read it in a 1992 paper, and I thought, "That's a great way to put it." So, a basic failure has a single cause. Could be human error, could be something else, but one single, simple cause leads to a bad outcome. Complex failures are multi-causal. A handful of factors come together in just the wrong way and produce an undesired result, where any one of the factors on its own would not lead to the failure. It's the perfect storm, right? Neither of those is, arguably, good; they both provide considerable room for learning, but they don't necessarily help us make progress. The third kind does. The third kind I call the right kind of wrong. This is an intelligent failure, which is still undesired: the undesired result of a thoughtful foray into new territory. And I'll give you an even more specific set of criteria in just a moment. But let me illustrate one of each, just to bring the point home.

So, a basic failure. Back in 2020, an employee at Citibank made a simple mistake in an online form and accidentally transferred $900 million to a group of lenders. Essentially, they were supposed to be transferring the interest, and they transferred instead the principal. (audience laughing) So, I do wanna make the point: basic doesn't mean small, right? (audience laughing) And it gets worse, because there was a subsequent, as you can imagine, legal action, and a very controversial finders-keepers ruling made it not possible for Citibank to get the money back. Okay?

And my second one, complex failure. It's far away and long ago, so you may not have heard of this, but this was Britain's worst oil spill. The Torrey Canyon, in 1967, hit a reef off the Isles of Scilly at full speed, tearing open cargo tanks and leading to the spilling of 13 million gallons of crude oil. The technology for cleaning it up was much less well developed than it is today, and so the things they tried made things even worse. It was really a catastrophic failure. What caused it? Well, let's take a look. Time pressure. Ah, well, that's okay, then. We just eliminate time pressure from our lives, and we'll be able to avoid... no, just kidding. There were favorable tides: if the ship didn't make it to harbor by the next day, it would be another week. And of course, time is money, so they were in a hurry. The captain had stayed up late. Again, just don't stay up late, you'll be fine. Well, not so fast. Some ocean currents took the ship slightly off course overnight. The ship, I don't know if this really matters, but it's one of the facts that I will tell you, was missing its standard maritime manual. The first officer changed course slightly without permission. Two lobster boats appeared suddenly on the horizon out of the fog. And a small mechanical problem in a steering wheel inhibited full rudder responsiveness. As Captain Rugiati said in the trial about this catastrophic event, "Many little things added up to one big disaster." That is quintessential intelligent, I mean, complex failure; that's sort of the very definition of it. And I guess I'll say this now: complex failures are quite discouraging, because they're on the rise in our very complex, interdependent world. But the silver lining is that, quite often, all you really need to do is catch and correct one element. These are all small deviations from perfect, from best. If you can change one of them, you can often remove the failure, so there is that opportunity.
Okay, and last but not least, one of my favorite intelligent failures, and you'll see why later, was a phase two clinical trial done by Eli Lilly for a very promising cancer drug. It had gone through the safety trials, so it was deemed safe to use. And now we get to the efficacy trials, where the drug is given to patients with a particular cancer and others are given a placebo. And it's just enough patients to have enough statistical power to show whether it works. Let's fast forward, get the data. And lo and behold, the data are not promising; the drug has not made a significant difference in the lives of the patients who were given it. So, it's a failure, right? I promise to come back to this one, 'cause it has an unusual ending.

To make clear the idea here of an intelligent failure, I'll give the criteria for intelligence in just a moment. But I love this old quote from Thomas Edison. It's probably apocryphal, but a lab assistant comes up to the famous inventor and says, with empathy, "It must be so hard for you, with all these failures." And Edison says, "Failures? I haven't failed. I've just found 10,000 ways that don't work," right? So, Edison understood the right kind of wrong. Failures are the stepping stones along the way to progress.

So, here are the criteria that I wanna articulate, because it's very tempting, now that you know there are intelligent failures and not-so-intelligent failures, to just call your failures intelligent, and that'll be great, right? Well, not so fast. So: it has to be in new territory. If you can look up the answer or look up the recipe, please do. This is a recipe for learning in new territory. You're in pursuit of a goal, a credible opportunity to make progress in something that you care about. And you've done your homework: if you're a scientist, you've at least read the literature; in some other realm, maybe you've talked to people who might know something about this. You've taken the time to articulate what we might call a hypothesis that is worth testing. And then finally, the failure is no larger than necessary to give us the new knowledge. So, the clinical trial run by Lilly was just big enough, just enough patients, just enough expense, to find out what we needed to find out and no more. Okay, what's the half criterion? You probably can guess, right? Take the time to learn from it. You have paid for it; get your money's worth. Just dig in and use those lessons. We can talk more about that if you want.

So, in the domain of science, of course; in the domain of inventors, innovators, absolutely. Innovation is one of the places where intelligent failures are the bread and butter, are necessary. So, let me tell you about an innovation story that I had the good fortune to study 20 years ago. It's one of the Baby Bells, I won't tell you which one; we'll call it Telco. They were a very good, very well run operation. They had high customer service ratings. And in the mid-'90s, their R&D group had developed a new technology, we'll call it NewTech, for higher-speed internet access for customers. And customers very much wanted that. They did a small, very well staffed pilot in the suburbs. And it was such a strong signal that the executives decided it's time: we will roll this out in the big city, for our entire market. Guess what happened? This is a talk about failure, after all. So, this was in fact a failure.
The NewTech division missed 80% of its commitments. Fifteen thousand orders were late; 500 customers at any given time were waiting to hear about some aspect of their service. Customer satisfaction ratings, when you're innovating, should drop a little, right? 'Cause things aren't perfect. They went from the low 90s to 13%. Employee morale suffered. Oh, and marketing had been in the middle of a major rebranding campaign, right? So, this was not just a little bump in the road. This was a fiasco; it was expensive. It took months and months and many dollars to dig themselves out. So, big failure. I'm going to label it not fully intelligent. We checked the first three boxes, no problem. But this failure was much bigger, I will argue, than it needed to be.

So, why didn't the pilot prevent the failure? That's its job. And if I were to ask you that, I suspect you'd very quickly say it wasn't representative, right? You're right, it wasn't. So, deeper question. I mean, my playful answer to this question is: the pilot didn't prevent the fiasco because the pilot was a success. I like to say the pilot failed by succeeding, when it should have succeeded by failing. Does that make any sense? Yeah, okay. So, in reality, your pilots should fail; good pilots should fail. And maybe a way to structure that is, can you say yes to these four questions? Number one: is the innovation, the new idea, being tested under representative circumstances or idealized ones? Two: is it clear that the goal is to learn? Or is the goal to show the higher-ups how great our thing is? Three: is it clear that compensation is not linked to a successful outcome of the pilot? And finally: were any changes made as a result of the pilot? Because if not, you probably didn't learn anything. So I'm gonna go out on a limb here and say that the NewTech fiasco was not the right kind of wrong. Good pilots bring intelligent failures. The way they thought about this is: we test it in our beautiful, rigged-to-succeed pilot, and then we roll it out. We should not be rolling new things out, we should be cycling them out, right? Every test gives us some little failures that we can quickly fix, quickly learn from, and then we go forward and discover lots of nice new failures.

So, I love this story because it's not unique, and it just makes the point so clearly. You would think this would be obvious, because it is intellectually obvious. So, why does it happen? Of course, it traces back to the incentives and the culture and what's acceptable. And if it's not acceptable to fail small, then you're doomed to fail big, which is essentially what those executives at the financial services company needed to understand.

So, another example of a failure, as you will soon see. Ray Dalio is an alumnus of Harvard Business School. He graduated in 1973, and in 1975 he started his own investment firm called Bridgewater Associates. You may have heard of it. By the early '80s, he had been so successful that he was frequently a guest on the business television programs, asked to comment about the economy or the stock market or both. And he was particularly proud of his ability to predict long-term trends. In 1981-'82, Dalio started to get very concerned about a number of economic indicators that he was seeing. And he was convinced that the economy was about to go into a recession. And so, he bet everything that Bridgewater had, everything he personally had, on that prediction. And he was wrong.
I don't know if you remember, but the US economy started one of the longest growth spurts in history in about 1982. So, Dalio lost everything. He said he had to borrow money from his dad to pay his rent, pay his family's bills. So, intelligent failure? Well, no, not quite. Again, it's so often the fourth criterion that gets violated. The rest, yes: investing is always new territory; yes, the goal was clear, he wanted to make a lot of money; and he had good reason to believe that he was right. But no, this was a little too big. So, to his credit, Dalio says, "In retrospect, that failure was one of the best things that ever happened to me. It gave me the humility I needed to balance my aggressiveness and shift my mindset from thinking I'm right to asking myself..." Now, wait for this, because this is the smallest mindset shift in history, but it's very profound. Shifting from thinking "I'm right" to asking myself, "How do I know I'm right?" (audience murmurs and laughs) But wow, right? It brings in just enough curiosity and humility to go forward. So, I have a whole chapter in the book about the sort of self-awareness and the need to cultivate curiosity, cultivate humility, so that we can in fact navigate more effectively in an uncertain world.

But what this is really all about, for Dalio, for all of us, is finding a way to take the intelligence, truly, the intelligence of intelligent failure to heart, to believe it, to feel it, so that you aren't that little girl in that picture thinking, "Yeah, I get it, but, really, emotionally, I can't do it." We need to appreciate that in new territory, and some portion of any given day is new territory for all of us, the only way to make progress is through trial and failure. Now, notice I don't use the classic term trial and error, because I think that's a misnomer. Error means you should have known already, but you made a mistake. I'm saying in new territory, we're making progress through trial and failure, and learning, and iterating. So, of the three types, only intelligent failures are technically not preventable. We cannot prevent them. We can mitigate them, we can keep them small, but we can't prevent them. But they are avoidable, right? Am I contradicting myself? No, of course not. What am I talking about? We avoid them by not ever getting out of the comfort zone, by not trying new things. And that is not a recipe for success. So I couldn't agree more with Ayelet that this is really a book about success, even though it seems to be a book about failure.

Very recently, I got this beautiful image from Nikki Macklin, who is a nurse and PhD student in New Zealand. And she said, "I wanted to share this with you. I've been taking a ceramics class, and I've been learning to throw the clay. I made six beautiful bowls, and then I got a little cocky and I wanted to go for one with really beautifully delicate thin walls. I'm making it, and then all of a sudden, thunk, part of it hit the edge, and this happened. And along came the instructor, and she said, 'Right, into the bin it goes.' But I decided I wanted to keep it. I decided that I rather liked its ugly shape, and now I have a place to store my river pebbles, which are actually quite beautiful. I call it my 'Amy is right' bowl." So, Nikki understands the right kind of wrong better than her teacher, apparently. New territory? An opportunity to learn this wonderful new craft.
Good reason to believe it can work? Sure, if you get it thinner. And is this a small failure for Nikki? Absolutely, especially given her day job. So, I saw this on the internet recently and thought, "Yeah, this sort of captures it." This isn't failure, missing the bullseye, right? This is failure, leaving the darts on the wall. Playing not to lose: I don't wanna show you that I'm bad at darts. This is playing to win; this is giving it a try. Eleanor Roosevelt said, "Do one thing every day that scares you."

I mentioned that I have a chapter on us as people. And that's actually where I talk about Ayelet's research, because we do have these challenges that we must overcome to do well. And then by the end I say, here are some of the things we need to do well as individuals to navigate uncertainty, to cope productively with failure, to learn from it, to not be afraid of it. Essentially: persistence. Find the 10,000 ways that don't work. Then, reflection, making it a habit. Making it a habit to be systematic and thoughtful about learning, documenting, learning as much as you can from successes and failures, and really being rigorous about it. Accountability. To many people, that's a bad word; it means punishment. It doesn't. It means taking account, telling the story, finding ways to understand, in a deep way, your contribution to the things that don't turn out the way you had hoped. Sins of omission and commission alike, big and small, and just being willing to take a clear-eyed look at them. And, of course, now and then, an apology comes in very handy. The art of a good apology is really important, but an apology exists basically to repair a relationship. And when you issue a sincere apology, it shows that you are putting the relationship ahead of ego. It's not an easy thing to do, but essentially, an effective apology is one where we accept responsibility, we take account for our role in it, and we express genuine remorse or concern. We promise to do better next time, and it shows that we care.

So: think better, act better. I think thinking better is really the place to start. This shapes our behavior. And part of thinking better today is getting a new idea of what excellence means, what success means in an uncertain world. And so, I am in fact a huge fan of doing everything in our power to minimize basic failures, to get them to be as few and far between as humanly possible. Many of the practices that do that are training and teamwork and error-proofing in operations settings and so forth. I've written a lot about this, the boring stuff, but it's actually really important and powerful. And I'll show you in a moment, there is real value in that. Anticipate: excellence today means anticipating the ever-present possibility of complex failures, and doing what we can to mitigate them as well. And then, being willing to promote and celebrate the intelligent failures that come along. Doing all three of these things well actually requires us to be very aware of the ever-present potential for error. And it requires us to be willing to experiment. And this is where psychological safety comes in, a call for psychological safety, which I'll define for you in a moment. Psychological safety makes it easier for people to speak up. If they see someone else about to do something that might cause harm, they can speak up about it. If they have a question, if they aren't sure what to do, they can speak up about it. That's what psychological safety is.
It also makes it easier to take smart risks in a work setting where you might otherwise worry about what others would think. So, what is psychological safety? I'll give you my formal definition. It's a belief that the context is safe for speaking up with ideas, questions, concerns, and yes, even failures. Not that it's easy; I don't think it's ever easy to do those things. But you believe it's expected, you believe it will be welcome. Psychological safety's gotten a lot of attention in the last few years. And with a lot of attention comes a lot of misunderstanding, I guess I'll put it that way. So, I wanna be clear again: psychological safety is not being nice. Being nice is often code for "don't say what you really think, it wouldn't be nice." It's not being comfortable; I think learning is never comfortable, but it is necessary for progress. It's not the same as job security. And it's certainly not lowering performance standards. These are two different dimensions. We can uphold the highest standards for performance, we can inspire and engage and enable people to do the very best work they can, and we can make it safe for people to speak up. And that's where learning happens, and that's where high performance happens in an uncertain world.

So, my finance colleagues at Harvard Business School are always talking about ratios, and it makes them sound very smart. So, I'm gonna talk about ratios too. And the way I like to talk about ratios is I'll say: this week or last week, how much of what you, managers, family members, were hearing is agreement versus dissent, or progress versus problems, or all's well versus I need help, or, fundamentally, success versus failure? Now, if what you're hearing in a company setting is largely green, I guarantee you've had a better week, right? But I also guarantee it may not be the full story. So, start to listen for the red. And this is of course the direct tie to psychological safety: we need to make it safe for the truth, and we need to make it safe for help seeking and so much more.

So, I'm gonna do a brief little diversion into what I call the underappreciated side of failing well. Failing well is in the subtitle of the book, "The Science of Failing Well." And when we think about that, we think about the fun stuff, the innovation, the experiments that don't end the way we had hoped. But what about prevention? I think prevention is every bit as important to value creation as innovation, meaning, prevention of bad things that are preventable. You may know this story, but it really is one of my favorites. It's the story of Alcoa, quite a few years ago. This is Paul O'Neill. He was appointed by the board to be the brand-new CEO of Alcoa, the aluminum company, back in October 1987. And as the brand-new CEO, he came into a room, I imagine it almost looked something like this, a room of investors in a hotel ballroom near Wall Street, and started his presentation to these important investors. And he said, "I want to talk to you about worker safety." The room starts to look a little puzzled at this point. He goes on. He says, "Every year, numerous Alcoa workers are injured so badly that they miss a day of work. Our safety record indeed is better than the general American workplace, especially considering our employees work with equipment that can tear a man's arm off and metals that are 1,500 degrees. But being okay, being average, that's not good enough. I intend to make Alcoa the safest company in America.
I intend to go for zero injuries." Hands shot up, right? They're gonna give him another chance. They say, "Well, what about capital investments? What about profitability? What about geographic expansion?" And he said, "I don't think you heard me. If you wanna understand how Alcoa is doing, you're gonna need to look at our safety figures. If we bring those down, it won't be because of cheerleading. It will be because everyone in the company will have devoted themselves to a habit of excellence." Right? Actually, my favorite part of this story is that one of those big investors, no cell phones, right, ran to the payphones and told his big clients, "Sell Alcoa now." The board has put a crazy hippie in charge, and you better sell before everyone else does, if you wanna get your money out. Okay. So, of course, that's not the end of the story. Essentially, what O'Neill did was basic blocking and tackling. Speak up when you see something unsafe: he gave every employee his home phone number to call if someone asked them to do something unsafe. He made every manager audit the safety practices, understand which processes were in control and which were not. And they just got out ahead of it.

Now, is there value in that? Well, let's take a look at the results. So first of all, the lost-workday injuries during his tenure go down, down, down. The pink is the stock price, so it looks like there might be money in it after all. If you were to ask how well served were those investors who were told to sell the stock, the answer is: not very. By the time O'Neill retired in 2000, the annual net income of Alcoa was five times larger. The market cap was up by $27 billion. If you had invested a million dollars in Alcoa the day he was hired, you would've earned another million in dividends, and the value of the stock would've gone up five times. So, what did he understand that the investors didn't? If you can get injuries down, you're getting your processes in control and capable, which improves quality, which improves uptime, which improves morale, which improves so much more. You become a better run company, 'cause it's a continuous-process operation, all by essentially eliminating basic failures.

Complex failures. I'll try not to go into too much detail on this one, but this is one of those studies that I did in great depth from public sources: the tragic 2003 Columbia shuttle accident. Columbia reentered the earth's atmosphere on February 1st and completely combusted. Now, eight days earlier, an engineer named Rodney Rocha had looked at launch videos and seen a tiny speck. I mean, I've seen the videos; you'd be hard pressed to say what it was. But he had an intuition that it might be what's called a foam strike, where a piece of insulating foam dislodges from the external fuel tank, and it might have hit the more delicate skin, the leading edge of the wing, on the shuttle. And if so, it might have been quite dangerous, made a hole that might preclude the mission from coming back safely. His boss essentially told him to drop it, that foam strikes were common and they're just a nuisance, just maintenance issues. And now, we fast forward, and there's a mission management team meeting on day eight of the mission. And it's the only time in the 16-day mission that the foam strike issue is formally discussed in that meeting.
And Rodney watched quietly from the perimeter of the room as the higher-ups discussed it, dismissed it, and the rest was history. So in a way, this accident is one of these classic complex failures. It unfolds over time; the whole history of the shuttle program, little bits, kind of gave rise to this possibility. It's preceded, as they so often are, by subtle warning signs. And there were multiple opportunities for prevention that were missed. So, I've written a lot about this, but I'm quite passionate about preventing complex failures as often as we can as well. And some basic practices are these. Call attention to uncertainty: as a leader of a team, of a company, keep reminding people that we don't have a crystal ball, that we need to stay vigilant, we need to be communicating with each other constantly. Amplify weak signals: that's what Rodney was trying to do. And amplify doesn't mean exaggerate; it means just give it a voice. Let's listen to it long enough to know whether it is or isn't an actual signal of harm. And routinely shift to the exploratory mode, the what-if. Okay, what do we know? What do we not know? What are the implications of the answers to those first two questions for the default plan? You'd get somewhere very different, very quickly, in that mission management team meeting. And often, with the students in my classroom, I'll take the transcript and say, "Fix it," right? Have this conversation in a thoughtful, exploratory way. It doesn't take any longer; by the way, it could be even a little shorter, and come to a better answer. And that conversation involves actual inquiry in a way that the real one didn't.

Being aware of the things that can go wrong is really important. Calling attention to uncertainty and fallibility: Ben Berman is a Harvard alumnus who, early in his career, worked at the NTSB. He read all those black-box data when accidents happened in planes; learned a lot about it. And then he went on to become a pilot at United Airlines. And he said that every time he had the privilege of leading a new crew, which is a roughly weekly occurrence, he would open the briefing meeting as follows. "I've never flown a perfect flight," he would say, "and it won't happen today either. I need to hear from you." You can see the psychological safety there, right? But what he's saying is: do not mistake this activity we're about to do today for something you can do in your sleep, something routine. It's variable. Things can happen, and I need you. And then Ben, as I said, he is a Harvard alumnus. He did admit to me in the interview that it really bothers him that he has never flown a perfect flight, (audience laughs) but it is apparently the truth.

So, call attention to context. I think this is something we all know intuitively, but we don't dwell on it enough. The context can roughly be described by how much uncertainty there is, from consistent contexts like an aluminum plant, through variable contexts like passenger air travel, all the way over to novel contexts like Jennifer Heemstra's laboratory. And the stakes can be high or the stakes can be low. When the stakes are high in a consistent context, like a high-speed train: mindful execution. You don't have to reinvent the wheel, but you have to be vigilant. In variable and novel contexts with high stakes: very thoughtful, very cautious, and very careful experimentation, if at all. And then down here, when the stakes are low and novelty's high: have fun, right? Make a new thin bowl, it's okay.
And over here, this is business as usual. I won't go so far as to say do it in your sleep, but it's okay to relax a little bit. So this is, sort of, practice caution when it's needed and have fun experimenting when the stakes are low. That sounds obvious, and intellectually it is. But what I'll often do in my classroom with my MBA students, executives as well, is this team exercise called the electric maze. Have any of you heard of it? Done it? So, you can see what it is: it's a big gray rug, a nine-by-six-foot rug with a grid of gray squares. Some of those squares, when you step on them, emit an annoying beep sound. And some of them, when you step on them, stay very quiet. The team's job in the exercise is to find a path of non-beeping squares from the beginning to the end of the rug, of the maze. I'll show you what I mean. And I'm gonna give you the answer key, but then I have to kill you. But there it is, right? (audience laughing) You can see it's a little tricky; there are some dead ends. No way to solve this except trial and failure. There's no way to tell by looking at it, sniffing it, anything else. You just have to step.

By and large, the teams usually fail. I give them 20 minutes; they usually fail. And this photo tells you why. This is a photo of a student on what I'll call the front lines, right? We've not been here before. And she's hesitating, and that's what they do. I call it the stork position. They're not stepping. Minutes, well, not minutes, but seconds are ticking by. And time is really of the essence here; time is precious. So, they're not stepping. So then, in the debrief, I say, "Why didn't you step?" No, I didn't say that. I say, "What were you thinking when you were on the front lines?" No one's been there before. What do you think they said? "I didn't wanna make a mistake." First of all, it's not a mistake, right? It's a new beep. No one knows; it's just data. They also will sometimes say, "I didn't wanna let the team down." And guess what happens, by the way, when you step on a beep? What does the team do? Oh. - [Audience] Oh. - You got it. Step on a quiet square? (Amy claps) Right? So, the self-talk and the team reaction are all against innovation. I do this for them 'cause I want them to have this emotional recognition that innovating is hard. Nobody wants to collect the beeps going forward. Have fun experimenting when the stakes are low; reward beeps going forward. If I gave them the chance to do it over, they could do it pretty readily, even if you change the pattern. Because they realize the way to do it is just to collect beeps as fast as possible. Find out where they are; it's all data. And I love the metaphor: reward beeps going forward.

So, the fun part. A little bit of time here on the fun part, and then we'll get to some questions you might have. How do we ensure that people get the beeps going forward? And I'm gonna say three simple things. Number one, set the stage: convey an inspiring purpose. I won't go into detail there, but you know what I'm talking about. Call attention to context: where are we? What does it suggest about caution or fun? Create space, right? Create space for experimentation. Create safe spaces where people can try things. Make sure you're clear about the difference between preventable failures and the intelligent ones.
When I say create space, that doesn't necessarily mean a physical space, but structured opportunities. You can just use the criteria as a kind of scaffolding to go explore: okay, what's new and unknown about this? What positive outcome makes the risk feel good, or feel okay, anyway? What do we already know, and how do we keep it small? And how will we in fact learn from those results? So: set the stage, create space to explore, and then finally, reward and reinforce. Celebrate the beeps going forward. Share news about failures as widely as possible. Reward speaking up about mistakes and failures. Just change the culture so that this is something that we tend to like to do. Now, I just changed the heading there, to how leaders support innovation; it no longer has failure in it. Should you go so far as to have failure parties? Or is that a bridge too far? (audience laughs) Okay. I'm gonna go out on a limb here and say yes.

So, now I'm returning to Alimta, that Eli Lilly cancer drug. The chief scientific officer at Lilly did in fact hold a failure party for this hardworking team. Why? Why might that in fact be a good idea? Well, to begin with, it rewards and celebrates the hard work. It was not their fault; no one could have known in advance that this would not work. Number two, if you have a party, guess what? People show up, right? (audience laughs) So, that is a wonderful device for spreading the knowledge, because if your organization has the same intelligent failure a second time, it's no longer intelligent. And in a funny and subtle way, it encourages people to call a failure in a timely way, and to stop throwing good money after bad. Now, the Lilly story, the Alimta story, has a happy ending, because the physician in charge of the trial dug into the data. Of course, one does; one must learn from it to be intelligent. The physician discovered that many of the patients given Alimta went into full remission. It's just that many of them didn't, and so the overall statistical power was not there. Next question: what's the difference between those two populations? And it turned out that those who did not have a positive response to Alimta had a folic acid deficiency. That's a B vitamin. So, all they had to do was add the B vitamin to the Alimta regimen, which they did. And Alimta went on to become a multi-billion dollar drug and, more importantly, to save many, many lives. So, I can't promise you that every time you take the time and effort to learn from a failure, it will automatically convert into a success. But I can promise that if you don't, it won't, right? Got that? Okay.

More recently, I had the chance to study this wonderful culture change at Microsoft Western Europe; that's 14 Western European countries. In the summer of 2021, the group missed its revenue target by a very large margin, culturally not appropriate at Microsoft, if you know anything about them. They had a brand-new president, Cindy Rose, who had been president of Microsoft UK. She had never before missed a revenue target in her Microsoft career. And she decided to hold a failure party. Actually, she was quite anxious about this. She thought it might seem weird and not the right thing to do, but she decided to do it anyway. And as she described in my class, when she came last spring, "We couldn't stop the conversation. It was, to this day, I think, one of the most lively conversations that the leadership team ever had. It was very releasing, very empowering."
"And the power in admitting that none of us is perfect, that we all screw up, to acknowledge and share where we'd screwed up and reflect that we each survived and were better for it." That was very powerful. The team is made up of the 14 country managers as well as Rose. And one of them installed a failure wall in the office. When he first put up the wall, no one wrote on it. But after a few weeks, eventually, somebody wrote on it, and then somebody else, and then it became a kind of habit, a kind of ritual, a way of processing and sharing intelligent risks that failed.

But for some, this is still a bridge too far. I interviewed a guy named Jake Breeden at Takeda Pharmaceutical Company, a Japanese company with a long history. He said, "I tried, but they just said no. So, if celebrating failure is too much, then I'll simply celebrate the pivot." So Jake at Takeda says, instead of "We made a plan and it failed, and here's the moral," it's a narrative about change: "We made a plan, things didn't go as planned, and we pivoted." Now, you might think, that's just language. But language matters, of course. And the reframe, I think, really is something. It focuses us on where the story goes next. It gives suspense rather than shame. So, go ahead, celebrate the pivot. As Jake says when they're doing this, "Here's what we're gonna celebrate. We're gonna celebrate the fact that our signals are so finely tuned that we picked this up before we heard it from anybody. We're gonna celebrate the fact that we remain committed to this clinical area. And we're gonna celebrate the fact that we don't have all our eggs in this one basket, and off we go."

So, I'll close with a leadership playbook here. Thriving, I think, starts with thinking better. We must acknowledge fallibility as a given, our own and our organizations'. We must understand that there are different types of failure, and assess the context routinely for uncertainty and stakes. And then, of course, be very religious about following practices for failure prevention, especially in familiar territory. But take smart risks: make sure people have permission, resources, and space to take smart risks, knowing that some will fail, hopefully intelligently. And build a healthy failure culture by setting the stage, creating space to explore, and rewarding and reinforcing the beeps going forward. So, what questions do you have?
Staff: We have a couple of mics here in the back of the room, so if you've got questions, feel free to throw your hands up and one of us will come find you.
Question: Thanks a lot for the talk. What was the most surprising learning you had while researching and writing this book?
Amy Edmondson: I suppose, because I was talked into writing this book by an agent. I had written articles, some academic, one in HBR, and I thought, "I've said everything that I need to say," and I was talked into it. I mean, I went into it not begrudgingly, but just a little bit like, I don't know if there's really anything to say. So, truly, the biggest surprise was how many of the people I interviewed or interacted with didn't have an appreciation for the, to me, obvious distinction between good failure and bad. As a result, they and their employees were losing out on the opportunity to practice the right behaviors for progress in new territory and for excellence in familiar territory. So the surprise was that people didn't already know this. I kept worrying that what I had to say was gonna be, "Yeah," "Duh," but it just didn't seem to be. And it may be because the difference between our intellectual appreciation for these phenomena and our lived experience remains great.
Question: So, thanks for your talk. I ran an innovation camp for young adults, and there was a glorification of failure, like it was a notch in your belt. And it started feeling like participation awards, people claiming failure. How do you make sure, yeah, that you actually retain some of the value in the failure?
Amy Edmondson: I wanna be really a stickler about this, right? And point three really means people need to be able to articulate that hypothesis in advance. I don't want people just throwing darts every which way and going, "Ah, I didn't hit the bullseye, so I get an award." No. Point three really says you are trying as hard as you can to get a desired result. And you missed, and you missed for reasons, not because you were lazy or didn't try or didn't think. You missed because nature or science or the market or whatever had a different idea in mind, and no one could have known that in advance; you didn't have a crystal ball. So, the kinds of failures that I think we absolutely want to applaud, at camp, in organizations, are the kinds where truly people were trying to get it right, but they understood they were facing uncertainty, and they understood there was a risk that they would be wrong, and they were willing to live with that risk. So, it's just about being really a stickler for the limitations here.
Question: In each of the types of failures that you identified, there is a sense of reflection at the end, to first understand lessons learned. I'm wondering if you think there is any such thing as unproductive failures versus productive ones? And if so, how do we prevent those, (chuckles) if at all?
Amy: Well, absolutely. I think the Citibank failure was enormously unproductive. I think the Columbia shuttle failure was unproductive. I think the Torrey Canyon was unproductive, and there are many more. And so, for each of the three types, I really do feel like, this is so boring, but I had to do it anyway, I take the time to inventory best practices for prevention. For basic failures, it's everything from Toyota's Andon cord, which allows people to speak up: it might be a problem, it might not be a problem. It allows them to call attention to it, to dig into it. And 11 outta 10 times, it's actually not a problem, but they keep on coming. So, I take the time to inventory what we use in safety science and in manufacturing, practices like checklists, which you might use to pack your suitcase so you don't forget socks or something. There are plenty of practices that we should absolutely love and embrace to prevent unproductive failures.
Question: Just a small kind of logistics question. How do we download these slides? These are really good.
Amy Edmondson: Oh, thank you, I don't know. These guys have them. They're big, but we can make a PDF, and I'm very happy to have you download them. We'll make sure you can get 'em.
Question: Hi. Thank you for your presentation. By any chance, in your research, did you have the opportunity to compare how different cultures perceive failure, whether the culture has an impact?
Amy Edmondson: It's a great question. And the answer is, I don't have systematic data on perceptions of failure across cultures, although I am aware they exist. I have more systematic data on psychological safety, which is related, across cultures, and it tends to be lower in higher power-distance cultures. But means aren't everything, right? For example, Japan's a high power-distance culture, but Toyota is remarkable at making it possible to speak up about failure, to engage fully. In some cultures more than others, you will have work to do to get people to pursue intelligent failures. But you can still do it. And you can do it by setting the stage, which is reminding people: we will not be great tomorrow, in the future, if we aren't able to innovate today. And the only way we can innovate today is if we're willing to experiment in smart ways, behind closed doors or what have you. So, you help people fall in love with the idea first, and tie it to their company's future success, and you can make it happen. So, for every cultural difference, there's variance, there's noise, within as well as across. Yeah.
On Wednesday, October 30, 2024, Amy Edmondson (Harvard Business School) joined the Roman Family Center for Decision Research to present her Think Better talk, "Right Kind of Wrong: Thriving in an Uncertain World."
Drawing on both research and real-world examples, Edmondson provided a novel way of thinking about failure and practical tips for learning from missteps.
Download the slides from the presentation, or watch the recording above.
Upcoming Think Better events:
- February 19, 2025: "Unforgiving Places: The Behavioral Science of Ending Gun Violence" with Jens Ludwig, Pritzker Director of the University of Chicago Crime Lab
- May 7, 2025: "How to Make Decisions that Work Best for You and Your Family" with Emily Oster, Professor of Economics at Brown University and CEO of ParentData