Professor Jane L. Risen helps students navigate common cognitive biases so they can make decisions as savvy managers.
- October 10, 2017
- Classroom Experience
In many disciplines—financial accounting, for example—if you try to practice without any sort of formal education, you could very well end up in jail. But when it comes to decision making, everybody is making personal and professional decisions all of the time without any formal guidance. This class is designed to provide that: a framework to actively recognize when decisions are likely to go wrong so that you can identify what you might be able to do to make them better.
We enter the space of, “How do people actually approach decisions?” As it turns out, people don’t typically do so in a systematic or rigorous way. I help students not just recognize and diagnose a decision-making bias, but also tie a so-called “treatment” to that bias. I’m pushing them to ask: What do you think is causing the bias? How would you fix it? How is your solution addressing the problem you’ve identified?
You can’t function without mental shortcuts. There’s no way you can comprehend the world around you at the appropriate speed without being able to rely on general rules of thumb.
For example, answer this question: Are there more words that start with the letter a or the letter x? You can probably think of a lot more words that start with a, so you’d say “a,” which is correct. You just used the availability heuristic—the idea that things that come to mind more quickly are more common.
Now, answer this: Is it more common for the letter r to be the first letter of a word or the third letter of a word? It's actually the latter, even though you probably said the former. That's because we sort words by first letter, so words that start with r come to mind more easily. In this case, you can see how the same heuristic led us to a wrong answer. Although heuristics are typically quite useful, they can also lead to systematic errors and biases.
I hope to impart to students these types of insights—how people naturally and spontaneously process their world. Only when you pause to think about those decision-making behaviors can you start to ask, “Is there anything I can do better than what I’m already doing?”
A few cases we study are not business related at all. For example, the first case we examine is a medical decision about whether a person who had an aneurysm should get surgery. Students read the perspectives of both the patient and his wife, and then they rate each person's decision-making process. I graph the students' answers and they're all over the map. Some students think both did well, while others believe both did terribly. Some think only one person made a good decision. It starts a conversation in the class, because people have completely different points of view.
In one exercise, students make judgments, and then express how confident they are in their answers with a 90 percent confidence interval. (For example, “I think there’s a 5 percent chance that the real answer is lower than X and a 5 percent chance that the real answer is higher than Y.”) Over 20 questions, two answers should be outside the range. But on average, about 10 answers are outside the range, because our intervals are much too narrow. It’s a classic demonstration of overconfidence: we tend to latch onto our best guess and fail to see all the ways it could be wrong.
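The arithmetic behind the exercise is straightforward: with 90 percent intervals, each of the 20 answers should have only a 10 percent chance of falling outside its range, so a well-calibrated judge would miss about two. A minimal sketch of that calibration check, using made-up interval data rather than the actual classroom questions, might look like this:

```python
# Calibration check for 90 percent confidence intervals (illustrative sketch).
# The intervals and "true" answers below are invented for demonstration only.

intervals = [
    # (low guess, high guess, true answer)
    (5_000, 20_000, 35_000),  # miss: truth above the upper bound
    (100, 400, 250),          # hit: truth inside the interval
    (10, 60, 8),              # miss: truth below the lower bound
]

misses = sum(1 for low, high, truth in intervals if not (low <= truth <= high))

expected_miss_rate = 0.10  # a 90 percent interval should miss about 10 percent of the time
print(f"Missed {misses} of {len(intervals)}; "
      f"a well-calibrated judge would miss about {expected_miss_rate * len(intervals):.1f}.")
```

Run over a real set of 20 questions, a miss count far above two is the signature of overconfidence the exercise is designed to reveal: the intervals were drawn too narrowly around the best guess.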
Students complete this exercise immediately after we discuss a case in which AOL was overconfident when the company switched its pricing plan. You can learn something from discussing someone else's overconfidence. But you can learn even more from doing the exercise yourself and realizing, "Oh my gosh, I got all of these wrong." That's a much more powerful learning moment.
I hope my students come to recognize that people aren't always rational and, even trickier, that they themselves are people too. This class also shows students how they can think just as rigorously about issues that are not quantitative. It's meant to be applicable to anybody. After all, if you're going to go out into the world and manage people, it's nice to know a little something about how we all think.
Kirsten Wellington, ’17: This course led me to reexamine and more thoroughly analyze nearly every major decision I make, both personally and professionally. It encouraged me to question the assumed rationality of human behavior. I’m fascinated that we are not always as logical as we think we are when making decisions. Following this class, I went on to participate in Heather Caruso and Richard Thaler’s Responsible Leadership through Choice Architecture lab course. I applied these behavioral insights to help the Illinois State Treasurer’s office effectively structure and market a retirement savings platform for companies without traditional 401(k) programs.
Laura Worsham, ’17: Professor Risen provided a number of frameworks and tools to help make her students better decision-makers. As a fundamentally left-brain thinker (my pre-Booth role was in accounting), I really appreciated her structured approach to dealing with uncertainty. One such method is "taking the outside view," which uses historical data, both internal and external, as the basis for forecasts and plans. Taking this approach reduces the risk of cognitive biases such as overoptimism. In my post-Booth role in consulting, there will be countless scenarios in which this approach will serve me well.
—As told to LeeAnn Shelton