Lying is wrong, right? And yet, almost everyone lies sometimes, and most of us see lying as not only acceptable but preferable in some situations. How does context affect the importance of the truth? On this episode of the Chicago Booth Review Podcast, we examine the findings of research into how lying is perceived.
Hal Weitzman: Do you always tell the truth? If not, does that make you a liar? Is lying ethical? If people find out that you’ve lied, how can it affect your reputation? Is there a difference between lying in your personal life and your professional career? And why is there so much deception in the business world?
Welcome to the Chicago Booth Review podcast, where we bring you groundbreaking academic research in a clear and straightforward way. I’m Hal Weitzman, and in this episode we’re looking at research into the good, the bad, and the downright ugly of lying.
We all tell little white lies, and we don’t think they’re harmful. In fact, the little lies that ease interpersonal relations—yes, that haircut looks amazing; you don’t look a day over 30; it’s really not a problem—might actually be beneficial, even ethical. So when exactly does lying cross the boundary from ethical into unethical? We explored that question in 2022 by looking at research conducted by Emma Levine. The story was titled, “Eight Ways in Which Lying Is Seen as Moral.” It was written by CBR contributor Kasandra Brabaw, and it’s read by Julie Granada Honeycutt.
Reader: A friend asks if you liked the soup she made. A colleague asks what you think of his suit. In moments such as these, telling the truth could harm someone’s feelings or self-esteem. Does that make lying seem like the right choice?
Research by Chicago Booth’s Emma Levine on this question suggests that for many people, merely sparing someone’s feelings isn’t enough to justify lying. It is only when the truth causes “unnecessary harm” that most people find lying to be ethical.
“Unnecessary harm is a function of how much value the truth has in the long run, whether you can learn and grow from it, and how much emotional pain and suffering it will cost you,” Levine says. If telling the truth will cause someone emotional pain and suffering without leading to growth or long-term value, many think lying is justifiable.
For example, if your colleague in the ill-fitting suit is about to give an important presentation and cannot change first, many people think that answering truthfully would cause unnecessary harm. In situations such as this one, people believe lying is ethical, the research finds. What’s more, people also want to be lied to in these situations. “We think of deception as bad, but yet, we want people to deceive us all the time,” says Levine.
She conducted a series of experiments involving hundreds of participants to understand at a fundamental level how people make moral judgments about honesty and dishonesty. In one study, she gave participants a scenario in which a manager received a list of employees to lay off within the next month due to a company reorganization. When told that one of the employees on the list dropped in on a Friday afternoon for an update about the reorganization, just under 23 percent of participants said it would be acceptable for the manager to lie. But when told that the employee who dropped in was getting married the next day, the proportion endorsing deception more than doubled to 52 percent. In this case, they saw telling the truth—and disrupting the potential bliss of a wedding and honeymoon—as causing unnecessary harm, and therefore saw lying as ethical.
The research identifies eight “community standards of deception,” or situations in which the majority of respondents agreed it was ethical to lie. Many deemed it acceptable to lie to people who were emotionally fragile, near death, or would be confused by the truth. They also found it more ethical to lie when doing so would help others save face in public or concentrate on something important. Lies that were subjective or trivial were also considered in bounds, as were those about a situation the recipient was ultimately unable to control.
In a series of vignettes, study participants were more likely to approve of lying the lower the perceived value of telling the truth and the higher its perceived harm.
Participants in the experiments said they would value ethical deception both as the liars and as the people being lied to. In one study, Levine divided participants into three groups: communicators, third-party judges, and targets. No matter how participants were asked to view themselves—as the liar, the lied-to, or separate from the lie—a majority endorsed deception when the truth might cause considerable immediate harm and would have low long-term value. If telling the truth will hurt someone emotionally or physically and won’t encourage learning or growth, why be honest?
“I would want someone to lie to me when the alternative of telling the truth would make me feel worse off and I would have no control over what happens,” wrote one participant. “For example, if my beloved dog died after being hit by a negligent driver, I’d much rather my parents or friends have told me the dog died peacefully in its sleep than to tell me the facts.”
Others explained that they would want people to lie about something that couldn’t be changed, and one person gave the example of asking friends whether they “looked OK” for a night out. If the question was posed from home, “I hope they would tell me the truth, so I could change whatever looked bad (as best I could),” wrote the participant. But if the same person asked the same question when already out, and received an honest but negative response, “my night would be ruined and I would have to stay at the bar knowing I looked bad.”
Levine says that a lot of research in this area, including hers, documents cases where “communicators think it’s OK to lie and the targets don’t agree.” But when a lie clearly involves unnecessary harm, targets and communicators largely agree it’s preferable to the truth, she finds.
Hal Weitzman: So most of us think that lying can be ethical. But more research by Emma Levine and three co-authors might make us think twice. Being seen to tell even the smallest lies can have a negative effect on your reputation, the study suggests. We wrote about it in a 2018 article under the headline, “When Little White Lies Cause Big Hurt.” It was written by CBR contributor Alice Walton and is read by Julie Granada Honeycutt.
Reader: Most people have, at one point or another, told a well-intentioned lie to spare another’s feelings or to bolster someone’s confidence. But even lies that are intended to serve the greater good can backfire, inciting suspicion about the liar’s intentions and morality, according to Deakin University’s Matthew Lupoli, Chicago Booth’s Emma Levine, and Bocconi University’s Adam Eric Greenberg.
The researchers set out to better understand a previously undefined variety of lie, which they dub “paternalistic.” In this type of lie, the deceiver makes a judgment call about the lie’s potential benefit for the recipient. An oncologist who doesn’t actually know a patient’s wishes regarding candor might tell a paternalistic lie by sugarcoating the prognosis, for example. Because the liar is acting on an assumption, the lie is considered paternalistic rather than unequivocally prosocial; it would be the latter only if the patient had previously made clear that the doctor should soften the blow of any unpleasant developments.
Lupoli, Levine, and Greenberg ran a series of experiments in which participants played an exercise known as a deception game. In this type of game, one player (the sender) has the opportunity to lie to another player (the receiver) in order to achieve a certain outcome for the receiver. In some conditions of the researchers’ study, the senders had an opportunity to tell a paternalistic lie, meaning they had to make an assumption about what type of monetary reward would most benefit the receivers. For example, if a sender told the truth about an unrelated event, such as the outcome of a coin toss, the receiver earned a $10 lottery ticket for that day. If the sender lied, the receiver earned a $30 lottery ticket in three months.
This, says Levine, models a situation in which communicators have to make assumptions about what will most benefit the recipients of their statements—an immediate benefit now or a bigger benefit in the future. A communicator might have to decide whether to give false praise, which provides an immediate benefit, or candid criticism, which is often costly in the near term but beneficial in the long run.
The researchers also included conditions in which the sender had the opportunity to tell an unequivocally prosocial lie, which would clearly benefit the receiver. For example, the sender could lie to earn the receiver two tickets, rather than one, to the $10 lottery. In this situation, no subjective judgment is required because two tickets are clearly preferable to one.
The participants who received paternalistic lies reacted negatively on several levels. They viewed the liars as significantly less moral than those who told the truth, as well as less moral than those who told unequivocally prosocial lies. The researchers also find that paternalistic lies adversely affected recipients’ emotional states and their satisfaction with the outcomes of the lies, and caused them to punish the liars.
The recipients of paternalistic lies tended to assume that the liars had bad intentions, the researchers say. Participants who were told paternalistic lies also tended to feel that the lies reduced their own autonomy, and that the liars generally misunderstood their true feelings and preferences. This was not the case for people told unequivocally prosocial lies.
When liars communicated their good intentions, it did not reliably reduce recipients’ negative feelings about them. For example, in one of a series of vignette studies, participants considered whether it was moral to falsely praise a colleague’s presentation. They deemed it less moral to do so if the colleague hadn’t expressed a preference for comfort over candor than if she had. And belatedly saying “I only meant to help” didn’t improve the liar’s moral standing.
The findings extend to a variety of scenarios, from political to medical, and highlight that good intentions alone don’t justify lying; the lies’ recipients are sensitive to whether liars are acting based on assumptions or true insight into the preferences of the person they’re lying to. Unless you can be sure about another person’s preferences, it may be best to steer clear of even well-intended lies, the researchers conclude. If uncovered, paternalistic lies do more harm than good.
Hal Weitzman: While lying might hurt your personal relationships, research suggests that in some professions, the ability to stretch the truth can be a strong asset. Again, one of the researchers here is Emma Levine, and her work gives us insight into why there is so much lying in business. It appeared in a 2019 CBR article titled, “Why Some Professions Reward Dishonesty.” It was written by Alice Walton and is read by Julie Granada Honeycutt.
Reader: Most people don’t look favorably upon acts of deception. Research finds that deception elicits all kinds of negative emotions in the perceiver and tends to signal incompetence in the deceiver. Why, then, is deception so prevalent in business and the world in general? And how can managers encourage employees to be more honest?
Johns Hopkins’ Brian C. Gunia and Chicago Booth’s Emma Levine had a hunch that deception might not be viewed so negatively for certain professions, such as sales. In occupations with what the researchers call “high selling-orientation,” defined as the “use of high-pressure persuasion tactics to elicit immediate, self-interested economic transactions,” they argue that deception might actually be seen as a signal of competence.
This occurs, they suggest, because people view deception as a particularly effective high-pressure persuasion tactic. As a result, a salesperson who deceives a customer in order to get the customer to buy a product might be seen as a particularly competent salesperson. What’s more, Gunia and Levine find the perceptual link between deception and selling orientation is so strong that even deception that has nothing to do with selling is seen as a signal of competence. For example, the authors find that employees who lie on their expense reports—an act of deception that is costly for companies—are seen as competent employees in high selling-orientation occupations.
Sales isn’t the only occupation where these skills are regarded as beneficial. The researchers suspected that many professions would be seen as high in selling orientation, and as a result, might reward deception. To figure out the set of occupations to which this association applied, the authors ran a pilot study in which they asked 204 participants to rate each of 32 professions, including mechanic, doctor, and lawyer, on a scale they devised to measure perceived selling orientation. Respondents placed occupations including salesperson, advertiser, and travel agent at the top for selling orientation, and librarian, machine operator, and chemist at the bottom.
Then the researchers studied whether people view deception differently in occupations stereotyped as high versus low in selling orientation. Online participants read about Julie, an individual on a business trip who exaggerated the cost of a company-reimbursed cab ride by $10. The participants were randomly informed that Julie was either an investment banker, a salesperson, an advertiser, a consultant, a nonprofit employee, or an accountant. Participants rated Julie as more competent when she was from a high sales-oriented occupation such as investment banker or salesperson than when she was from a low sales-oriented profession such as nonprofit employee or accountant, the researchers find.
Another group of online participants read about James, who acted either dishonestly, agreeing with his boating-buff boss that sailing was great, or honestly, admitting he didn’t care for sailing. Participants rated how James would do when changing careers to a high or a low sales-oriented profession. They said the lying version of James would fare better than the honest version when switching to a high sales-oriented profession. In other words, in certain occupations (namely sales, advertising, and investment banking), participants believed that deceivers would be more competent employees than honest people.
In a lab setting, Gunia and Levine had participants observe an individual playing the part of the sender in the deception game, an economic game commonly used to measure the use of deception in laboratory studies. The sender has the opportunity to either tell the truth or lie to the receiver, which affects how much money each party will take away. Lying benefits the sender, but harms the receiver. In this study, each participant observed the sender either lying or telling the truth; after that, they had to say how likely they’d be to hire the sender into each of six occupations. Participants were more likely to hire deceptive senders than honest senders into high sales-oriented jobs, but they were more likely to hire honest senders than deceptive senders into low sales-oriented jobs.
Gunia and Levine say their findings may explain why deception is so prevalent in corporations: for those in sales-oriented positions, an inclination to deceive may be associated with competence and thereby be positively reinforced. The researchers suggest that managers who witness deceptive behavior “may wish to publicly admonish it and thus reinforce the need for deception-free competence, potentially supplementing such messages with training in alternative approaches to selling, like customer orientation. . . . This reframing could help to sever the link between deception and competence.”
Hal Weitzman: That’s it for this episode of the Chicago Booth Review podcast. It was produced by Josh Stunkel, and I’m Hal Weitzman. If you enjoyed this episode, please subscribe and please do leave us a 5-star review. For more of the latest research, visit us online at chicagobooth.edu/review. Thanks—until next time.