This discussion was originally filmed on February 19, 2020, and the panelists reunited via Zoom on May 4 to reconsider the issue (see final question).
Hal Weitzman: Welcome to a special episode of The Big Question. We filmed this before the coronavirus hit the United States, triggering a wave of shutdowns and stay-at-home orders. We still think the content is interesting and relevant, perhaps more so than ever, which is why we asked our panelists to come back to consider how our thinking has evolved in light of the crisis. Here’s the episode.
(soft music) There’s a good chance you’re watching this video on a phone or a tablet. How often do you reckon you check that device—every day? Research suggests you’re likely to underestimate the amount of time you use it. Most of us are spending about a quarter of our waking lives on our phones—more than any other activity except sleeping.
So why are we so addicted to technology? What can we do about it? And should there be a public-policy response? Welcome to The Big Question, the video series from Chicago Booth Review. I’m Hal Weitzman, and with me to discuss the issue is an expert panel.
Nicholas Epley is the John T. [Templeton] Keller [Distinguished Service] Professor of Behavioral Science at Chicago Booth and faculty director of the Center for Decision Research. He’s the author of Mindwise: How We Understand What Others Think, Believe, Feel, and Want.
Marshini Chetty is an assistant professor of computer science at the University of Chicago.
And Adam Alter is an associate professor of marketing at NYU Stern School of Business and the author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked.
Panel, welcome to The Big Question.
Adam Alter, let me start with you. What is tech addiction?
Adam Alter: Well, tech addiction is one of the behavioral addictions. So we think of addiction typically and traditionally as related to substance use. So that’s how it’s been used for the last hundred years. And before that, originally the word addiction referred to slavery or enslavement by any means. Generally, it happened to be from a substance.
That’s changed recently. We’ve become so successful at designing experiences that don’t involve substances that you can actually make people addicted to those without the physiological consequences of ingesting a substance.
Hal Weitzman: But when we say tech addiction, do we mean that we’ve lost some power over our behaviors?
Adam Alter: Yeah, the way I define it is tech addiction or behavioral addiction is any behavior that in the short term you wanna enact over and over again. You do it compulsively. You enjoy doing it, but ultimately in the long run, it’s bad for you. It harms you in some respect. So it might harm your psychological well-being. It might harm your social well-being. Maybe the relationships you’ve formed will be fractured by your relationship to screens or technology. It could be harming your physiological well-being. Maybe you’re sedentary because you spend so much time in front of a screen. You walk into a pole. You drive badly, things like that. There are all sorts of things that happen when we spend too much time on our screens. And also financial consequences. So a lot of people end up spending much more money than they’d like because of their attachment to their screens.
So when you put all of that together, it's like a substance when you're addicted to it: it's something you wanna do, and you keep doing it even though you know that in the long run it's bad for you. And I think that also applies to our relationship with screens.
Hal Weitzman: But there must be distinctions between substance addiction and technology addiction. What are those differences?
Adam Alter: Yeah, look, it’s one of degree, I think. So when you ingest something like heroin, when that (clears throat) enters your body, the relationship between that drug and your body, it’s an immediate relationship. It’s much more powerful. It’s much more intense than when you use a screen, but a lot of the consequences are similar. And for some people, for a small part of the population, you get similar rises in all sorts of different chemicals, neurotransmitters that suggest that there’s a similar consequence to firing up World of Warcraft, or when you start playing the video game that you’re addicted to, if you happen to be one of those, that small percentage of the population. The responses are actually not that different from ingesting a drug. So it’s one of degree rather than one of kind. They’re actually quite similar in—
Hal Weitzman: OK, but is it healthier to be a tech addict than a substance addict? (Epley laughs.)
Adam Alter: Yes, yes, it is. I think it’s more immediately—it’s less immediately unhealthy to be a tech addict. I think in the long run in a macro sense, when you add up all the tech addicts together, I think there are major negative consequences for society. I think a lot of the time we focus on the individual, but really the negative consequences are often in how we interact with each other—playgrounds, restaurants, the dinner table, those are all, I think, degraded because we all spend so much time on screens.
Hal Weitzman: OK, Nicholas, let me bring you in, because that’s one of your areas of expertise. What will be the social consequences of tech addiction and other kinds of simulated behavior?
Nicholas Epley: Yeah, so one thing we know about human well-being is that other people are deeply important for it, the quality of our connections. The nature of our relationships to other people is pretty easily the biggest determinant of our happiness or well-being, often on a moment to moment basis. And anything that distracts us from positive connections with other people has the potential to undermine our well-being. And you might really enjoy your phone, but the data suggests that doesn’t bring you the same kind of well-being that connecting positively with another person would bring you.
Hal Weitzman: Yeah, even though—because sometimes you hear people say that these games—you mentioned World of Warcraft and similar games—build these team-type skills, that they actually promote interaction across cultures and time zones and countries. Isn't there some connection there? Or is it just a different type of connection?
Nicholas Epley: Well, they're really weak. So I think one way to think about it is—Adam was saying it's a matter of degree. Let me just describe one simple experiment to you. This was one where adolescent girls were put into a stressful situation, what psychologists refer to as the Trier stress test. You're about to stand up and give a speech. It's a stressful event for anybody to stand up and give a speech.
And these adolescent girls were able to talk to their mothers. They were able to talk to their mothers either over the phone—so they could hear mom’s voice, which is a more direct, psychologically immediate sort of connection with somebody else—or they were able to instant message them. That is, type to them. Or they weren’t able to connect with their mothers at all.
And what they then measured was how stressed these kids were before they’re about to stand up and give their speech. And the data suggest that connecting over instant messaging reduced stress a tiny bit compared to not being connected at all, but it was trivial compared to actually being able to talk to your mother.
The sense of connection with another person, the sense that you understand or know another person, can be had a little bit through technology, particularly if you have an existing relationship with somebody. But it's nothing like the kind of connection that you feel if you're actually sitting down, listening to somebody's voice. We find that the voice is really important for conveying the presence of another person's mind, and those direct social cues are most powerful for creating a sense of connection.
Hal Weitzman: So a telephone call would be better in some sense than a text-message exchange?
Nicholas Epley: Well, it depends what you mean by better. Better in what sense? The voice contains an awful lot of cues to the presence of another person's mind. And text contains some, but it lacks a lot of the paralinguistic cues that make it clear that you're feeling something or even thinking carefully about something, right? So when I slow down and pause, you get a sense that I'm mulling something over, and when I speak up, you get a sense I'm excited. And so those vocal cues give you a sense that I'm sort of alive inside, and that's what we find creates a good sense of connection with another person.
Hal Weitzman: Just one more thing about your research. You did this very nice research about people on trains and how they behave with strangers. The result of that was that if people interact with people they don't know, they think it's gonna be an unpleasant experience, but they end up happier. So I guess my question is, if we're not doing that so much because we're on our phones, as a society are we less happy than we were in the days when we were forced, perhaps, to speak to strangers?
Nicholas Epley: So I think, I mean, I think the addictive quality of a phone is that the positive benefit you might get from checking email or something is immediate and quick. And so in any moment where you decide, do I sit down and talk to Marshini, whom I haven’t talked with before? Do we have a conversation, or do I quick check my email? Well, there’s some uncertainty here. It’s gonna take a little bit of effort to say hi and connect with you, but this is quick and immediate.
And we make those kinds of decisions, those approach-avoidance decisions, with other people, like, constantly throughout the course of our day. And I think phones and screens have the potential to turn each one of these decisions into one that tilts us into avoidance of other interactions with other people, and to the extent that does that sort of systematically and widely, then yes, I would expect it to have a meaningful effect on people’s well-being since we’re just not as connected to other people around us.
Hal Weitzman: And given that this trend does not appear to be going away, are we going to become less and less happy unless we find a way to become more mindful?
Nicholas Epley: That’s hard to say, in part because one thing that technology can do is it can adapt. And so it’s possible, I suppose, that technology could be engineered in such a way to give us, like—it can give us the same kind of addictive rush that a chemical substance does—give us the same sense of connection, somehow, as talking or connecting with a real human being. I haven’t seen that technology yet, and I wouldn’t bet on it.
Hal Weitzman: OK. Marshini Chetty, let me bring you in because you're an expert on the other side: how developers of websites and apps and games keep us hooked. What are some of the techniques, and how are those techniques getting better over time?
Marshini Chetty: So yeah, basically a lot of websites these days have been using things like, as you called them, evil nudges, or dark patterns.
Hal Weitzman: This is what we were talking about earlier, before the recording. (Chetty laughs.)
Marshini Chetty: Yeah, evil nudges, where people are basically being nudged into a particular direction to make a decision that they may not have otherwise chosen to do. So for example, if you think about watching a Netflix video, and you’re maybe watching Game of Thrones, or trying to watch your latest series, it automatically autoplays the next episode.
Hal Weitzman: And they made that change not so long ago.
Marshini Chetty: Yeah, they made that change. Exactly. And that's geared toward keeping you on the site, keeping you watching videos. These patterns have become common, not just in terms of video watching; they can also keep us hooked on games. A lot of kids, for example, are being hooked into games where you basically have to play by appointment. If you think about Pokémon Go, some Pokémon are only available at night. They're nocturnal. So then, even if you're supposed to be sleeping, you have to be up to catch that Pokémon, right? So there are lots of these tactics where the technology is actually designed to keep you on that technology for longer. It's not just that we might be susceptible to these kinds of addictions, but in a lot of ways the technology is being engineered to give you these quick fixes.
Even, for example, with checking your email. Or think about another notification we get, like Facebook likes. Another author, Natasha Dow Schüll, said that these likes are not incremented on just a one-off basis; there's some variation in terms of when they're actually updated. So you get this variable-reinforcement schedule, and this keeps you wanting more, because you don't know when that like count is going to be updated. And so you're sort of always constantly checking.
So these kinds of rewards are all designed in, and they don't so much help us as keep us addicted to the technology. So that's one of the other issues. It's not just that we might be susceptible to this; these designs are also playing on our cognitive and behavioral biases to actually manipulate us into staying on the technology longer.
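To make the variable-reinforcement idea above concrete, here is a minimal Python sketch written for this transcript; it is not drawn from any platform's actual code, and the class name, batch sizes, and notification logic are all hypothetical.

```python
import random

class LikeNotifier:
    """Hypothetical sketch: batch incoming likes and release them on a
    variable-ratio schedule, so the user never knows when the visible
    count will update."""

    def __init__(self, min_batch=2, max_batch=8):
        self.pending = 0        # likes received but not yet shown
        self.shown = 0          # like count the user currently sees
        self.min_batch = min_batch
        self.max_batch = max_batch
        self.next_release = self._draw_threshold()

    def _draw_threshold(self):
        # A random threshold is what makes the reward timing unpredictable.
        return random.randint(self.min_batch, self.max_batch)

    def receive_like(self):
        """Returns True when a notification should fire."""
        self.pending += 1
        if self.pending >= self.next_release:
            self.shown += self.pending      # release the whole batch at once
            self.pending = 0
            self.next_release = self._draw_threshold()
            return True                     # "You have new likes!"
        return False                        # stay silent; the user keeps checking

notifier = LikeNotifier()
for _ in range(20):
    if notifier.receive_like():
        print(f"Notification fires: visible count jumps to {notifier.shown}")
```

Because the threshold is redrawn at random after every release, no individual check reliably pays off, which is the property behavioral scientists associate with the strongest compulsive checking.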
Nicholas Epley: This is something my colleague Richard Thaler refers to as sludge. (Chetty and Alter laugh.) The evil nudge is sludge.
Hal Weitzman: But what you're talking about, essentially, is the game designers or the app designers trying to keep us on the app or the game for longer, and surely that's always been the case. Right?
Marshini Chetty: I think so, but I think one of the changes with some of these patterns is that now they can happen on a much larger scale. If you think about the real world, these kinds of marketing tactics are very common. Think about candy bars lined up right at the checkout line. When you're spending a lot of time waiting in a queue, that's designed so that you're going to pick up that candy bar, because you're waiting around anyway. So it's not new per se, but I think with scale, these patterns are more harmful, because now companies can deploy them much more widely. They can also test to see which nudges Adam, say, is more susceptible to, right? Or people like him. And they can compare these and make it more personalized to him. So I think—
Hal Weitzman: So we might get a customized sludge.
Marshini Chetty: Exactly. So in some ways, I mean, maybe not personalized to Adam per se, but maybe to people who live in New York, right? So I think the scale at which it can happen and the way that it can be personalized make these patterns a little bit different. And because that's not really been regulated or checked on, I think it's problematic and could be adding to this addiction problem.
Hal Weitzman: OK, but just to be clear, is there anyone who's not trying to do this? I'm thinking, if I were running a charity, I would be trying to keep people on the website, or, for lack of a better word, manipulate them into giving more money, and that's been a long practice. Or, to relate back to Richard Thaler, trying to get people to put money in their retirement fund. So I could be using it for what we might consider a positive use, but I could also be using it to get people to buy stuff that they don't need or play games for hours in the middle of the night?
Marshini Chetty: Right. So, I mean, not all of them are. We call them dark patterns: user-interface design patterns that sort of coerce you into making a decision that you may not have otherwise made.
But not all patterns are dark, right? They could be used for things like helping people opt into insurance plans, health insurance, and so on. But in many cases, they can be utterly deceptive. And so I think it's not always clear; you can't always infer what the designer's intent is. But I think if you had that information, it would be easier to say that one pattern is OK in a particular instance, and another pattern is not.
Hal Weitzman: Give us an example just so we understand where the line is between a clear pattern and a dark pattern.
Marshini Chetty: So, imagine a countdown timer on a website that's trying to get you to buy a particular product. This is creating a sense of urgency. In the study that we did of shopping websites, we found several instances where some of these timers are just faked. They're just randomly resetting, right? So that is basically utterly deceptive. In that case, I think it's very clear that the design is deceptive.
Hal Weitzman: A countdown when it’s not clear what the countdown means.
Marshini Chetty: The countdown sort of comes down, and then it just resets. So you're getting the sense of urgency: I must buy this product now because the sale is about to end. But actually it's just an utterly deceptive timer. In that case, I think it's very clear. You could say that this design is harmful in some ways, because it's deceiving the user, right? In other cases, I think it's harder to say. So, like you were saying, if it's getting someone to donate more money to an NGO or something like that, perhaps that's OK. But in another case, where the pattern is trying to get people to spend more time on the site, or give up more data, or make more purchases, it's harder to say whether it's a good or bad thing.
Hal Weitzman: OK, so a dark pattern might be in the eye of the beholder, something like that.
Marshini Chetty: I think in some cases, like I said, where there's deception, where we found things basically being randomized, like the timer being reset, and so on, you can say this is utterly deceptive. But in other cases, I think it's not black and white.
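As an illustration of the deceptive timer described above, here is a small hypothetical Python sketch; the class name, the reset window, and the contrast with an honest timer are assumptions made for this example, not code recovered from any site in the study.

```python
import random
import time

def honest_countdown(sale_ends_at: float) -> float:
    """A fair urgency cue: seconds remaining until a real deadline."""
    return max(0.0, sale_ends_at - time.time())

class DeceptiveCountdown:
    """Hypothetical sketch of the dark pattern described above: whenever
    the timer reaches zero, it quietly resets to a fresh random deadline,
    so the 'ending soon' sale never actually ends."""

    def __init__(self):
        self.deadline = self._new_deadline()

    def _new_deadline(self) -> float:
        # A random 10-to-60-minute window, tied to no real sale at all.
        return time.time() + random.randint(600, 3600)

    def seconds_left(self) -> float:
        if time.time() >= self.deadline:
            self.deadline = self._new_deadline()  # the quiet reset
        return self.deadline - time.time()
```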
Hal Weitzman: Yeah. It's interesting because in your book, Adam, Irresistible, you talk about how some of the techniques Marshini is describing have their origin in storytelling, or in great board games that you could equally spend 10 days playing. Or a Dickens novel, deliberately written so you have to carry on and read the next chapter. Or soap operas, etc. So what's different when you get technology involved? We talked about some of it with the scale and the customization. What else is different?
Adam Alter: I think two things have evolved. One is we are much better at this than we ever were in the 1800s or 1900s. This is a new thing that we've developed the ability to do. There are psychologists, people with backgrounds like our backgrounds or backgrounds in computer science, who understand this better than anyone ever did before. So you can be much more purposeful about designing things with this in mind than, I think, people could be even 20 or 30 years ago. That's the first thing. The second thing is you don't even need to be purposeful anymore, because you can throw a billion data points at the wall and look at what sticks best, and you don't need theory. So you don't need to understand humans. You don't need a degree. You don't need a PhD. All you need is billions of data points showing how people engage with an experience.
We can go back to World of Warcraft. One of the things they do is they'll set up different missions, and they'll kind of have them battle each other to see which one's most engaging. So there's this process of weaponizing the game over time through different rounds of combat. Say there's a mission (there are guilds of people that do missions together), and in one version you're saving a male character; in the other, you're saving a female character. The designers of the game might say, we'll release one version to one half of the population and one version to the other. Turns out people spend an extra 10 minutes on average saving a male character over a female character. So they say, well, let's put the one that lost that battle aside, and let's privilege the one that really stuck. And then they might say, well, I wonder if it's gonna be better to have this mission happen in a forest or by the ocean. If you keep doing that, if you keep running these little trials, by the 20th iteration this is a weaponized version of the experience. You could never do that before. You didn't have the feedback, the access to the data.
So between our sophistication and our ability to not even have to be sophisticated to get the answer, I think we're fighting a losing battle. On the other side of the screen, we are up against reams of data and people with a really smart sense of what makes us tick.
Hal Weitzman: And I guess with machine learning, you don’t even need a person there to sort that information out.
Adam Alter: You don’t even need a person.
Marshini Chetty: And what Adam's referring to is called A/B testing. Companies do this all the time, testing two different versions of an interface. It could be testing different colors to see which one's more likely to get you to buy something, or testing different versions of a game that, again, gets you to spend more time on the site. So it's a known technique, and it's something we actually teach in my class: here's how you do A/B testing to see how to optimize a system for a person.
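For readers unfamiliar with the mechanics, here is a self-contained Python sketch of the A/B-testing loop described above; the user IDs, the simulated 10 percent engagement lift for variant B, and the session counts are invented for illustration, not data from any real test.

```python
import hashlib
import random
from statistics import mean

def assign_variant(user_id: str) -> str:
    """Deterministically split users into two groups; hashing the ID
    keeps a returning user in the same group."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Simulated engagement data: minutes spent per session.
sessions = {"A": [], "B": []}
for i in range(10_000):
    variant = assign_variant(f"user{i}")
    base = random.gauss(30, 8)               # baseline session length
    lift = 1.10 if variant == "B" else 1.0   # invented 10 percent effect for B
    sessions[variant].append(max(0.0, base * lift))

for v in ("A", "B"):
    print(f"Variant {v}: mean session = {mean(sessions[v]):.1f} min")
# Ship whichever variant "wins," then test the next tweak against it;
# repeated rounds of this are the iterative combat Alter describes.
```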
Hal Weitzman: Right. And Nicholas, isn't it possible that all this, when we talk about customization, could actually just give us a better experience? We get digital experiences that are customized to us. We're not wasting time.
Nicholas Epley: I think it depends, again, on what the designer's intent is, as Marshini was saying. If you're trying to optimize an experience for me, then I sort of wanna know: What did 10,000 people who went on this cruise versus that cruise feel about it? And if they tended to like this one, then yes, I wanna go on that one. A million people tried this in their marriage versus that: How did those marriages turn out? Oh, this one's better.
I mean, all of this is describing the scientific process. This is experimentation. Researchers and scientists have been doing this for decades, centuries now. So the method isn't problematic. There's nothing wrong with the method, as Adam was saying. We're now in a position, though, where we can use these methods so much faster. It takes me a month, two months, three months to run an experiment where we have two people talking to each other before we learn something. Prrh! They do this in a night with 10,000 trials and can figure out how to best design their game. So the speed with which you can learn is extraordinarily fast.
That’s not necessarily a bad thing. It’s just if the intent is to capture our attention, which is what all of these websites and social media sites need in order to succeed, if the intent is to capture our attention, then it’s creating addiction, right? And that’s not a great outcome for us, that kind of addiction.
Hal Weitzman: Are we somehow more hardwired to become addicts to technology than to substances? You say in your book, we think of substance abuse as a marginal activity. Tech addiction is very, very mainstream.
Adam Alter: I don't know that we're more susceptible to it. I think if you put anyone in the right, or wrong, situation, they will develop an addiction, whether it's to a substance or to an experience. There's a lot of evidence for that. It's really about what you experience, the environment you're in. We all just happen to be in an environment now where we are engaged with products that are designed to addict. That's not the case with substance addiction. Except for cigarette use in the '60s and '70s, it's generally been marginalized. It's certainly not been part of the majority's way of interacting with the world. That's not true for screens. And that's evolved in our culture over the last 20 or 30 years in particular. How we got here is a separate, interesting sociological question. But now that we are all here, we are certainly susceptible to it.
Hal Weitzman: And it seems, to me as an outsider, that the rise in popularity of behavioral science has in a way fed into this problem. Every company that has a website has a staff behavioral scientist who's there to help design these programs, right?
Marshini Chetty: Exactly. And building on what Adam said, I think the other thing that's changed over time is that we all have access to these technologies all the time, and they're becoming more ubiquitous, right? So it's not just your phone in your pocket, but it might be an Alexa in your home or a smart TV, and so on. Everyone has access to this in a way that makes it really hard to avoid, and it's also not really frowned upon. So checking your email constantly, even though you declined to speak to me, which was clearly the better choice. (Epley chuckles.) It's not like anyone in the room is going to say, "Oh, you shouldn't be doing that." Right? In actual fact, we've gotten quite accepting of it. As Adam was saying earlier, it's quite common to walk into a coffee shop and see everyone on their phones instead of talking to each other, or to see parents with kids who just have iPads. The parents aren't really talking to the kids; they're just conversing with each other, and the kids are just engaged with their devices. So because it's ubiquitous, it also makes it harder to stay away. And it puts us in a place where everyone can become addicted if they have access to the technology, which is different from substances.
Hal Weitzman: Right. And Nicholas, what are your thoughts on behavioral science and the way it's used, the extent to which it's been weaponized? I mean, companies have taken the insights that come from social psychology and used them to design tech architecture in a way that makes people more addicted, right?
Nicholas Epley: Yeah, I don't think I have anything particularly brilliant to say about that, except that any good method can be used with ill intent. Right? The scientific method that gave us life-saving medicines also gives us drugs that cause amazing addiction, right? The same opioid that you take to deal with knee pain, say, can also ruin your life. The same scientific practice that allows us to understand what makes people happy and sad, what helps people make good choices and bad choices, to help people live better lives, can also be used to help a company make money. And so the use of the method is really driven by the user's intent. In the case of these technologies, the intent is not always the users' good outcomes. It's the users' attention.
Hal Weitzman: I guess the difference is that the opioid epidemic, to take one example, has had such an immediate and graphic impact that we have to do something about it, or at least pay lip service to it. Whereas, as Marshini said, this is a slow-creeping phenomenon—nobody is dying directly from it; we're walking into lampposts; we're not as happy as we might otherwise be. So what is the tipping point where it becomes recognized?
Nicholas Epley: So we learn from lots of things. We learn that heroin is bad for us because you see a heroin addict and that’s very clear. The negative consequences are obvious. Not all the negative consequences of this tech use are obvious.
So, for instance, suppose Adam and I are texting each other. It turns out we find in our research that a person's voice is really critical for conveying the presence of mind. You sound more thoughtful, intelligent, rational. I make a different inference about your intellectual character when I hear what you have to say than when I read the same thing. So if Adam and I are texting back and forth, some of my judgment about him comes from the medium I'm using. He seems just kind of like an idiot in this exchange. But it's not obvious to me that the nature of the text is what's causing that problem. That is, the effect is not obvious to me, because I don't know how I would have judged Adam in a different situation, if I'd actually been listening to him, and how that dynamic would have been different. I just come to think he's kind of an idiot, and we're at loggerheads, right?
So a lot of, or at least some of, the negative social consequences of tech are not something we get feedback on. We don't learn. If I don't talk to Marshini on the train, I don't learn that we would have had a great conversation and that she'd be super interesting. I just learn whatever was in my email. So I don't find out that the tech was keeping me from an otherwise pleasant experience. We get really smart in the world as human beings when we get really good feedback. Tech doesn't always give us great feedback. So I'm a little nervous.
Hal Weitzman: And I know that you care a lot about well-being as a measure of happiness. When people's well-being is low and they're not as happy as they might otherwise be, they behave in ways that have consequences. Maybe they vote for a political candidate they might not otherwise. It does have consequences, right?
Nicholas Epley: They choose alcohol to make themselves feel better, they go shopping when they shouldn't; they do all kinds of things.
Hal Weitzman: Yeah. So there are social consequences. I just wonder, is it hard to make the link there? Somebody votes for someone because they spent too much time on their phone?
Nicholas Epley: Yeah, no, right. That's very hard. We can isolate some of these things in our experiments, but the long causal chain is hard to see out there in the world. I think that's one of the additional reasons why it's hard to see all the different ways this stuff, which is impacting humanity at a really unprecedented level, is affecting our social life. Is the divisiveness that we see here in the United States partly because technology has enabled us to connect so easily with folks who are part of our tribe? You can find a Facebook group that's right in line with you, and that creates endless opportunities for us-versus-them kind of thinking. I don't know. That's above my pay grade.
Adam Alter: I will say one thing that I've noticed. So I wrote the proposal for this book in 2014, which is only six years ago—
Hal Weitzman: —Irresistible.
Adam Alter: Irresistible. And when I spoke about it to a number of potential editors, some of them said this is just a nonissue. No one cares about it. (Epley laughing) No one thinks it's a big deal. I don't even know why; this is a storm in a teacup. And the editor I worked with said, "No, I think you might be onto something. This is interesting. Let's run with it. Let's see what happens." No one says that anymore. I used to have to spend the first 20 minutes of any talk saying this is the thing you should care about. I don't have to do that anymore.
So it may have taken six years for people to get there. I think there were certain events that pushed them along. But everyone now understands that this is a concern. So I think the consequences are not as immediate as watching someone experience heroin addiction. But certainly we’ve got to the point now where I think we are convinced as a society that this is a problem.
Hal Weitzman: And your sense is that it’s—I know you published the book in 2017, I believe.
Adam Alter: Yes.
Hal Weitzman: So in those three years it’s gotten worse?
Adam Alter: Yeah, yeah, worse, but also I think people are better about it now. When I published the book, parents were saying to me, “I don’t know what to do about my kids.” Now kids are saying to me, “I don’t know what to do about my parents.” So I think younger people are getting better at dealing with the technology, they’ve kind of grown up with it. And it’s older generations now or adults who are struggling with it more. So there’s been some evolution.
Hal Weitzman: Do most—let’s take adults for a second then. Do most adults say that they feel they spend too much time using technology?
Adam Alter: In a room of, say, 1,000 people, I'll ask them to indicate where they lie on a spectrum from 1 to 10: from 1, I'm perfectly happy with my interactions with technology, to 10, this is destroying my life and I need to make major changes. In almost every room I've ever done this in, again with thousands of people, the most common response is between a 6 and an 8. It's usually a 7, maybe an 8. So most people, and these are usually adults, say it's not ruining my life, but it's a real problem for me personally, and I need to do something different. So I think it is something they recognize and wanna change.
Hal Weitzman: So I wanna think about how we get to that change, because we have a slightly different response, I think. Your book is partly a guide for people to address their own lives. How much is this just each individual taking responsibility for breaking their own addiction? Because that’s not how we think about opioid addiction.
Adam Alter: Yeah, that's true. So there are two basic approaches, if you wanna divide them. There's the grassroots, bottom-up approach, where each individual has certain tactics to deal with the problem. And there's the top-down approach: government, legislation, workplaces. You have to be hopeful that that will happen, either through pressure from consumers or pressure on governments, on legislators, and so on. It's happening in some parts of the world. In East Asia, it's happening. In certain parts of Northern and Western Europe, you're seeing it. Not so much in the United States right now. So I think right now, because not much is happening at the top-down level apart from a few isolated organizations, we as individual consumers have to do the grassroots work ourselves.
Hal Weitzman: But you expect this could evolve into something different.
Adam Alter: I think there’s enough pressure that over time it will evolve. It seems like it’s leaning in that direction. And again, more and more countries of the world are introducing legislation. Perhaps the government in the US will at some point do that. This government doesn’t seem particularly interested in this issue, but you can never predict what will come next and so on.
Hal Weitzman: The current US administration is not particularly known for its proregulation stance, so this seems unlikely, although you never know. It does have a problem with big tech, so it's possible.
Adam Alter: That’s true.
Marshini Chetty: Well, I was gonna add that actually there are some senators who are interested in this problem. Senators Mark Warner [Democrat of Virginia] and Deb Fischer [Republican of Nebraska] have introduced the DETOUR Act, the Deceptive Experiences To Online Users Reduction Act. It's basically trying to put something in place to regulate the use of dark patterns. And some of the patterns they're specifically concerned about are those that push kids under the age of 13 into compulsive gaming habits. So I think in the US, at least from what I've seen, there has been some legislation geared toward kids and a broader concern about children being on technology. The American Academy of Pediatrics has also changed its guidelines for screen time, and we're learning more about screen time even from the devices themselves: more operating systems have rolled out screen-time awareness tools.
Hal Weitzman: Right. They now tell you how much time you've spent using the device?
Marshini Chetty: Yeah, exactly.
Hal Weitzman: It could be a kind of a nudge potentially?
Marshini Chetty: It's kind of a nudge. I was gonna say that I agree with Adam that on some level, as an individual, you should try to curb your tech addiction. But I also feel like you can't expect everyone to do that. Not everyone is informed about this. You might be a child, in which case you need help to protect yourself. You might be an elderly individual, or someone with a cognitive impairment. In those cases, I think regulation is needed so that someone is providing some oversight. The DETOUR Act that I mentioned is, unfortunately, only geared toward big companies with over 100 million monthly active accounts, right? So they can't go after everyone, but at least if they make an example of some of the bigger players, then hopefully others will follow suit. You can't police everything. But I do think there's a place for that as well.
Hal Weitzman: But you're still talking about a sort of age restriction.
Marshini Chetty: Well, age restrictions or any sort of manipulative patterns that keep you hooked onto a site. They focus on that in the regulation.
Hal Weitzman: So it could be dark patterns.
Marshini Chetty: It could be dark patterns in general that are sort of manipulating consumers into spending more time online. Right? So, I think there’s a place for that too.
Hal Weitzman: Nick Epley, you're a professor at a business school. What about the role of businesses in curbing their own behavior, maybe preempting some of the regulation that could come?
Nicholas Epley: Yeah, so businesses need to make money to be able to sustain themselves, and as a general rule, they haven’t, I would say, been great at regulating their sort of pro-social orientation until it’s also aligned with them doing well.
Hal Weitzman: But do you think this could be part of an environmental, social kind of tide?
Nicholas Epley: It absolutely could, and to the extent that investors start really caring about that with their pocketbook, to the extent that they do, then it will matter. And you certainly—there are companies that are trying to do real good out in the world. I think Facebook, for instance, probably has good intent behind lots of its products. One thing that’s maybe not so obvious all the time, though, is how their business practices might detract from that goal.
So a company like Facebook, for instance, has to make money by drawing people's attention, because the only way it makes money is through ads. That's its business model. If Facebook wanted to design a product that was systematically better for users, it might also design a separate channel where people pay for a subscription service, and then it's not incentivized to keep you hooked to the device. Businesses can make choices that are more socially responsible. My hope would be that as the negative social consequences of these products become more widely known, businesses become better at it.
Hal Weitzman: OK. Welcome back to our panel. We’ve had more than a month now of lockdown. How has that changed your thinking about technology and screen addiction? Adam Alter, let me start with you.
Adam Alter: I don’t think my thinking has changed much about how we engage with screens, the benefits, the costs of engaging with screens. I think a period like this one where you’re forced to use screens just throws into relief how important it is to understand how to maximize the benefits to get as much good from screens as possible and to minimize the costs. So this goal that people have long had of disconnecting completely from screens or going for very long-term sabbaticals from screens, I’ve never thought that’s realistic. It’s not really a goal I’ve ever really had. I think the key is to understand as much as you can about what screens are doing and what different aspects of screen time do to us so that you can then decide how to structure your life. And if we’re spending a lot of time in work meetings in front of screens, that’s probably OK. If screens are allowing us to connect to people we can’t be in the same room with, that’s probably OK.
So one of the interesting things about this period of time is that it's keeping us apart from other people. I was thinking about this: 20 years ago, this would have been a much more difficult time socially. I think we would all have been much more isolated. So that's the positive, miracle side of screens, which I think is very lucky. So really, the issue is just trying to do as much as possible to cultivate the good and to minimize the bad.
It’s important to recognize that screens are not monolithic, that there isn’t one thing known as screen time. You could sit in front of the screen doing work in a way you might have done two or three months ago before this period of time. You could have birthday parties in front of a screen. You could watch mindless content. There are a lot of different things we can do.
I think one important step for everyone to take, especially during this time, is to do a sort of unofficial audit of that screen time and get a sense of what you're doing. Maybe track your usage for a couple of days, whether you're using an app on a phone or just thinking mindfully about what you're doing. And at the end of it all, try to break it down into the benefits, the costs, and what you're actually doing. So understand the content, what you're engaging with, which is what a lot of these screen-time apps will tell you. And what you wanna try to do, I think, most of the time is to think about the opportunity cost of spending time in front of the screen. In other words, if I'm spending time in front of a screen working, typing something or sending an email, I probably would have been doing that anyway, and if it's an important part of my work life, then that's probably OK. If kids are spending a lot of time in front of screens, and it's keeping them relaxed and happy, and it's just something you don't normally do but you're doing more than usual during this period, I think that's OK. But if you happen to have a backyard where you are, or there's a way for you to get outdoors safely while socially distancing, or you have other opportunities for the kids to engage in activities that might be more enriching, then I think you wanna balance that.
Of course there's also the balance that is required as an adult, as a parent. It can be exhausting parenting kids inside, especially if you're in a small space for a long time. So you have to balance your own mental health with what kids are doing. I think there really is no silver-bullet answer here that says, this is the perfect way to manage the situation. It's difficult; there are a lot of constraints on a lot of us. I think the best thing to do is to try to strike that balance and really to think about the feedback you're getting, both from yourself, in how you're responding and how you're feeling, and also from your kids. If they need downtime, screens are very good for that. If you give them the right kind of content and they're enjoying it, I think there's value in that. There's inherent value in something people enjoy, and kids enjoy. And it gives the adult a little bit of a break too. So I think we have to be more lenient than we perhaps normally would be when it comes to screen time.
That's sort of my general rule for this time: be lenient. Do the thing that works best for you, that maintains your mental health, that gives you a break as a stopgap. And if that means slightly more screen time, perhaps of a kind that you wouldn't normally pursue, I think that's OK right now.
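For anyone who wants to try the informal audit Alter suggests, here is one possible way to tally a hand-kept log in Python; the categories, the minutes, and the "displaced a better option" flag are made-up sample data, not a prescribed method.

```python
from collections import defaultdict

# Hypothetical hand-kept log: (category, minutes,
# "did this displace something I'd rather have done?").
log = [
    ("work email",   90, False),
    ("video calls",  60, False),
    ("social feeds", 75, True),
    ("streaming",    45, True),
]

totals = defaultdict(int)
displaced = defaultdict(int)
for category, minutes, displaced_better_option in log:
    totals[category] += minutes
    if displaced_better_option:
        displaced[category] += minutes

# Report the biggest time sinks first, flagging opportunity costs.
for category, minutes in sorted(totals.items(), key=lambda kv: -kv[1]):
    note = f" ({displaced[category]} min had a better alternative)" if displaced[category] else ""
    print(f"{category}: {minutes} min{note}")
```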
Hal Weitzman: Nick Epley, have your thoughts on the use of technology and screens changed as a result of the experience of this crisis?
Nicholas Epley: So like Adam said, I don’t think my thinking on the impact screens have on us has changed over the last month or two, but it does highlight different aspects of the effects that screens can have on us, as Adam was remarking. Screens are just a tool, and we can use them for good or bad depending on what we use the tool for. I study social connection in particular in my research, and what we’ve seen over the last month or two is just how good screens can be for connecting with other people as long as we use them in the high-fidelity ways that they’re good for.
We’re being asked right now to socially distance from each other. But that’s actually a misnomer. What we’re really being asked to do is to physically distance from each other. But we can use technology to keep ourselves socially connected, even at a time when we’re physically apart, as long as we use the technology optimally. And that means connecting with each other in the way that we’re doing here, using voice; video doesn’t hurt either. But in particular, using media that involve voice to connect with somebody else, that really is what we find in our research creates a sense of connection to others.
Hal Weitzman: And Marshini Chetty, what are your thoughts about how people should manage their relationships with screens during this period?
Marshini Chetty: I think that we need to be even more conscious about the way that we're spending time on technology during the global pandemic, and that's because we're dependent on technology for many different reasons. For example, I'm homeschooling my kids, and so now they have to be on screens almost daily just to do their learning. We also use screens to connect with other people, like people we haven't seen in a long time. And sometimes we use screens for entertainment. For example, my kids are on screens right now so that I can be on this call.
So I think that we need to be mindful of how we're spending time, and there are a few different things that we can do. One, I think we have to go easy on ourselves, because to manage anxiety and things like that during the pandemic, we might be on screens more than usual. Two, just like Nick and Adam said, not all technology use is bad, and so we can have positive uses of technology. For example, my kids are on technology for school, and they need to be. But at the same time, I can't forget those rules that we thought about or talked about before this pandemic began, which is: if we're not being mindful about how we're spending the time online, like Nick and Adam said, then it can be time badly spent. So we should make sure that we're meeting our basic needs.
So, for example, make sure you're getting enough sleep. Make sure you're eating enough before you decide to watch that extra episode of whatever it is. You need to have some downtime, right? And the same goes for screen time with the kids. For example, I now have to make sure that they're not spending all their time on screens. Many people are reaching out to me for playdates and things like that, but we can't always be online. So I have to make sure that we have boundaries and that we have screen-free time. And that was the same before the pandemic; it's the same now. Although we're reliant on the technology, we can actually be mindful of how the people around us, and we ourselves, are engaging with it. So for myself as a parent, I have to model good behavior. I need screen-free time, and I need my kids to see me having screen-free time. And the same with them: I need to make sure that they have those boundaries and some sort of routine and schedule. And we can do those kinds of audits that Adam and Nick mentioned.
Hal Weitzman: But our time is up. My thanks to our panel: Nick Epley, Marshini Chetty, and Adam Alter. For more research, analysis, and commentary, visit us online at review.chicagobooth.edu, and join us again next time for another episode of The Big Question. Be well and goodbye. (soft music)