Chicago Booth Review Podcast A Nobel Laureate on the Limits of Evidence-Based Policy
- January 17, 2024
- CBR Podcast
In recent years, there’s been a strong push to elevate the use of data in public decision-making by using evidence-based policymaking. In a 2019 essay for Chicago Booth Review, Lars Peter Hansen of Chicago Booth and the University of Chicago said the term evidence-based policy was “a misleading portrayal of academic discourse and the advancement of understanding.” In this episode of the Chicago Booth Review Podcast, Hansen and his Chicago Booth and University of Chicago colleague Kevin Murphy discuss the issue.
Hal Weitzman: In recent years, there’s been a strong push to elevate the use of data in public decision-making by using evidence-based policymaking.
It would seem to make sense to use data to inform lawmaking. But not everyone is enamored of the idea. In a 2019 essay for Chicago Booth Review, Lars Peter Hansen, of Chicago Booth and the University of Chicago, said the notion of evidence-based policymaking was “a misleading portrayal of academic discourse and the advancement of understanding.”
Welcome to the Chicago Booth Review Podcast, where we bring you groundbreaking academic research in a clear and straightforward way. I’m Hal Weitzman.
Evidence seldom speaks for itself, but instead requires a model or a conceptual framework, noted Hansen, one of the winners of the 2013 Nobel prize in economics. And models have limitations, which means researchers and policymakers alike should be more open with the public about what we still don’t know for sure.
We invited Hansen and his Chicago Booth and University of Chicago colleague Kevin Murphy to discuss the issue in 2019 as part of our Big Question video panel discussion series.
I began by asking Lars Peter Hansen what the basic problem is with evidence-based policymaking.
Lars Peter Hansen: So, of course, as scientists, we want to appeal to evidence. Evidence is important. Evidence never speaks for itself. It requires some type of conceptual framework, and that conceptual framework is often just as important as the evidence. And so when we’re thinking about policy prescriptions, the conceptual framework has an impact on what the policy recommendations are, and it’s very important to kinda clarify both sets of rules here in terms of what type of conceptual frameworks you’re using as well as the evidence itself, because it just isn’t evidence alone.
Hal Weitzman: So is it your feeling that economists, or economics, has sort of put the evidence out to policy makers without the framework needed to understand that evidence?
Lars Peter Hansen: I don’t know how much is economists putting it out versus what the policy makers are embracing, or the like. There’s always a danger that policy makers can sift across the evidence and then look at the source of evidence to support different policy platforms or different policy positions. It’s very difficult to imagine this being some neutral playing ground in terms of scientific discourse.
Hal Weitzman: But, in general, we want to sort of focus less on telling policy makers what to think and more on, kind of, helping them think about how to think.
Lars Peter Hansen: I think it’s both. We certainly want to help them think about how to think, and along the way then, that will improve the type of decision making. That’ll improve the type of outcomes as well. I think it’s very easy to use the political arena to distort how much knowledge we have and what that knowledge is really telling us is pertinent for policy questions.
Hal Weitzman: OK, Kevin Murphy, do you share this frustration?
Kevin Murphy: I don’t know if I would say it’s frustration. I would say I share a lot of what Lars is talking about here, and the idea that we’re limited in terms of how much we know, but yet have a lot to contribute at the same time. I think that’s the key to economics.
In economics, when it comes to policy, as is true for economics generally, it works best when you have a combination of empirical evidence and the underlying economic theory or framework, like Lars talked about. I would actually emphasize another point that I think is consistent with what Lars says, but goes, I think, a little bit in a different direction, which is: one of the reasons you wanna understand and embrace the underlying framework is to understand what counts as evidence on a given question. That is, evidence on a given question is not just an experiment or something we ran on that specific question, but our experience with related and similar events, historically, as well as other policy decisions that we’ve made, or even nonpolicy outcomes that we’ve seen in the past. And so the theory’s important, not just for interpreting a given set of evidence, but for telling us what is evidence.
So for example, if we’re interested in a particular tax, we don’t wanna just look at that one tax as an example of how people would respond. Other taxes can be very informative, and knowing which other cases are very similar, or not similar, to a given case requires understanding the underlying theory. Where evidence-based policy tends to go off track, to me, is that it focuses too narrowly in terms of what’s the relevant set of evidence. It doesn’t draw on our broader knowledge as economists.
Hal Weitzman: OK, does that mean there are, kind of, good data and not so good data?
Kevin Murphy: Well, not so much. What it means is: don’t just look at that narrow question. Say, “Look, we have a wealth of evidence that says when prices rise that people buy less” and if in a given situation you see prices went up and people didn’t buy less, say, “Wait a minute, am I looking at these data correctly, or is there something I’m missing here?” You know, I always like to give the analogy: You don’t wanna say, “Well, I’ve never thrown a bowling ball off the roof of my garage. I don’t know whether it’s gonna fall to the ground, or float in the air.” Well, you know, we have a thing called the theory of gravity. It probably applies to bowling balls on the roof of my garage. Now, there’s some slim possibility that it doesn’t. Very, very slim. I think the same is true in economics. You come up with a surprising result. Don’t just say, “Well, that’s what the data say.” You say, “Wait a minute. There’s something goofy here.” And you wanna say there’s a broader set of evidence.
Hal Weitzman: OK, but—
Lars Peter Hansen: If I could just follow up on this a little. I think Kevin’s making a really fundamental point here. When we build models—you know, part of me is an econometrician—and we build econometric models, one of the aims is to take situations in which you’ve got lots of data richness and to be able to extrapolate to places where you don’t have that richness. The whole essence of the hardest part of policy analysis is to say, “Well, we have this rich evidence over here. What’s gonna happen when I change the environment in some specific way? What are good, sensible things to say, and what are good, sensible predictions to make?”
And I think that’s really, really central to this, kind of, having a conceptual framework. That’s where we need to have it in order to have any hopes of doing very ambitious policy making, because if we’re stuck in the situation in which we have to have evidence that bears on the exact question you wanna ask, we would be very, very limited in terms of what we can provide. And so any hopes of trying to draw on a bigger pool of knowledge really requires this conceptual framework and this conceptual basis.
I just think it’s important to remember that that’s a big part of understanding what the evidence tells us.
Hal Weitzman: Sure, except that extrapolation from one situation to another, from history to present day, or the future, is not always smooth, right? So there could be a challenge in extrapolating from one environment to another. How do you explain that, or how should policy makers think about that?
Kevin Murphy: Well, I think, you look at a whole range of evidence, and you try to look, how consistent are the things I’m trying to draw on? If you see that, you know, there’s a consensus that arises for a broad number of things I look at, I tend to put a lot of weight on that. Experience that we’ve had, in a range of situations that give consistent predictions that seem to fit the data well, get a lot more weight than one-off observations that we see.
And one of the problems with specific evidence is it often is focused on very short-term responses, and one thing we know in economics is that long-term and short-term responses tend to be different. People can make lots more adjustments, and markets can make much more adjustments given time, and those are the hardest things to run experiments on. So experimental evidence often is limited because it has a hard time incorporating, kind of, those market-level responses.
Hal Weitzman: Lars Peter Hansen, you’ve talked about some of the challenges of using quantitative modeling in economic policy, and one of those you mentioned was uncertainty. Policy makers typically don’t like uncertainty. They don’t like ranges of numbers. They like specific numbers. Talk about some of those challenges.
Lars Peter Hansen: Yeah, so, there’s kinda this interesting phenomenon that takes place. On the one hand, economists who wanna influence policy making often make statements with great confidence about outcomes, and then different economists will make conflicting statements, all with great confidence, and then the rest of the public says, “Well, economists, what do they know? They’re just sitting out there making these bold predictions and they can’t seem to agree on some of the fundamental questions.” And I think that, in some sense, undermines our long-term impact in terms of policy.
On the other hand, the way the political arena works is politicians feel obliged to go to the public and say, “Well, here’s exactly why I’m doing this, based on full knowledge and understanding.” And so they tend to gravitate toward people who claim this more-certain knowledge. And if you’re in a situation in which you have some form of uncertainty, that actually gives you the range to look across the evidence and kinda tilt it in the directions that are gonna support your preferred policy outcome. I think it’s really important to understand when that tilting is taking place, because that tilting happens, in part, because of some type of prior beliefs, or prior policy aims, or the like, and that’s where I think things get really muddied up completely.
Hal Weitzman: So the data gets sort of filtered through the framework of ideology, or partisanship, or whatever?
Lars Peter Hansen: Yeah.
Hal Weitzman: OK, Kevin Murphy, you talked about, sort of, you know, the difference between robust data and using historical data. What are, sort of, the basic scientific principles we could point to and say: This is what makes data dependable. These are the kinds of things that policy makers should be looking for, as opposed to, you know, less-robust studies?
Kevin Murphy: I would say it’s the extent to which you’re relying on, sort of, proven concepts in economics. The idea, like I said before, that people will respond to prices, even if I’ve never had a tax before. For example, let’s assume we had an industry, say the oil industry, and you’ve never put a tax on oil. Well, we know a lot about how the oil industry works from other things that happen. It’s an industry that gets lots of shocks to both supply and demand. We can see how those markets respond. So if you wanted to say, what’s the effect of a tax on oil, you don’t wanna limit yourself to tax-based evidence. There’s tons of evidence about how buyers respond to higher prices, and how sellers respond to higher prices. And I think one of the problems, as I said before, is that when people come to the tax question, they say, “I gotta look for a state that put a tax on.” Well, a state putting a tax on is very different from a national tax, or they put the tax on for a reason. Again, you wanna broaden your set of evidence, but you also wanna temper your predictions. Say, “Look, I can’t tell you exactly how this is gonna respond.” For most of these kinds of responses, we know plus or minus 50 percent is not a terrible number in terms of how we think people respond, but nonetheless, it’s very helpful. And you don’t wanna understate that economics has had an effect.
I mean, look at macroeconomic policy around the world. It’s improved dramatically, and I think economists have contributed to that improvement. A lot of it was learning that we didn’t know as much as we thought we knew. To Lars’s point, probably the greatest improvement in macroeconomic policy has been the recognition that our ability to understand and manipulate the economy is a lot weaker than we thought, and we’ve responded accordingly with, I think, largely positive results.
So, often, that “we know less” message can actually be helpful because it discourages people from doing things that are out-and-out harmful.
Hal Weitzman: Lars Peter Hansen, you talked about economists being confident—maybe overconfident sometimes—in their predictions or in their statements. Is there a structural problem here that might feed into this to do with the way economics research is published, how it gets out into the world? The sort of first draft of an economics paper is a working paper, and that working paper may not be accurate, or may not bear complete resemblance to the final paper that’s published and peer reviewed and has been through that process. But the working paper is the one that often makes it into the media, or makes it to policy people. Is that a problem?
Lars Peter Hansen: I find that to be potentially problematic. You know, the thing we wanna avoid is, kind of, what happened in physics with the cold-fusion phenomenon, in which a premature announcement of some incredible advance took place that turned out to be undermined very, very quickly. In economics, things often don’t get undermined quite so quickly, or in quite such a dramatic fashion. But I’ve always thought, and been convinced by this—in fact, Gary Becker was one who just always hammered this home when we worked together at the Becker Friedman Institute—that you should communicate things about the stock of knowledge: things that have accumulated, things that have been replicated, things that are basic, for which we have lots of supporting evidence from a variety of sources.
Communicating the latest working papers, by contrast, is like communicating about the flow. Well, sometimes that flow is flimsy. Sometimes important nuggets come out. But it takes time to distill it all and to figure out how important it really is. And so this rush to the media—I understand universities love to get publicity this way, and it presumably helps them in fundraising—but it’s not the best way to communicate scientific evidence, at least in my view. I think it’s highly problematic.
Kevin Murphy: I would share that view, and I don’t think it would be limited to working papers. I think this distinction between the stock of knowledge and the flow of knowledge is critical, because the stock of knowledge, as I said before, is valuable. There are a lot of things we’ve learned in economics. They can be helpful to business people, they can be helpful to individuals, and they can be very helpful to policy makers.
The flow of knowledge, I think of as essentially toxic. You would not want to consume the flow of knowledge. It gets filtered to the point where ultimately it becomes valuable, but most of what comes out is either not correctly interpreted or wrong. I would include my own research in that. I’ve changed my mind on things over the years. At the end of the day I think I’ve learned a lot, but it took years of filtering my own research, let alone the research that’s going on more broadly.
And so again, economists have a lot to bring to the table, but the flow is just not what you wanna consume. And that would apply not just to working papers but, I think, even to what ultimately ends up in the journals, because if you go back and read past journals, yeah, there are a number of articles in there that turned out to be really, really profound and have stood the test of time, but a lot of them have been forgotten or overturned. And it’s not even so much being overturned; it’s just not the right way to look at it, not the right interpretation.
Hal Weitzman: Framework?
Kevin Murphy: Yeah, exactly! It’s not even being wrong in the sense of, oh, there was a mistake, and that formula was incorrect. That’s usually not what we mean by a “mistake” in economics. It’s just like, wow, I wasn’t thinking about it right. I really needed to think about it another way and I would’ve interpreted the answers differently. And that takes a lot of time, and again, I think we wanna work with that stock of knowledge and leave the flow to percolate and become part of the stock before we really use it.
Hal Weitzman: And I wonder, with the rise of behavioral science and the influence it’s had on economics, whether that problem might be aggravated, because there are a lot of laboratory experiments, a lot of evidence there, that are hard to replicate?
Kevin Murphy: It may be easy to replicate in the lab. What’s hard to replicate are the implications that people would draw from it. It’s like, what do I really make of that lab experiment? What do I make of the fact that if I give you a dollar, you give some of it to the guy across from you? Does that really mean you care that much about people? Do I see that behavior in the actual world? It’s gonna be different, and that takes its own time.
I’m not saying lab experiments are useless, just that they’re part of that same flow of information. It’s like the latest regression somebody ran. It’s like, you learn something, but you learn a lot more when you can look back at that and 400 other regressions and a bunch of theoretical work to try to understand the output of those regressions, and you end up interpreting the evidence very differently, and what seemed to be clear at the time turns out later, when you look back and go, “Wow, that’s not at all a good description of what happened!”
Lars Peter Hansen: To me, this gets back to another point that Kevin was making, this short-term/long-term distinction. Lots of laboratory experiments, by their very nature, can at best hope to capture very, very short-term responses to things. Yet, for a lot of policy questions, the long-term responses are really important. And the same is true of empirical evidence. It’s often very challenging to get sharp estimates of long-term responses, but yet, those are often some of the most important things to try and measure.
Hal Weitzman: What would be your response—both of you have identified this challenge. What would be your response? Do you stem the flow? Do you direct the flow elsewhere?
Kevin Murphy: I think it’s fine to have the flow. It needs to percolate in the profession. I think we should try to resist pushing it out beyond the profession, at least in the near term, until it’s been evaluated and had some tests. But I think it’s also incumbent on those economists who go on to get involved in policy to actually step back and say, “What is it that I can say with confidence, and how can I communicate the degree of confidence I have?”
If they do that, I think they can be very helpful to people. They’ll find themselves making much less bold statements. They’ll find themselves saying “Well, here’s what economics teaches us. It doesn’t give us the full answer, but it helps us understand.” But I think at the end of the day, the policy makers will value that, and I think that’s what you need to do, and so you need to resist the temptation to be the latest sensational news outlet. I think that’s the biggest problem.
Hal Weitzman: Is there a danger that, if the best economists were to do that, those whose voices are loudest and most confident, but whose research might be less robust, would therefore have more influence?
Lars Peter Hansen: I guess it depends on what you think of as influence: whether influence is how to get in the newspaper in the next couple of years versus what impact your ideas have 10 or 20 years down the road. I tend to think that in a scientific discipline like economics, it’s these longer-term impacts that are really the important ones, which we ought to be aiming for, and for the short-term answers, the half-life of those is likely to be sufficiently short that they’re not all that valuable.
But that’s not what the press is gonna jump on, because part of that is how do we keep people entertained in terms of reading our newspapers and stuff. I’m not sure I have great answers to that. I do think that we often feature so-called op-ed pieces, and I have no objections to op-eds. My biggest frustration there is there’s not a lot of platform for talking about the evidence part in a very serious way, and way too often the distinction between the opinion part and what’s grounded in scientific knowledge and understanding is not at all clear from the op-ed. I don’t object to economists having opinions. I think it’s fine if they have opinions. I just think it’s important to also differentiate what’s opinion and what’s grounded in hard evidence.
Kevin Murphy: I would say it depends on the channels—again, this is related to what Lars said—the channel through which you’re trying to affect policy. I’ve had really great experiences when I’ve gone to talk to people about policy at, for example, the Congressional Budget Office, and at a variety of other groups. They ask great questions. They’re interested in what the research has to say. They’re interested in the limits of what we can say, so—
Hal Weitzman: You’re talking about people who are on the staff of Congress—
Kevin Murphy: Yeah, the professional policy people—
Hal Weitzman: Not elected politicians.
Kevin Murphy: Yeah, because they really have an interest. If you really want to add value, to me, that’s a great place to try to talk to people, ’cause you have a waiting audience. Again, it’s not a surprise that the central banks have listened and responded, because they have a professional staff. The people in charge often have a lot of interest in knowing the right answer. It’s not surprising in political debate there’s not a whole lot of interest in knowing the right answer, right? That’s not the name of the game.
Again, economics has a lot to say. It’s not surprising that in situations where there’s not much incentive to get the right answer, you don’t get the right answers, and that in cases where people are incentivized and have to live with the consequences of bad policy, there’s much more interest in getting that valuable input. Those are the places I think economists can really have the most positive effect.
Hal Weitzman: Right.
Lars Peter Hansen: I guess I’d be happy to echo that. I think that’s true not only in places like the Congressional Budget Office but also in the research departments of central banks, all the way up to the leaders of those departments. They are very, very keen on, kind of, what the state of knowledge is, and we can have very informed conversations at that level. It’s not gonna show up in the press the next day, but those types of impacts can be durable, I believe.
Kevin Murphy: Absolutely, I agree. I would include central banks; indeed, as I said before, I think those are among the places where there’s been the most listening to the kind of thing Lars and I are talking about, but also some of the biggest impact: how they run their business has really been changed by that back-and-forth. And it is a back-and-forth, because they’ve asked tough questions of people in the academic community. “Well, what about this? We need to worry about this problem. You can go off and worry about your problems, but we got this real problem, can you help us?” There’s been a lot of back-and-forth, I think, between central bankers and academics that has proved very useful.
Hal Weitzman: It strikes me that central banking might be a case apart, in the sense that central banks are often run by academics and staffed by academics; they’re perhaps the most academic of all policy-making departments. So I wanted to ask you about some specific challenges in policy making more generally, and I know you have some views, Kevin Murphy, on the challenges of translating data, or research, over to issues like the minimum wage, or tackling inequality more generally.
Kevin Murphy: Well, again, inequality, for me, is where a framework really helps, because when you talk about inequality, you can talk about it in terms of the outcomes and say these people are the winners and these people are the losers, but that’s not very helpful for thinking about: Why did this happen? How do you change what’s happened? What’s a logical policy response? There you really gotta get back to the underlying economics. What’s driving up the wages of one group in the labor market, for example, and pushing down the wages of another group? I think there we have lots of information that’s helpful for understanding that. For example, the relative demand for different skill groups in a population is an important part of the story, but the supply side is also very important, and one of the things that we know is that if you were to reduce the supply of people in a group for which wages are depressed, that will push up wages.
There’s overwhelming evidence from the US, and elsewhere, that supply matters, and the response of supply to growth and changes in demand, to me, is the natural response on the human-capital and inequality side. But it requires a framework: thinking about the world not as “Oh, these greedy guys have gotten better off and these poor guys have gotten worse outcomes.” You have to ask what’s underlying it.
So you need your framework, but once you’ve got the framework, you can start talking about policy in a sensible way. You could also help people understand that it took us a long time to get into the situation we’re in now, and it’s gonna take a long time to get out, owing to the fact that human capital, that is, people, is one of the most durable assets in the economy. There aren’t that many assets that you produce today that are still gonna be around, and still be an important part of the economy, half a century from now. People are one of those, and if we’re not investing in people the way we should, we’re gonna suffer the consequences for decades to come. If we make a mistake on other margins, it costs us for a few years, maybe a few months. If we make a mistake with people, it costs us for a long time.
Hal Weitzman: Lars Peter Hansen, what about some specific issues that have posed challenges? Maybe climate change is one. Putting a price on carbon has been a difficult thing for economists to do, and maybe it has been done in the way we talked about: not as a range, as the research suggested, but in a very specific way, putting a single specific price on it?
Lars Peter Hansen: Yeah, I would probably go beyond just putting a price on carbon. I think the price of carbon has been somewhat of a naive conversation in many respects. The way economists deal with it, myself included, is, “Well, here’s this tax,” while ignoring the implications of other taxes operating simultaneously. We envisioned that all countries would somehow coordinate on this, or the like, and it’s not necessarily the best policy lever, or even the most realistic policy lever per se. But I think the more basic question has to do with framing any type of policy. Here is where uncertainty becomes important, and here’s where empirical evidence is of very limited value, because we’re talking about moving economies potentially into regions in which we’ve had very little experience, so we can’t just kind of go run a bunch of regressions and magically get credible numbers out of this.
When they go and appeal to evidence from climate science, well, climate-science models are very elaborate, very sophisticated, but on these types of questions, first of all, they need inputs from the economic side, and second of all, their answers aren’t all that sharp. There’s lots of divergence across different model predictions and the like. So this is, again, a case in which I think you can say some things. You can say some things, but to imagine that we’re gonna come up with some single number, such as a social cost of carbon, pass that on to the pre-Trump EPA, and have them use it as a policy lever to try to design policy is really quite naive.
Even the academic literature on this has these numbers all over the map. So I think the more basic question, for me anyway, is that this is a key example where you need models. You need some type of economic framework as well as a geophysical framework, put together and kinda interacting, to get any type of credible treatment of what are good and sound policies going forward.
Kevin Murphy: Yeah, I think there are two big issues—more than two, but at least two—in what Lars talked about. One is, again, uncertainty over what, quote, “the number” is, if there were a number. The second is understanding what that number even potentially means and how you would actually go about using it. So, for example, one of the things that was done was people said, “Well, we’re gonna have this thing called the price of carbon, and all decisions are going to have to build in the price of carbon.”
Well, there’s a problem with that. The price of carbon exists in a world in which carbon is priced. To talk about the price of carbon in a world in which carbon is not priced is economically not valid. So, for example, I can’t go out and take a bunch of EPA restrictions on fleets of automobiles, evaluate whether they’re good or bad, and say any reduction in carbon emissions associated with the vehicles in that fleet should be priced at this, quote, “price of carbon.” That’s just not correct. That is, the framework in which that number would apply doesn’t apply to that situation, and so this idea that we should incorporate that price into all decisions isn’t even supported by the models that were used to generate that price. That’s a policy that got off the rails. It’s like, OK, if I did this policy, then I would have this thing called the price of carbon, and it would have some meaning. It would still have uncertainty; there would still be all kinds of other issues. But to say I’m not in that world in which there is this thing called the price of carbon, I’m in this other world in which we have a haphazard mix of policies, and then I’m gonna use this thing from that world in this world, makes no economic sense, and if you thought about the underlying model that generates—
Hal Weitzman: So would it have been better not to have had a price at all?
Kevin Murphy: You don’t have a price. You haven’t priced it. And if you haven’t priced it, then the different margins on which you’re using carbon are going to have different social impacts. So, for example, if I save carbon emissions, well, that’s not the end of the story. That carbon’s gonna flow to somebody else, and if they’re gonna put it in the atmosphere, then I haven’t really saved anything.
In a world in which we priced carbon across the board, I don’t have to worry about that. That’s the unique environment in which I do not have to worry about what’s gonna happen to the carbon I don’t use, because the value of that carbon is equalized across all those other alternatives. Whether it gets used? It’s only used when people are willing to pay the price, and it won’t get used if people aren’t willing to pay the price. But when those other alternative uses are not priced, this idea that I should act as if it is priced is just not supported by economics. That’s an example where people took one framework and then said, somehow, magically, that’s gonna tell me how to act in this very different world, which it doesn’t, unfortunately.
Lars Peter Hansen: What a very large amount of the academic literature computing the social cost of carbon really does is compute what’s called a Pigouvian tax rate. It’s like there’s this externality out there, and let’s figure out the right tax on that externality. It comes out as a price when evaluated at the efficient, socially efficient outcome taking into account that externality. So this is where Kevin was talking about the mismeasurement aspect of things. I mean, that conceptual framework isn’t the one we’re dealing with right now on a day-to-day basis, so you can’t just extract a number, apply it in that particular fashion, and pretend it’s gonna be valuable in this other setting.
Kevin Murphy: Yeah, it’s not even like it would just be, well, it’s a little bit off. It could be like miles off, like not even close.
Lars Peter Hansen: Yeah.
Kevin Murphy: The fact that it is the number—even if you said in the [inaudible] world where we had a Pigouvian tax, it would be the number—it ain’t gonna be the number in the world that we actually have, and you gotta think carefully about what the alternative is. That doesn’t mean you can’t do anything; you just have to say, for this experiment, is something like that number likely to be close? And that’s gonna vary depending on the experiment, whether it’s gonna be close or far from that number. But you can’t just grab a number from one situation and throw it into another, and that’s where understanding the theory . . . You don’t just hand this number over to policymakers. “I got this number, here you go. Use it however you want.” It don’t work like that.
Hal Weitzman: Right. I wanted to ask you finally about machine learning. Lars Peter Hansen, will machine learning, could it help us process this flow of information that you talked about, or could it help deaden some of the ideological frameworks that people come to these data with?
Lars Peter Hansen: So machine learning—
Hal Weitzman: Or will it just replicate those ideological frameworks and aggravate them?
Lars Peter Hansen: So, machine learning can mean various things. I think of machine learning as a combination of some very clever computer algorithms designed to allow us to go find patterns in large-scale data sets. In the private sector, to the extent these methods have been successful, it’s been for very, very short-term forecasting and not for more-basic policy questions.
What I think could be attractive to economists going down the road is to kind of leverage some of the computational tractability that has come out of computer science. Statisticians are now trying to think more formally about how to rationalize and justify these methods, but the key is to put explicit economic structure on things, to actually integrate them into a formal economic framework. Because without that, we’re back to this “let the data speak”–type mentality, and you’re gonna be stuck answering very kind of short-term prediction–type questions, at best.
Kevin Murphy: I would agree. I think machine-learning tools, as inputs into the economic analysis we talked about before that tries to combine an underlying framework with data, could be very helpful. One of the problems we’ve always had with very explicit models that people write down is: To what extent are those really helping us because they capture the things we are confident we wanna build in, and to what extent do they appear to help us because they impose some very ad hoc restrictions on the data? In some ways, machine learning can help with that, but not as it’s typically applied now, which is much more framework free. I’m not saying it has to be that way, but a lot of the ways people would apply machine learning today is basically for prediction: I don’t care why it works, but it works.
For policy questions, that’s rarely that helpful, because the why it works is probably gonna change when you change policy. But as a tool—just like econometrics was a very useful tool for helping us understand data. There were tremendous mistakes we were making before we made progress on econometrics. But it’s an input into that process; it’s not really a substitute for the overall process. I would say add it to the mix and the tool kit we have, but stick within that point of view: economics is fundamentally about concepts, principles, and data combined usefully together.
If you wanna know how to do that, go read Milton Friedman. He was one of the best out there at combining data and empirical evidence. And you know, he had his ax to grind, so you gotta separate a little bit of that out.
Lars Peter Hansen: Yeah.
Kevin Murphy: But most of the ax he was grinding, he came to it as a conclusion. That’s the thing about Milton.
Hal Weitzman: Perhaps we can never escape the ideological framework completely.
Kevin Murphy: No, but you gotta be careful, because a lot of ideological framework, well done, comes from a broader set of experiences that people have had. Look, I think the world kinda works this way, ’cause I’ve found that helpful 99 other times—don’t ignore that part.
Hal Weitzman: That’s it for this episode. To learn more, visit our website at chicagobooth.edu/review. When you’re there, sign up for our weekly newsletter so you never miss the latest in business-focused academic research. This episode was produced by Josh Stunkel. If you enjoyed it, please subscribe and please do leave us a 5-star review. Until next time, I’m Hal Weitzman. Thanks for listening to the Chicago Booth Review Podcast.