Expertise is important, but its predictive powers are limited.
- February 16, 2017
- CBR - Behavioral Science
I’ve always been a little bit worried about my own ability to forecast research outcomes. There’s little data about whether academic experts are better than a smart layperson at predicting certain types of outcomes.
We ran a large online experiment where we had people quickly push the A and B buttons on a keyboard. We then asked 208 academic experts who study decision making (economists, behavioral economists, and psychologists) and several hundred students and nonexperts to forecast which incentives would motivate people to work the hardest.
Academic experts [professors] did a pretty good job forecasting the results, but by several measures they did no better than students or other nonexperts.
I’m not suggesting that when you’re not feeling well, you should go to a neighbor’s house instead of a doctor. Experts have a role to play. But if I’m at a company, perhaps trying to determine what website design format will work best, the advice of people without special expertise may be as accurate as the advice that comes from a decision-making expert.
Devin G. Pope is professor of behavioral science and Robert King Steel Faculty Fellow at Chicago Booth.