I earned $4.25 an hour at my first job. It was at Sagebrush, a now-defunct clothing chain that sold cheap jeans to working-class Midwesterners. In the 1990s, when I worked there, $4.25 was the minimum wage, a callous phrase that aptly captures employers' intention to do the very least they can by you. I took home about 70 bucks a week after taxes—until I was held up while working the register. If you can't put a price on life, you certainly can on labor: after reassuring me that my safety was all that really mattered, the regional manager reassessed my value to the company and gave me a 10 cent raise. Now I made $4.35 an hour, or nearly $72 for an 18-hour weekend.
My second job was cleaning dorms as an undergraduate at Harvard. The pay was considerably better, $9.65 an hour, but it was beastly work. If retail is a never-ending war against the tedium of monotonous tasks and the temerity of capricious customers, cleaning is an all-out assault on corporeality. You are constantly straining or squatting or sprinting up stairs. The work was especially gruesome in June after school let out; a month of 80-hour weeks left my body hurting in ways that make me cringe now.
But once those four weeks were over, I often had two grand in my pocket, enough to fund an unpaid internship, a summer ritual for academic overachievers that, in the status-conscious precincts of higher ed, is the antithesis of scrubbing toilets. In many respects, passing between these workplaces was jarring—so jarring as to seem slightly grotesque.
Work is a curse, or at least that's what I learned in Bible study. When Adam and Eve were driven out of Eden, they finally had to work for a living. We all do—well, nearly all of us—but not all jobs are created equal. That's certainly the lesson I've learned, though at times it seems more like a dirty secret. In fact, some jobs can be so rewarding (monetarily, intellectually, even morally) that we lose sight of the burdens others shoulder. Their work becomes invisible to us—even when they are toiling on our behalf.
“What you don't necessarily realize when you start selling your time by the hour is that what you're actually selling is your life,” Barbara Ehrenreich notes in Nickel and Dimed, her best-selling account of trying to make ends meet on low-wage work. Ehrenreich was writing her blue-collar correspondence right about the time I was pulling double shifts scrubbing steel showers. What united our efforts—in addition, of course, to scrawny pay stubs and occupational hazards—was their transitory nature. Over three months, Ehrenreich tried various jobs that all paid less than $10 an hour, knowing full well she was a tourist in the penal colony of the working poor, whereas my hard labor might be regarded as either a stepping stone or even a rite of passage and, thus, a way station in the work world rather than a terminal.
In either case, it wasn't where we “belonged,” an invidious description that often sits comfortably on the tongues of those who tend to sentimentalize manual labor as conducive to “building character.” Certainly, in terms of a moral education, there is much to commend an extended engagement with hard, repetitive, and deeply unglamorous work, but only if the ultimate lesson more closely approximates “There but for the grace of God go I” than the bootstrapping bunkum of “Anyone can do it!” The enthusiasm of the second lesson treats such work as a kind of crucible of capitalist advancement, a challenging experience, to be sure, but one that individuals pass through on their way to bigger and better things—unless, of course, they suffer from some kind of personal failing.
The truth is that most tasks of a menial variety do not constitute "starter jobs." They are just jobs, the work that people do to make ends meet. This is not to say that the individuals who work them don't want more, simply that the "more" in question tends to be "more pay." They don't view the jobs they have as merely another rung on some endless career ladder. That idea is essential to the striver's ethic and, perhaps, necessary to progress in a competitive enterprise system, but it can create the impression that the waitress, the janitor, or the field hand—or, for that matter, the Uber driver, the floorwalker at Walmart, or a member of the invisible army packing boxes at Amazon—is undeserving of any larger consideration than the invisible hand already grants them.
The danger of an ever-widening circle of wealth, the late John Kenneth Galbraith wrote in 1958 in his most famous book, The Affluent Society, is that “we will settle into a comfortable disregard for those excluded from its benefits and culture.” What’s more, he continued, “there is the likelihood that, as so often in the past, we will develop a doctrine to justify the neglect.”
Willful neglect of the working poor may be odious, but it does have the benefit of being self-aware. Though they often give little evidence of being especially informed, the opinions sanctioning such neglect, whether or not they congeal into a proper “doctrine,” are still susceptible to principled engagement and possible change. The far greater danger is simple obliviousness. Ehrenreich warns of it in the conclusion to her book. “Some odd optical property of our highly polarized and unequal society makes the poor almost invisible to their economic superiors,” she writes. “The poor can see the affluent easily enough—on television, for example, or on the covers of magazines. But the affluent rarely see the poor or, if they do catch sight of them in some public space, rarely know what they’re seeing.”
What accounts for such blindness? It may be due, in part, to the cradle-to-grave comforts that, especially in the United States, so many people enjoy today, a condition that distances us from an empathic appreciation of, or even exposure to, a workday consisting of drudgery.
What is drudgery? It’s not merely tedium or time-consuming tasks. Many first-year analysts at investment banks spend enough time staring at spreadsheets to long for the consolations of a jackhammer. No, if as a society we tend to valorize overwork and deprecate leisure in a manner that would have stunned our forebears after a summer day bent double in the fields, those who bumptiously embrace work-life imbalance aren’t at the head of the queue for sympathetic consideration, even if one might think they should seriously get their heads checked.
Drudgery, according to Galbraith, is work for which the only thing that commends it is a paycheck. “It is fatiguing or monotonous or, at a minimum, a source of no particular pleasure,” he says. “The reward rests not in the task but in the pay.”
In contrast to those who so labor, Galbraith described members of a “New Class,” a charmed group of individuals whose work included “exemption from manual toil; escape from boredom and confining and severe routine; the chance to spend one’s life in clean and physically comfortable surroundings; and some opportunity for applying one’s thoughts to the day’s work.” This class of individuals was “new” for Galbraith insofar as the ghosts of the Great Depression still haunted Eisenhower’s America, and it should be emphasized that what characterized these people as a class was not only that work was its own reward for them, but that it was rewarded, too, at least well enough to afford the upper-middle-class comforts and bourgeois peace of mind we associate with affluence.
Galbraith was born in 1908 and therefore was old enough to remember not only both World Wars and a Great Depression but also a time before child labor laws, central air, and the polio vaccine. He, like everyone else of his generation, was intimately familiar with a world that could be cruel and unforgiving regardless of one’s position in society. Cash and connections could no doubt insulate one from a considerable degree of grief and suffering, but in a world where one out of every 10 children died as infants, as they did in the US at the dawn of the 20th century, and no amount of money could replace a bum hip, suffering was far more democratic. No one escaped the gallows of despair.
But the world of 1950s America not only afforded the opportunities for a far more humane existence, there also seemed to be a growing community of people, Galbraith contended, who had escaped old Adam’s curse altogether. They could take for granted that work “will be enjoyable,” and if not, that it was a “legitimate source of dissatisfaction, even frustration.” It wasn’t so much that work need not be drudgery; it should not be drudgery. It should be an activity, however demanding, that is filled with personal satisfaction and even pleasure, a condition that involved the very nature of work as well as its reward.
Three generations now stand between us and the debut of Galbraith’s book, and affluence in the US has only grown. The “New Class” is not new anymore, and it’s far larger. Many people are several generations removed from familial touchstones of toil and economic turmoil, and those fortunate enough to grow up in such blessed conditions live and work at a kind of experiential remove that makes a life of drudgery, and the mishaps of poverty that so often attend it, at best a kind of academic phenomenon, something one can identify and even opine on but hardly understand.
I see this in my own classes at Chicago Booth, disproportionately attended by young people who are familiar with upper-middle-class comforts. A statistic that seems to catch my students especially off guard is the median household income in the US, which, according to the Census Bureau, was just over $63,000 in 2018. Given that many of these students have already enjoyed annual salaries double this amount and have done so without also discharging any of the obligations of parenthood or even partnership, the number is a shock to them. (“People live on this?”) And immediately on the heels of this shock is the bracing realization that half of American households get by on less, often far less. Indeed, in 2018, for those under 65, the federal poverty line was set at individuals making $13,064 or less—or, for a family of four, a maximum income of $25,900. That year, 38 million people made the cut, a number slightly smaller than the combined population of the 23 smallest states.
“Most of the people I write about in this book do not have the luxury of rage,” David Shipler contends in the opening of his best-selling book, The Working Poor: Invisible in America. “They are caught in exhausting struggles. Their wages do not lift them far enough from poverty to improve their lives, and their lives, in turn, hold them back.” If such individuals do not have “the luxury of rage,” that is because luxuries of any sort, material or emotional, are well out of their reach. Whether they are entitled to feelings of rage, however—or, more to the point, whether those of us a bit more blessed should vicariously indulge them on their behalf—is a moral question, one with substantial social and political implications.
As an ethics professor, I have strong feelings on such matters, while also strongly believing that the lectern is not a pulpit. Students are welcome to come to any conclusions they like. They might, like Galbraith, resolve that, when people “cannot have what the larger community regards as the minimum necessary for decency,” they are therefore “degraded[,] for, in the literal sense, they live outside the grades or categories which the community regards as acceptable,” a belief that led him to conclude that “one of the central economic goals” of a civilized society is “to eliminate toil as a required economic institution.”
On the other hand, they might conclude that rage is only a proper response to manifest injustice, and insofar as the price of one’s labor is determined like the price of any good or service in a free market, by the intersection of supply and demand, if the market determines one’s labor is only worth $15,080 (the annual salary for someone working 40 hours a week for a full 52 weeks at a minimum-wage job), well, there’s nothing unjust about that. You are no more required to take such a job than I am required to pay you a penny more than I must for your services. In the spirit of Milton Friedman, we are both free to choose.
But choice, of either a moral or professional variety, is only meaningful if it is informed. That requires, at least among members of Galbraith’s “New Class,” a kind of curiosity about the lives of those less fortunate, a patient inquisitiveness that imparts a sense of moral perspicacity.
Such an engagement recalls an observation by George Orwell in The Road to Wigan Pier. “I had read the unemployment figures,” he wrote, “but I had no notion of what they implied.” Orwell meant that such figures were meaningless to him in a moral sense if he knew nothing about the lives they intimated. He set about remedying his ignorance by undertaking an experiential regimen that, if more involved, approximates the advice I give to my own students: seek to understand the lives of those who suffer on your behalf.
As he chronicled in The Road to Wigan Pier, Orwell went to observe the lives of coal miners in England, at least as they appeared in the mid-1930s. The second chapter of his book is extraordinary, a literary Grand Guignol of the grimmest working conditions. Orwell concluded it by reflecting on what had changed for the miners in recent years, and what remained the same for his readers, in addition to the connection between them. “It is not long since conditions in the mines were worse than they are now,” he wrote.
There are still living a few very old women who in their youth have worked underground, with the harness round their waists, and a chain that passed between their legs, crawling on all fours and dragging tubs of coal. They used to go on doing this even when they were pregnant. And even now, if coal could not be produced without pregnant women dragging it to and fro, I fancy we should let them do it rather than deprive ourselves of coal. But most of the time, of course, we should prefer to forget that they were doing it.
To forget that someone is doing some terrible task on our behalf relieves us of the trouble of taking any responsibility for it. The work is absorbed into some wonderful, mysterious world whose benefits, like manna from heaven, seem to appear from nowhere and drop right into our laps.
Ignorance is undoubtedly bliss, but it is always morally unbecoming.
John Paul Rollert is adjunct assistant professor of behavioral science at Chicago Booth.