[Image: female warehouse worker sitting and looking tired. An inventory of inequality (2022) | Quality Stock Arts / Shutterstock]


I know someone, a nurse, who doesn’t have health insurance. His employer, a staffing agency, bounces him from assignment to assignment—sometimes with only a day’s notice. Engaged in skilled care work that is profoundly dependent on the ability to maintain human relationships, he often finds himself treated more like a commodity than a care professional.

Contrast this with another example: while on a recent group run, a bubbly young woman described how her employer, an athletic gear company, supplies a stipend to support employee “wellness” on top of free entry into footraces. These perks, it was understood, came in addition to the package of benefits (health insurance, paid leave) associated with many white-collar jobs. The runner was full of praise for her company’s largesse.

How would we characterize the difference in the way these two workers, both professionals, experience work and its role in their lives? Why is a job allowed to dictate one’s experience of living, and why do none of the recent tools to measure job quality directly acknowledge this? 

In 1990, Swedish social policy scholar Gøsta Esping-Andersen developed the concept of decommodification to describe how national social policies can insulate citizens from constant reliance on selling their labor power in order to live. Decommodified workers, Esping-Andersen found, can step away from work to take a paid sick day, a vacation, a sabbatical. They are secure in the knowledge that their value comes from more than their ceaseless productivity; that they will receive a livelihood and benefits, even in periods of non-work.   

In Esping-Andersen’s survey of welfare models under capitalism, decommodification is characteristic of social democratic regimes and their provision of universal social programs and labor protections. Though it varies cross-nationally, decommodification is stable within the same country, offering citizens a degree of shared experience. 

The United States offers few such social programs: no universal healthcare, no universal old-age pensions, no affordable higher education. Without these protections, work has become the site of decommodification for some and commodification for others. For some workers, a job is the primary source of access to non-work life and its necessities (care of the body and mind, time in which to rest and be creative, and the promise of future non-work). A set of conventions that emphasizes job-as-relationship insulates these workers while on the job: long-term retention, salary, discretion over task performance. But for other Americans, work provides little or no decommodification. Rather, it actively commodifies them during work hours, through time-based pay and hiring for minimal time increments, lack of predictability, and surveillance. This distinction creates a divide in the life quality of American workers that is profoundly underestimated by the popular concept of “job quality.”


The COVID-19 pandemic occasioned a massive disruption in labor markets, and many workers, particularly those in the service economy, lost their jobs. But when the recovery came and employers wanted workers back, businesses struggled to hire and retain them. Labor economists trace this “Great Resignation” in historically high quit rates, which peaked at a never-before-seen 3 percent in 2021 and 2022.

Suddenly, the workforce development field, where I work, was inundated with reports of an unfolding “crisis in job quality,” as well as a new crop of essays and toolkits produced by think tanks that purported to define and measure the quality of work.

These writings encourage employers to improve working conditions, stressing the benefits of a more motivated workforce. In its statement of purpose, the Aspen Institute’s Job Quality Center of Excellence calls on employers to provide “Working conditions that are safe, free from discrimination and harassment, and welcoming of workers’ concerns and ideas for improvement,” “Stable/predictable work hours,” and “A package of benefits that facilitate a healthy, stable life.” The Good Jobs Institute observes that “a good job needs to meet people’s basic needs and offer conditions for engagement and motivation.” Employers are encouraged to use these metrics to measure their progress in meeting criteria of job quality for employees.

None of these are bad ideas. But “job quality” is inadequate. First, the phrase suggests concern with a delimited area of a person’s life. But in talking to people in my circle—nurses, delivery drivers, teachers, bureaucrats—I soon observed that work impacts much more of a person’s life than is acknowledged by standard job quality metrics. It also became clear that the role of a job is seen differently by different kinds of workers. 

Some of the people I talked to viewed their job’s primary function as providing them access to the stuff of living. These tended to be fellow state workers and workers in white-collar private sector jobs. Others saw their job as a necessary evil, a relationship that gave them back only as much as it took to keep them alive and working. These conversations weren’t just about the quality of a job, but about its role in either providing or curtailing the quality of life.

Such differences are not captured by existing “job quality” metrics, which assume that workers in low-wage, low-benefit jobs simply need better access to skill training. The poor quality of jobs that lack certification or degree requirements is more or less acknowledged; if these fast-food workers, agricultural laborers, and retail clerks could only get trained and certified for “in-demand” jobs, the thinking goes, labor market inequality would be resolved. But take the example of healthcare, now the fastest-growing industry sector in the country. Most healthcare jobs require certification or licensure (Certified Nursing Assistant or Licensed Vocational Nurse, to name two). Yet despite being certified professionals in an expanding field, many healthcare workers face such negative working conditions that they either quit or are made sick by their jobs.

Back to the example of my two acquaintances, the nurse and the athletic gear company employee: both are professionals with the requisite licensure or degree. Both fill a need (although caring for developmentally disabled adults and diabetic schoolchildren surely ranks higher on any hierarchy of societal priorities than marketing new running shoes). Nor is the issue pay alone: according to California’s Employment Development Department, Labor Market Information Division, nurses earn between $37 and $66 hourly. Yet a shift-based pay structure and a lack of benefits leave the experiences of these two individuals miles apart. We need a way to account for the gravity of this situation.

The second limitation of “job quality” is that it suggests a static experience. But the decommodification or commodification created by a job is generative. For instance, whether or not a job subsidizes training in marketable skills (a benefit of many white-collar but few service jobs) expands or limits its occupant’s future well-being.

A system of private benefits favors certain kinds of workers over others, ordering and enforcing social inequality. This is demonstrated by the fact that only 48 percent of service occupation workers had access to any kind of retirement benefit in March 2023, as compared to 73 percent of workers overall. Paid sick leave was available to only 38 percent of workers in the lowest tenth percentile of average wages; in the highest tenth percentile, 96 percent of workers had access to it. The distribution of benefits also penalizes those unable to find full-time work: 82 percent of full-time workers had access to a retirement plan, while only 44 percent of their part-time peers did. Because many Americans work part-time not by preference but because their employers do not provide sufficient hours, these workers are shut out of the opportunity for sufficient pay in the present and, at the same time, barred from the opportunity to receive a living from non-work in the future.


The origin of our fragmented situation can be traced to two historic moments. In the 1930s, the laws that created our early welfare state stopped short of establishing universal programs that would have provided certain benefits, like healthcare or old-age pensions, as public goods. Instead, the state bound benefits to worker status. The paltriness of the resulting programs even encouraged the growth of a private benefits system. Then, in the 1970s, changes in global production patterns led many firms to shift from in-house production to a global supply chain model. Coupled with changes to employment law, this led to the replacement of full-time, regular employment with part-time, temporary, and contracted employment. The change allowed employers to stop providing private social insurance; as political scientist Jacob Hacker has observed, the risks associated with maintaining a full-time workforce were consequently shifted onto individual workers. The business of cobbling together a living became a new form of unpaid labor.

This shift in the burden of risk can be measured in the reduction of money spent on worker benefits, in increases in employee cost-sharing, and in the proliferation of nonstandard employment in myriad forms. The model empowers employers to devise creative strategies for avoiding responsibility for keeping a human workforce alive and functional. “Just-in-time” scheduling burdens part-time workers, who often have little choice but to pick up last-minute shifts. On-call scheduling requires workers to hold time open without compensation if they are not needed for a specific shift. Some workers are called up only to be sent home—without pay.


Jobs commodify or decommodify the people who hold them in two areas: on the job, by insulating workers (or failing to) from the constant need to demonstrate productivity; and off the job, by providing (or withholding) the benefits that allow for the enjoyment of life outside of work.

On-the-clock decommodification offers workers discretion over the performance of their duties and the reassurance of being paid for the hours they work. Salaried workers experience a still greater level of decommodification than workers paid by the shift or at a piece rate: the former are paid regardless of what they produce, while the latter are paid only for each unit of productivity.

The second category of decommodification comprises benefits enjoyed “outside” work: access to voluntary non-work time in the present (time for recreation, rest, and creativity); access to emergency non-work time; the promise of non-work in the future (a retirement savings plan); and provision for the care of one’s body and mind (paid healthcare).

American social policy has so thoroughly outsourced responsibility for the necessities of life to the employment relationship that it has delegated the very strategy—decommodification—that welfare states are supposed to use to protect workers from the vicissitudes of markets. The provision of a means of living and a human experience on the job is left to the discretion of employers, who base their decisions on what they think their competitors are doing. In this landscape, the opportunity to be decommodified is a “perk” for the full-time, regularly employed, white-collar subset of the labor force.

How can we share a meaningful sense of national identity when our jobs confer rights on some and actively take them away from others? We need universal programs, not competition between employers, to ensure that all Americans are protected from being conflated with the commodities we are paid to produce.