This piece is adapted and condensed from Brent Cebul, “Liberated as Hell: The Autonomous Worker and the Hollowed Workplace,” The Hedgehog Review, Vol. 18, No. 1 (Spring 2016)
By the early 1990s, Jay Chiat had reached the pinnacle of the advertising world thanks to his firm’s celebrated campaigns, most famously Apple’s “1984” Super Bowl spot and the Energizer Bunny ads. Flush with cash, Chiat commissioned the architect Frank Gehry to design “the office of the future.” Gehry produced the iconic (if mystifying) “Binoculars Building” in Venice, California. The structure’s whimsical exterior, however, clashed with the staid, cubicled ways of work going on inside. And so Chiat soon embarked on a mission to reorganize the ways employees worked at the Chiat/Day offices. His “virtual office,” a phrase Chiat popularized, would be as radical as the building’s shell. The quirky interiors would stimulate creativity and foster equality through open, non-hierarchical communal spaces sure to inspire imagination, collaboration, and flexibility — even playfulness.
The firm chose associate media director Monika Miller to test-drive the office, which meant she was freed of her desk, chair, and personal office space. Each morning, she signed out a computer and phone from her tech “concierge.” She bought a Radio Flyer wagon to tote her supplies and files. After scores of her coworkers were liberated, the office became a blur of motion, with staff continually visiting their cars where they stashed papers and pens. For the Chiat/Day ad agency’s “creatives” — pioneers in the now-ubiquitous world of the casualized workplace — liberation, Miller recalled, was “like a bad dream.”
Flexible workspaces quickly caught on because managers recognized a way to increase profits by reducing labor costs. Revoking personal office space meant less wasted space when workers traveled or were out sick. Employers like Booz Allen Hamilton, American Express, and IBM saw they could stuff more staff into smaller spaces. Even better from the employer’s point of view, flexible, reduced spaces enabled increased sub-contracting. Today, even the U.S. Government has become a “hot-desker.” The General Services Administration bases 3,400 workers in its flexible headquarters — which previously accommodated just 2,200.
It’s easy to understand these trends toward casualized workplaces as simply the result of capital’s inexorable thirst for greater profit margins. After all, the casualized workplace has emerged alongside the rise of the precarious economy, in which workers carry out “gigs” from home, coffee shops, or their cars. And as Americans’ workplaces have been hollowed out, their work-based system of security — severance, retirement income, health care — has been eroded one gig at a time.
But Jay Chiat’s emphasis on creativity and flexibility — terms critics deride as the cultural tip of a profoundly destabilizing neoliberal spear — drew upon deeper cultural logics. Since the dawn of industrialization, if not before, Americans have had a markedly ambivalent relationship with the ways of modern work. As scholars, reformers, theorists, and policymakers envision new forms of social security and solidarity, they would do well to consider how the workplace operates as a cultural space within which workers make sense of their world and negotiate personal and interpersonal identities.
The search for personal creativity or flexibility in the workplace is hardly a recent phenomenon. Worry that the bureaucratic, routinized work of the white-collar office was crushing individuality fueled a bloom of research into creativity in the 1940s and 1950s. Intent on fostering individualism, management theorists like Peter Drucker and Douglas McGregor drew upon Abraham Maslow’s psychological research and urged managers to find ways to meet workers’ “self-actualization needs.” As the Carnegie Corporation’s John Gardner put it, by 1963 creativity had become “an incantation. People think of it as a kind of wonder drug . . . and everybody wants a prescription.”
College students in the 1960s also evinced significant anxieties about the worlds of work into which they would be plunged. In a speech primarily remembered for its attack on the Vietnam War and the military industrial complex, the student-leader Paul Potter asked thousands of young people on the Washington Mall in 1965, “What kind of system is it . . . that creates faceless and terrible bureaucracies and makes those the place where people spend their lives and do their work?” In seeking to “Name the System,” Potter recognized it was students’ destiny to inhabit it — and to reform it or perish.
These students were not alone. In the 1960s, many Americans felt anxiety about the increasingly routinized and hierarchical workplace, whether in the factory or the skyscraper. A resurgent labor movement organized in the early 1970s, but many workers expressed their grievances in psychological rather than merely material terms. Their workplaces had been drained of intrinsic benefits and personal satisfaction. A much-discussed 1960 article described the forms of “psychological survival” that kept manufacturing workers on the line, as one attested, from “going nuts.” By the early 1970s, an autoworker lamented that “there’s no time for dreams and no energy for making them come true — and I’m not so sure anymore that it’s ever going to get better.” As social critic Daniel Bell put it in 1973, “however much an improvement there may have been in wage rates, pension conditions, supervision, and the like, the conditions of work themselves — the control of pacing, the assignments, the design and layout of work — are still outside the control of the worker.” A group of female accounting secretaries formed their own “local,” 9to5, in hopes of winning greater responsibilities and unraveling office hierarchies. “I think I’d like to see more flexibility,” one worker reported. “I’d do away with job levels and just make everybody more equal.”
Wrenching shifts in the global economy furthered the moral and cultural transformation of work in the 1970s. In response, Americans embraced the therapeutic metaphors that had begun creeping into the zeitgeist as early as the 1950s. Psychologically infused self-help literature boomed in the 1970s: one tome, I’m OK, You’re OK, was a New York Times bestseller for nearly two years.
Against such alienating ways of work, Americans sought new paths to autonomy and authenticity. As the philosopher Charles Taylor saw it, “notions like self-fulfillment” and “the ideal of authenticity” started commanding a kind of “moral force” akin to earlier cultural imperatives like the Protestant ethic. As Taylor described it, “Being true to myself means being true to my own originality… In articulating it, I am also defining myself.” A rising generation of baby boomer “knowledge workers” embraced this morally infused individualism and developed a “multifaceted form of class-blindness,” as historian Lily Geismer aptly puts it.
This quest for personal originality and authenticity soon infused the cultures of Silicon Valley’s creative destruction and high finance’s corporate raiding. In the 1980s and 1990s, these worlds of work promised white-collar tech or finance workers many of the intrinsic psychological benefits an earlier generation had sought in creativity. Typifying this emerging culture of work and self, management guru Tom Peters spread a gospel of smashing bureaucracy, not adjusting workers to it. As he crowed in Liberation Management (1992), one innovator “demolished the corporate superstructure” “in about 100 days”; another was “decimating the central staff ranks.” Workers of the future, Peters predicted, would be “liberated as hell.” Peters fused notions of work, authenticity, and selfhood. Hierarchical structures and fixed spaces appeared to be wholly incompatible with the expression of the authentic self in the workplace.
However privileged they might be, many white-collar workers continue to evince ambivalence about modern work, and to struggle with its connection to selfhood. For instance, Google — the current occupant of Gehry’s Binoculars Building — suffers from strikingly short employee tenure, despite its almost fetishized status as a potential employer. This churn is an indication, perhaps, that once workers confront the firm’s demands on their time and energies, they burn out. Such workers swing from one extreme — close identification with a particular firm — to another. It’s a privileged form of precarity, to be sure, one in which job-hopping feels like independence and entrepreneurialism, a personally “authentic” solution to work that threatens to quash one’s vitality and originality.
A recent poll of freelancers found nearly nine in ten would decline a traditional full-time job if offered one. A Florida man values driving for Uber because it enables him to control his day (“I need to be able to drop the kids off at school and…take [them] to appointments”) and to focus on his passion: getting his “photography business up in full swing.” He is, perhaps, unaware of the precarious web of contingency and insecurity in which he is caught. Even so, we must acknowledge that many Americans from all walks of life — from the working class to hopscotching tech workers — find great intrinsic value in working for themselves. We must acknowledge it because this yearning for entrepreneurialism, authenticity, and independence is exactly what “platform” or gig entrepreneurs exploit.
Remediating our precarious era of work and security will require a full accounting: of structural economic changes, of technology, of politics. But we must also attend to culture — to what work means, especially in relation to individuals’ sense of self and their social commitments. Such a cultural accounting might ultimately offer a way to reconstitute a more inclusive system of rights and security than that which is under siege today. A new system might center values of authenticity and individualism by empowering Americans to pursue work not simply for material reasons, but also — and especially — for work’s intrinsic values.