In my post on “wellness,” I wrote about the concept’s origins in the countercultural currents of post-war California. Like the “maker” culture that caught on around the same time, and the Arts and Crafts movement even further back, “wellness” offered a broad social critique—in the case of US-based wellness, of the for-profit health care system—without being grounded in any larger social and economic movement. It was therefore easily co-opted by shady hucksters and insurance companies who could market its libertarian spirit.
News reports of the measles outbreaks in Arizona and California reveal some overlap between the claims of wellness entrepreneurs and the arguments (such as they are) against vaccination. Two quotes from this New York Times article on the anti-vax response to the measles outbreak reflect the same fuzzy combination of a skeptical individualism and a resolute fixation on the self. The Times’ reporters, Jack Healy and Michael Paulson, interviewed anti-vaccination parents in Lagunitas, CA, a town in the wealthy Marin County heartland of “wellness” ideology:
Kelly McMenimen, a Lagunitas parent, said she “meditated on it a lot” before deciding not to vaccinate her son Tobias, 8, against even “deadly or deforming diseases.” She said she did not want “so many toxins” entering the slender body of a bright-eyed boy who loves math and geography.
Tobias has endured chickenpox and whooping cough, though Ms. McMenimen said the latter seemed more like a common cold. She considered a tetanus shot after he cut himself on a wire fence but decided against it: “He has such a strong immune system.”
As Ciel Lorenzen, a massage therapist, picked up her children, Rio, 10, and Athena, 7, at Lagunitas Elementary, she defended her choice to not vaccinate either of them, even as health and school officials urged a different course.
“It’s good to explore alternatives rather than go with the panic of everyone around you,” she said. “Vaccines don’t feel right for me and my family.”
Like wellness entrepreneurs, anti-vaccination believers like the parents above treat health as a purely individual matter (it’s not right “for me and my family,” says Ciel Lorenzen). They are fixated on the self as a private fortress to be secured from “toxins” and other interference (unless the “toxin” in question is, say, polio, in which case: let’s roll the dice).
Relatedly, the language of “choice” in anti-vaccination discourse reflects an insidiously entrepreneurial approach to health at odds with the public-health argument for—and New Deal origins of—childhood vaccines: to give all children the freedom to live lives free of crippling diseases. California treats it that way, too, offering what it calls a “personal beliefs exemption” to parents who don’t want to participate in the public campaign against disease. Parents like Lorenzen seem to view the decision not to immunize as belonging to the same order as painting their house a loud color—something that might, at worst, offend the local homeowners’ association but is otherwise a private matter.
Finally, anti-vaccination inherits from wellness its mixture of skepticism (of health bureaucracies and credentialed experts, mostly) and credulousness (of self-proclaimed experts, mostly). I don’t know if one can call anti-vaxxers “anti-science,” since they are committed to “science” in theory and in their forms of argument. Nor would I dismiss the parents above as simply stupid (although they may well be stupid). Anti-vaccination reflects a deeper political problem—the individualization of our obligations to one another and the commercialization of what should be our right to live free from preventable disease.
“I cannot excuse you that profess to be my friend and yet are content to let me live in such ignorance, write to me every week, and yet never send me any of the new phrases of the town…Pray what is meant by wellness and unwellness?”
“Wellness” dates to the 17th century, but it is a staple of contemporary health discourse, from dietary-supplement entrepreneurs and human-resources departments to the high-end beauty and exercise industries. The word’s transformation is neatly captured in the two meanings the Oxford English Dictionary gives. “Wellness” has evolved from a simple contrastive quality—that is, as the opposite of illness—to a positive one, a state of “good physical, mental, and spiritual health, esp. as an actively pursued goal.”
The problem raised by this distinction, of course, is that “good physical, mental, and spiritual health” is much harder to quantify or even to ascertain without its opposite. An 1887 Unitarian newspaper used “wellness” in its original sense, “the absence of sickness,” in a way that anticipates the contemporary meaning—the editorial asked the reader to pray for the “spiritual paralytics” that are prostrated on “sofas of wellness.” The joke, here, is that comfort can be its own anxious, incurable affliction. Wellness, in its former meaning, was at least an achievable objective: you could be not sick. Wellness as an “actively pursued goal,” however, is not only elusive but easily and boundlessly monetizable.
From Unity: Freedom, Fellowship, and Character in Religion, September 24, 1887
Six decades later, Halbert Dunn, a physician at the U.S. National Office of Vital Statistics, gave the word its new sense in a series of lectures delivered at a Unitarian church in suburban Washington, D.C. In “High-Level Wellness for Man and Society,” published in the American Journal of Public Health in June 1959, Dunn argued that the medical profession’s dichotomous emphasis on the prevention of illness (rather than the promotion of wellness) had ignored “a fascinating and ever-changing panorama of life itself, inviting exploration of its every dimension.” “Wellness” is Dunn’s term for this panorama, a combination of spiritual, psychological, and physiological health.
With “wellness,” Dunn aims to restore to the self a sense of integrity lacking, he believed, in modern society in general, and medical bureaucracies in particular. Dunn accordingly understands wellness as an expression of two underlying forces within the self. Firstly, it is nourished by what he calls “the creative spirit,” an “expression of self” he understands in completely individual, and inward-looking, terms. “With creative expression,” he writes, “comes intense inner satisfaction.” Secondly, in a move that appears to anticipate the New Age or Orientalist trappings of later wellness doctrines grounded in broad claims about “eastern” religious practices, Dunn describes wellness as a harmonious blending of body and spirit. He attributes the medical profession’s inability to move beyond the sickness/unsickness duality to a failing of “Western culture,” our cleavage of the physiological and the spiritual.
As if we could divide the sum total of man thus! … In fact, the essence of the task ahead might well be to fashion a rational bridge between the biological nature of man and the spirit of man, the spirit being that intangible something that transcends physiology and psychology.
The task is especially urgent, Dunn explains, because in a world whose population is getting older, larger, more alienated, more neurotic, and more desperate for ever-dwindling resources, “it is probably a fallacy for us to assume, as so many of us have done, that an expansion in scientific knowledge can indefinitely counterbalance the rapidly dwindling natural resources of the globe.” Wellness, despite its current reputation for sunny optimism—thanks to its association with northern California, Oprah, yogurt, etc.—emerges out of a Cold War moment of pessimism about modernity. In the most ambitious part of the article, when Dunn proposes the development of wellness metrics, the concept begins to sound like it belongs in a Philip K. Dick dystopia.
Select groups of people who are disease-free and who are making full use of their talents, capacities, and potentialities; then measure them by biochemical, functional, and psychological tests to establish the characteristics of those enjoying a high level of wellness. Such groups would need to be selected so as to be representative of the various ages, sexes, and racial combinations.
Measuring wellness will be a challenge, Dunn concedes, given the abstract nature of the concept. “Since the nature of this goal is ever changing and ever expanding, we will probably never reach it in absolute terms,” writes Dunn, somewhat apologetically—but it is this aspect of wellness that has led to its enduring popularity.
“Peak wellness: the extreme opposite of death,” from Dunn’s “High-Level Wellness”
“Wellness” in its contemporary form was popularized by Dr. John Travis, a physician whose Wellness Resource Center, founded in 1975 in Marin County, CA, retained Dunn’s emphasis on personal autonomy and responsibility, his critique of conventional medical care, and his interest in spiritual or psychic well-being, but discarded his pessimism. “Wellness isn’t a term you hear every day,” said Dan Rather in a 1979 60 Minutes report often credited with mainstreaming Travis’ still-countercultural Center. Wellness is “self-care” and “an ongoing state of growth” both physical and spiritual. We don’t prescribe, says Dr. Travis: “our goal is to help the person discover why they are sick.”
As an “ongoing state of growth,” wellness is a receding horizon, which brings us back to the two OED definitions. While it’s possible to say, or at least to feel, that you are no longer ill, you can always be more “well” than you are. This intersects easily and perhaps insidiously with the emphasis on “self-care,” enhancing the stress wellness is supposed to address, as we become ever more anxious about our failure to become less anxious. Wellness discourse, especially once it migrated into the arsenal of “human resources” departments, emphasized personal responsibility for health. It was up to the employee—and sometimes required of her—to quit smoking, drinking, sleeping around, and eating Doritos.
Dr. John Travis, in the 60 Minutes report on the Wellness Resource Center
The rhetoric of personal autonomy in “wellness” culture recalls the origins of “DIY” and “maker” culture in northern California around the same time, and one can argue that it has suffered a similar fate. In Rather’s 60 Minutes report, he asked a group at the Wellness Resource Center to respond to criticism that wellness was “a middle-class cult” (this was the 70s, after all). One woman responded that if wellness was a cult, it was one in which “you’re the leader, you’re your own guru.”
Captured by corporations eager to appropriate this libertarian spirit, “wellness” became a niche market for middle-class consumers. The psychologist Lotte Marcus discussed its ascendant popularity in a 1991 Mother Jones article, “Therapy Junkies,” that attributed what she calls the “cult of wellness” to the Reagan era, which as we have seen is not exactly true, but makes political sense given the way in which wellness’ emphasis on “personal responsibility” dovetailed with the Reaganite use of this same moral vocabulary. Linking wellness to the then-popular issue of workplace “stress,” Marcus decries
telegenic wizards, live seers, and mail-order curanderos ready to ward off whatever phantom villains or microbes may be diagnosed as the root cause of a long-standing complaint, by prescribing a remedial regiment, say, of diet, ritual recantation, positivist thought, the use of do-it-yourself improvement kits, and the purging of a variety of ‘toxins’—social, familial, nutritional, and astral. In this way, sufferers…are often condemned to living perpetual reruns of their roles as victims. As a result, they’re sometimes driven to prolong their pain and anguish, beyond the point where it ought, humanly, to be borne.
The fact that you can never definitively achieve “wellness” makes it an industry open for theoretically limitless expansion into more high-end niches, as the Global Spas Summit’s 2010 report, “Spas and the Global Wellness Market: Synergies and Opportunities,” argued. Wellness is popular among affluent consumers who value “sustainability, authenticity, and local sources,” the GSS announced, and this core market
positions the spa industry as one of the most logical sectors to take advantage of (and help lead) the wellness movement. Wellness also provides an opportunity to reshape the image of spa, to regroup after the global recession, and to position spa as an investment or an essential element in maintaining a healthy lifestyle.
[sic: there are no articles in front of “spa.” It’s not “the spa,” “a spa,” or “spas.” There is only “spa.”]
Finally, most private health insurance now includes some kind of employer-sponsored “wellness” program, which combines cash bonuses, prizes, or insurance discounts with gamified nutrition and exercise programs (my own employer offers “wellness bucks” in return for meeting certain benchmarks. Quitting smoking will get you a travel mug). “A company’s most precious resource is its employees,” writes well-known health charlatan Mehmet Oz in Oprah Magazine about such programs. Despite the obvious fallacy in this sentence—an oil company’s most precious resource, for example, is obviously oil, since unlike employees, it is a non-renewable resource—Oz makes a valid point here. Employees’ health isn’t valuable because workers’ lives outside the workplace are important—the reason why labor movements struggled for occupational safety standards and health insurance in the first place. Rather, wellness matters because employees’ health and happiness will maximize their productivity at work.
Harvard Business Review confirmed Oz’s theory in a study that showed employee wellness programs were, in fact, worth the investment: “healthy employees cost you less” (“you,” here, are the managers to whom HBR is addressed) by minimizing insurance costs. Other benefits to the firm were less measurable though no less important, the report argued. And here, despite wellness’ long “journey,” as they say in the wellness biz, from its coinage in 1959, we come circling back in unlikely fashion to Dunn’s original theory that wellness’ most lasting benefits were the intangible ones of “spirit.” HBR pointed out that employee wellness programs don’t just make employees more inexpensive, but also more loyal, a quality hard to measure but impossible to overvalue: “investment in wellness can, when executed appropriately, create deep bonds,” a theory they illustrated with the following, unsettling anecdote:
When MD Anderson initiated its wellness program, president John Mendelsohn took walks throughout the building with wellness coach Bill Baun. For many, it was the first time the president had been in their work space or had shaken their hand, and he tended to start conversations with “How’s your wellness?”
In this sense, wellness programs might help the modern firm finally resolve what Dunn, wellness’ original prophet, described as the “western” schism of body and spirit: both now belong to your boss.
Steve Scalise, the high-ranking member of Congress exposed this week as a white supremacist, made an “error in judgment” in speaking to the European-American Unity and Rights Organization in 2002 (also known as EURO—what kind of amateur-hour nativists are these, anyway?). What’s more, it was “inappropriate.”
This is what is so laughable about describing Scalise’s apparent white-supremacist sympathies this way—he’s a politician, and attending a white-nationalist conference is an actual political decision, taken deliberately by a state congressman, not some impulsive act by a wayward youth.
What makes it stranger is that Boehner’s full statement of support for his comrade in the Congress also describes his attendance as “inappropriate,” which is the vocabulary usually reserved in political journalism and public relations for sexual indiscretions. “Errors” that are “inappropriate” are usually sexual in nature: the euphemism is often used by moralistic politicians and puritanical preachers who admit to extramarital affairs, sexual harassment, and so on. It is particularly useful for homophobic politicians, like Larry Craig, who are caught in sex acts with men. Craig, the right-wing Idaho U.S. Senator, initially responded to reports that he solicited sex in an airport men’s room by insisting that he “was not involved in any inappropriate conduct,” denying the conduct by daring not to speak its name. The term “inappropriate” is also adopted by victims and by journalists speaking and writing publicly about abuse and harassment, because its clinical, legal-ish detachment makes it sound either less painful or more “objective.”
The personal terms in which Boehner both criticized and defended Scalise—as a man who has made “inappropriate” “errors,” but is yet a man of “integrity” and “good character”—make clear that his offense is not just personal, but ephemeral. An enthusiasm for Nazis is less scandalous, in fact, than enthusiasm for sex in a bathroom. So, if we hope to hold our political leaders accountable for attending white-supremacist conventions, we had better hope they have sex with a prostitute while they’re there. Otherwise, who cares?
“Fail” is “commonly used as an interjection to point out a person’s mistake or shortcoming, often regardless of its magnitude,” according to Know Your Meme, an indispensable resource for such things. According to the Internet folk history documented there, “fail” as a noun is a formation from a verb usage in Blazing Star, a notorious, badly translated 1998 Japanese video game. When a player died, the game flashed a screen reading: “You fail it! Your skill is not enough – See you next time – Bye-bye.” Its transformation from this mistranslated verb usage into a mass noun (parenting fail, transparency fail, so much fail, etc.) apparently followed.
For most of its history, the verb “to fail” has most often been the opposite of “succeed,” to “be absent or wanting of something desirable,” as the Oxford English Dictionary puts it. Its illustrious history stretches to 15 definitions there. Some of the intransitive meanings, like to “fall ill,” are somewhat dated but still around; others, like “to be wanting or deficient in,” comically approximate the modern slang usage on the web, especially when we read the OED’s sample sentences. Plato, in an 1877 translation, writes: “The Dialogue fails in unity.” Unity fail.
Failing is resonant in more “serious” corners of the media, as well. There is the foreign-policy intellectual hobbyhorse of the “failed state,” the passive voice doing a lot of work here to describe the extreme immiseration of nations that show up on the top of such lists: Ethiopia, Congo, Chad, Afghanistan. More recent is the celebration of “failure” in entrepreneurship discourse, where it is closely related to “innovation.” Here, failure is a veritable fountain of obvious metaphors. Out of failure springs innovation. Failure is innovation’s foundation. Failure drives innovation. It’s also the mother of innovation. Simply celebrating “failure” in the business world proves the point—so many business-magazine articles on “failure” are clearly delighted with themselves just for reaching this boldly counterintuitive conclusion. Failure scored a particularly insipid cover story by NPR’s Adam Davidson in the New York Times Magazine, which breezed through the history of capitalism (it never says the word, of course) as a series of brilliant “innovations” and daring risks by bold heroes unafraid to tempt failure. Davidson decries the “proselytizing” associated with innovation. Indeed, innovation as he uses the term is not a deity one worships but more like a world-spirit of progress marching across the generations, giving us smartphones, the Constitution, and the eight-hour workday. Failure, or the ability to risk it, is its driving force—whether that “failure” means the loss of other people’s money, like with a smartphone, or your life, like those who fought for a shorter working day.
To be fair, some of these defenses of “failing” make logical sense, but in the same way as a daily affirmation you hang in your office: you can’t succeed if you are afraid to fail, etc. Davidson’s warning about proselytizing aside, the above treatments of “failure” are entirely in keeping with the moralism that underlies the cult of entrepreneurship and which pervades so many of our other austerity keywords. For “entrepreneurship” ideologues, failure fits into this moralistic framework, with its celebration of lonely sacrifice and self-reliance. One must be purified in the fires of bankruptcy before finding true success; only after you have wandered through the wilderness of failure can you develop the “resilience” to ascend the mountaintop of innovation. One such anecdote describes a Silicon Valley entrepreneur who failed before starting a new company called…E.piphany. Entrepreneurship failure stories, like conversion narratives, are always individualized like this, just as its success anecdotes lend themselves to hagiographic leadership cults.
Back to the Internet: there, “fail” is always used ironically, never with the reverence that characterizes so much entrepreneurship rhetoric. One of these ironies is fail’s opposite, “win.” Why not “success” or “victory”? The answer, I think, is that both “failing” and “winning” ironize the competitiveness and atomization that are built into both the culture of social media and the cult of entrepreneurship. I’ve always thought the mass noun “fail” was funny in part because it sounds (to me) like a misapplied computing term rather than a mistranslation—a computer error, server failure, etc. Extending this to the “real,” 1.0 world—the factory foreman slips on a banana peel, epic fail!—ironizes the grandiose, world-spanning Internet as a humble and intrinsically funny object. Other examples of Internet-irony: the phrase “You win one internet,” dispensed as praise for Facebook bons mots; The Internet for Men, a real, off-brand cologne sold on the streets of Chicago in the late 1990s; or those YouTube videos of Bryant Gumbel befuddled by “internet” on the Today Show in 1994.
The ungrammatical use of “win,” on the other hand, ironizes the social ideal of “success,” entrepreneurial or otherwise, treating this as a game, and therefore either 1) rigged or 2) trivial, since the things one “wins” online are mostly unremunerative and fleeting: Facebook likes, an argument with a stupid stranger, Twitter followers, etc. At the same time, the pursuit of the epic win has the same sense of ruthless competition and “disruptive” striving that, as we know, is the stuff of which true entrepreneurs are made.
“We are pleased to present the ninth annual Failed States Index” (fundforpeace.org)
If the opposite of “fail” is “win,” what is the opposite of “failed,” as in “failed state”? The question is never asked, of course, since the concept assumes as the normative standard of development the countries where the concept originates. (Obviously, to say that the United States and Britain have “won” development would be to admit that the whole business is a conflict, rather than a shared endeavor, and we mustn’t think that.) But failing has an obvious, common educational meaning. So if D.R. Congo is a “failed state,” maybe we should think of the United States of today as a Gentlemen’s C State: entitled and careless, coasting off the prestige of its parents.
Worse than being afraid to fail, as the entrepreneurship ideologists put it, is the inability to recognize if and how you have already failed. Self-awareness fail, all around.
On Detroit Future City and the limits of so-called “participatory planning.”
“Community engagement” and “civic engagement” are phrases that first appear in English sometime in the mid-1950s, according to Google’s ngram database. Before then, one might naively assume, there was no need for the thing, like the cliché about camels not showing up in the Koran, and so the concept became popular once everyone noticed it missing. Yet the basic problem—political atomization and fraying community ties—is not new or unique to our times. What particular meaning, then, does the word “engagement” have for us now?
The term “civic engagement” is usually attributed to Robert Putnam’s influential “Bowling Alone,” a 1995 essay expanded in 2000 into the book Bowling Alone: The Collapse and Revival of American Community, which argued that American civic life had deteriorated since roughly the 1950s, the dawn of “community engagement” as a term in print. Putnam’s basic argument was that Americans had become less likely to join community organizations, from the PTA to the Elks to neighborhood bowling leagues, and more likely to join passively when they do, by writing an annual check or signing a petition.
The argument, at least in these broad strokes, isn’t new. The sociologists Robert Staughton Lynd and Helen Merrell Lynd discussed the problems of “apathy,” “standardization,” and “isolation” in Muncie, IN in their famous 1929 book Middletown: A Study in Modern American Culture. For them, “apathy” referred to a lack of class consciousness and political participation, while “standardization” and “isolation” pointed to the regularization or deterioration of social forms of leisure. So back in your Granddaddy’s day, the neighbors always used to visit more.
Putnam and others in the sociology and management studies world use civic engagement roughly synonymously with “social capital,” a dreadful phrase that itself encapsulates a broader trend many of these keywords document: the quantification and commodification of both virtue and social life. One goes to church for salvation, fellowship, and “social capital,” and not necessarily in that order.
“Social capital,” says the World Bank, “refers to the institutions, relationships, and norms that shape the quality and quantity of a society’s social interactions.” Meanwhile, the Kennedy School of Government—I’m still waiting for my job interview—writes:
Social capital refers to the collective value of all “social networks” [who people know] and the inclinations that arise from these networks to do things for each other [“norms of reciprocity”]…The term social capital emphasizes not just warm and cuddly feelings [No, it doesn’t—ed.] but a wide variety of quite specific benefits that flow from the trust, reciprocity, information, and cooperation associated with social networks. Social capital creates value for the people who are connected and, at least sometimes, for bystanders as well.
It’s an example of the proliferation of metaphorical “capital” (which supposedly “produces” metaphorical “value”) in academic and non-profit jargon. (“Human capital” is especially odious for the way in which it unwittingly recapitulates the traffic in humans as capital.)
These metaphors may be an attempt, as the sociologist Claude Fischer suggests, to claim for the softer social sciences the prestige of economists. They also reflect the way in which formerly informal, individual, or creative forms of work, thought, and life—simply everyday neighborliness, informal cooperation, creativity itself—can be mined for whatever “value” they hold. So civic or community engagement is, in this sense, a measurable assessment of social interaction. The “engagement of the community,” as one often hears, refers to a community’s participation in its affairs. (I often use the word as a synonym for “participation” on my syllabi, with the idea that it emphasizes a collective responsibility for a class’ success).
“To engage” is also used to connote some action taken with the public, as in “to engage the community in x.”
As with “innovation,” the noun is often used abstractly, with no clear object of the engagement nor subject doing the engaging. Take, for example, the mission statement of the grant-making Knight Foundation, which funds arts and other projects in Detroit to promote “community success by supporting civic innovation, founded on robust civic engagement.”
The need for more “robust engagement” presumes, of course, that there isn’t enough of it, and perhaps that there used to be more, once. Why that might be is a longer story that depends on how you define the concept, but I think it’s no accident that “engagement” thrives in a moment of polarization and demobilization. The word and its usage simultaneously express a democratic faith in collective participation and a hierarchical faith in individualization, a contradictory combination that a more detailed reading of its use might clarify.
The Oxford English Dictionary gives a broad set of definitions for “engage” and “engagement,” but all of them involve some sense of a formal covenant, a contract, or a conflict: the marriage contract, some other legal agreement, or a military “engagement” between armies. (It is this last meaning, of course, that Capt. Picard is drawing upon.)
However, it’s hard to find one that suits the way many civic institutions routinely use the word, as participation or “buy-in.” The participatory idealism that Putnam’s “civic engagement” summons for us belongs to obsolete meanings: “the fact of being entangled,” for example, last common in the 16th century. Others point to the talents of charm—“to gain, win over, attach by pleasing qualities”—a kind of seduction. “If you engage his heart,” wrote the Earl of Chesterfield in 1751, “you have a fair chance for imposing upon his understanding.” “To engage” here has an explicitly social yet also manipulative meaning.
For a specific example, let’s look at Detroit Future City, the strategic framework for the city’s planners, released in 2012 after four years of research. The project drew on the technical expertise of urban planners from around the country and was shaped decisively, or so the report claims, by the “engagement” of average Detroiters. The plan’s basic premise is that given Detroit’s vast surplus of idle public land, property development and the attraction of entrepreneurial investment will be the material basis of the future city. From the beginning of the report, one can read an anxiety about anticipated criticism of the urban planning process by a citizenry with a deep historical memory of destructive, racist urban renewal initiatives of the past. Words like “collaborative,” “collective,” and “engagement” appear throughout the report. The first paragraph emphasizes that the project “has been a collective journey, inviting diverse input from technical experts within Detroit and around the world and, most importantly, the community experts and everyday citizens.”
For an example of what the “strategic vision” of the DFC seems to define itself against, see this excerpt from a film that has circulated widely online. Detroit: A City on the Move was produced in 1965, a clumsy predecessor of the more successful city-branding campaigns pioneered by New York City a decade later. In the section below, narrated by the doomed liberal mayor, Jerry Cavanagh, urban planners emerge as the stars of the show, manipulating buildings around a scale model of the city like chess players arranging their moves.
Note the allusion, first of all, to a “resurgence” and a “renaissance,” an acknowledgment of Detroit’s post-war economic slump and a reminder to local reporters and headline writers that skepticism of “rebirth” metaphors should really be one’s default position when writing about the city. “Planning with a Purpose” is the credo: efficiency and progress, not engagement and innovation, are the bywords here. So in the move from grandiose “renaissances” to humbler “revivals,” from linear “progress” to speculative “innovation,” from the birds-eye view of “efficiency” down to street-level “engagement,” we can see the retreat from the “master plan,” with its top-down initiatives and illusions of control. But a retreat to what?
Detroit Future City proposed a more inclusive planning process with this history in mind. My friend Joshua Akers, a geographer at U-M Dearborn, recounts in a forthcoming article how embattled and tightly controlled the “engagement” process was. After the first meetings drew large crowds angry about poor city services, later events were carefully stage-managed: emcees ran the floor, and audience participation was technologized in various ways. Attendees were given clickers that allowed them to “vote” on choices presented to them: did they think that education, for example, was their neighborhood’s greatest priority, or was it jobs or transportation? (There was no button for all of the above). There were in-person film booths where residents could record messages, and an interactive video game, called Detroit 24/7, which is regrettably no longer available to play online.
Regardless of the intentions of the project’s organizers—and the move away from authoritarian models of imposed reform seems laudable enough—what happened in practice is that community “engagement” was managed to suit expert models, rather than the expert models being shaped by popular participation. In the final report, community comments appear regularly in the form of pull-quotes, usually designed as a speech bubble from some silhouetted figure. See, for example, this comment on two of the report’s neighborhood typologies:
“Green Residential” and “Live+Make” are not Alexandra’s terms but the planners’. The mysterious Alexandra is simply saying whether they seem like a good idea or not.
Engagement as “participation” suggests a dynamic, two-way exchange, but Lord Chesterfield’s motivated seduction is clear in the DFC’s own explanation of their term: “Why engage?” the authors ask. You might wonder why a planning document of an ostensibly democratic polity even has to ask, but here’s what they say:
Civic engagement yields lasting benefits. This is true of any development endeavor or long-term initiative, including the Detroit Strategic Framework. Here’s why: first, civic engagement helps strengthen and expand the base of support for a given effort. More people become informed, activated and mobilized through engagement efforts. Opposition is less likely because concerns are addressed within the process. […]
Lastly, and perhaps most significantly for the Strategic Framework, civic engagement actually improves the substance or content of an initiative. An effort that has been supported by civic engagement will more accurately reflect the ideas of the people it affects… (327)
What is rather bluntly acknowledged here is that engagement is valuable because it blunts opposition and strengthens support—in other words, it’s political and performative. What matters is that it feels rewarding to an audience. While the report concludes by arguing that “engagement” improves the work, it’s hard not to read this second paragraph as a mere gesture, with the real intentions laid out in the first.
The point of engagement in this sense is not to involve the public in making decisions, but to make them feel involved in decisions that others will make. That this may be done with the best of intentions is important, of course, but ultimately beside the point. Like “stakeholder,” “engagement” thrives in a moment of political alienation and offers a vocabulary of collaboration in response. So if civic engagement is in decline, one thing that is not is the ritualistic performance of civic participation. The annual election-cycle ritual in American politics is a case in point. In one populist breath, we routinely condemn the corruption of politicians who, it is said, never listen to the average voter. And in the next, we harangue the average voter for failing to participate in a process we routinely describe as corrupted. So it’s not the “apathy” or “disengagement” of the public that we should lament or criticize—it’s the institutions that give them so many reasons to be disengaged in the first place.
Engagement and participation: defined here as informing the public about the project
In its editorial on Evo Morales’ re-election victory in Bolivia last week, the New York Times described Morales as one of a group of “new caudillos” threatening “democratic values,” united in their desire to “appoint allies to electoral and judicial bodies and to build patronage networks that turn out the vote,” “weaken institutions,” and assert “greater control over the press.” (Note the evasive comparative adjective in this last—greater than what?). Glenn Greenwald has already observed how “democracy” here is little more than a code word for U.S. power. “Meanwhile,” Greenwald concludes, quoting the editorial, “the very popular, democratically elected leader of Bolivia is a grave menace to democratic values – because he’s ‘dismal for Washington’s influence in the region.’”
Any hand-wringing about “democracy” in Latin America should of course remind readers of Latin America’s Cold War, when the most horrific mass cruelties were justified in its name. And the popularity of “caudillo” shows the durability of Cold War terminology. It’s always used in English-language media to signal to an audience the author’s historical seriousness and command of the subject (“Well, in Latin America, you see, they have a term, ‘caudillo’—you know, it’s pronounced COW-DEE-YO”). The Times’ editorial is a textbook example of this usage. At its best, the word is simply a pretentious misreading of Latin American history, and at its worst, an ethnic slur. Yet there is something strange about the persistence of the word “caudillo,” if only because it belongs to both the Spanish language and to history, and U.S. journalism is so often incurious about both.
The Spanish Fascist Francisco Franco, whose rule coincided with the term’s revival in English-language media in the 1930s, was “El Caudillo,” the caudillo to end all caudillos. A casual search of the Times’ archives shows how “caudillo” is always used in either a lapsarian or apocalyptic sense. Last week’s editorial was not the first to ward off, fingers crossed, the rise of the “new caudillo”; one reads always of the “last caudillo” or the “return” of the caudillo. Debates raged about who was the “worst caudillo” ever in the Dominican Republic. Michelle Bachelet of Chile, who is a woman and therefore not a caudillo, took power, naturally, “after the caudillo.”
Woody Allen as a bearded caudillo in Bananas!
The term “caudillo” is by now so saturated with Anglo-American stereotypes of the Latin American macho that it is useless as a meaningful term to describe anything except typical U.S. misconceptions of Latin America. Often translated as “strongman,” the word names a political form, caudillismo, whose origins lie in the tumultuous post-Independence period in South America, when regional political or paramilitary bosses asserted control over provincial territory amid a weakened central state. For this reason, continuing to call Evo Morales or Nicolás Maduro “caudillos” seems like the equivalent of calling Barack Obama a “robber baron” when he bypasses Congress. In other words, even if there is a grain of truth in the analogy, the popularity of “caudillo” shows how nearly irresistible the recourse to tropes of “backwardness” is when discussing Latin American politics. A U.S. president’s faults are of his own time; Latin Americans, on the other hand, are always battling back the primitive past that lives in their midst and in their heads.
The New York Times’ reference to Bolivia’s allegedly eroding “democratic values” is a clue to the term’s basically culturalist meaning. For all the editorial’s lawyerly talk of “institutions,” what we’re really talking about here is an authoritarian cultural predisposition, native to the soil south of the Rio Grande. Why else use the Spanish word to describe what is essentially party politics everywhere else? The term “caudillo” suggests that this is a political form peculiar to Spanish America, but by the Times’ own definition, it could apply to any head of a political machine anywhere. Is Michael Bloomberg, who revoked the city’s term limits to serve three terms, New York’s “last caudillo”?
The New York Times, October 9, 1932
Note that this doesn’t preclude Latin Americans and Latin Americanist scholars from using the term. For example, see the anti-Chávez Venezuelan political scientist Javier Corrales’ coinage of the term neocaudillismo (note, again, how caudillismo is always renewing itself). Corrales shows the difficulties of trying to forge a coherent political theory out of what is essentially an ethnic stereotype. In his estimation, neocaudillismo includes both newcomers and political outsiders (like Evo Morales in Bolivia and Hugo Chávez in Venezuela) and ex-presidents returning to office (Alan García in Peru and Carlos Menem in Argentina), so one wonders who it excludes. “Latin America is still the land of caudillos,” Corrales concludes, in a sentence that sounds like it was written a half-century ago. “These new caudillos may not promote coups, insurrections, or totalitarianism, but they still weaken parties, erode checks and balances, and scare adversaries.”
The popularity of “caudillo” signals the obsession, in foreign policy and development literature, with “political institutions” as the meaningful instrument and measure of progress. And progress, of course, has usually meant imitation of the United States, where “political institutions” are of course strong, fair, and non-partisan, except when they aren’t. And this is why, for me, there is this odd tone of familiarity with the caudillo in the U.S. press and academia—I can’t think of an equivalent “insider” term in foreign-policy coverage of other parts of the “third world.” Saudi Arabia and Cambodia, say, are so unfamiliar that they must be translated. But we know Latin America, the term seems to suggest, and we know how it could be so much more like the United States, if only it could vanquish the partisan caudillos who are always reappearing, have just reappeared, or have just been vanquished, only to reappear again—one more instance of what Martí called “the scorn of our formidable neighbor who does not know us.”
A New York Times op-ed written by Jayne Merkel, an architecture critic, argues that the New York City Housing Authority could address its vast backlog of unfinished repairs—caused by the long-term cuts in federal funding—by training residents to make their own repairs. She calls this “A DIY Fix for Public Housing.”
The argument rests on a couple of obvious major fallacies. First, as with so many of our keywords, it values individual derring-do and ignores structural forces, resulting in the apolitical assumption that closing the federal funding gap is impossible, and thus that “arguing over who will make nonexistent repairs is fruitless.” (One could borrow this logic to dismiss any political demand that seems, as most important ones do, unrealistic: “arguing about how women will exercise their nonexistent franchise is fruitless,” “arguing about taxation with nonexistent representation is fruitless,” and so on.) Second is its confidence that “almost anyone can replaster a wall.” (No.)
Reader Barbara A. Knecht of New York City already pointed out the idea’s other problems, in a letter to the editor so sensible one wonders how it slipped by the editors:
…[F]or the cost and time to develop, administer and insure a training program, the authority could employ and deploy the trainers to make repairs.
…Would the same recommendation hold for the residents of a Park Avenue rental building with a noncompliant landlord? Housing authority tenants pay rent and have a right to expect their landlords to keep up their end of the contract.
Knecht points out something both very old in the history of “do-it-yourself,” and something very new in its recent appropriation as a term of austerity individualism. Informal and inexpert by nature, straddling work and leisure, DIY has never been a strict necessity: you don’t just “do it yourself” because you have to, but also, and sometimes mostly, because you want to. This informality obviously makes it a poor solution for an affordable housing crisis.
Popular Mechanics, Jul. 1960.
What seems new in Merkel’s use of DIY is the migration of this individual ethic of “do-it-yourself” to the sphere of social policy. Besides improving the plaster and the work ethic of public housing residents, a DIY spirit will also relieve the state of its obligations to them. And so, like its close cousins “local,” “artisanal,” and the neologisms “hacker” and “maker,” DIY is a practice of middle-class consumption masquerading as a practice of citizenship. And like the cult of entrepreneurship, such uses of “DIY” reframe social disempowerment as individual achievement, delegating to citizens social costs without giving them any social power in return. It is a lamentable sign of our times that 1) someone can seriously propose that public housing residents, mostly people of color, should work without pay for their landlord and that 2) such a proposal pretends to be “progressive.”
As Steven Gelber has argued, the rise of “do-it-yourself” as amateur home repair dates to the middle of the 20th century. By 1950, the classified section of Popular Mechanics advertised an array of tools and tutorials to do-it-yourselfers. More Americans lived in owner-occupied homes than ever before—30 million by 1960, 10 times the number in 1890—and a majority worked for someone else. The growth of home ownership and the separation of home and work space created the conditions for doing it yourself as a middle-class, mostly male pursuit.
From Popular Mechanics, Mar. 1960: do-it-yourself leathercraft, welding, laminating, and…will-writing.
“When industrialization separated living and working spaces,” Gelber writes, “it also separated men and women into non-overlapping spheres of competence.” But the desire to do it yourself came not just from economic necessity, Gelber argues. It was a satisfying hobby for desk-bound workers and a respectable way for men to share the labor of the home while asserting a degree of autonomy and expertise within it. Even as the exclusively male claim on “do-it-yourself” culture has frayed, any Home Depot commercial or Tim Allen rerun will remind us of the anxious performance of masculinity that comes with doing it yourself.
It’s not clear (to me) when “DIY” began to appear regularly as an acronym, but many contemporary uses of the word draw on its association with the print style of self-published punk fanzines and the anti-professionalism of punk more generally. Historians of punk often credit the short-lived 1976-77 London zine Sniffin’ Glue with popularizing the DIY aesthetic—a graphic language built on Xeroxed pages and hand-written or cut-and-pasted type, and a writing style celebrating the close, collaborative networks of authors, bands, and artists.
As Sniffin’ Glue’s creator Mark P. insisted, however, the impulse towards self-producing streamlined industrial products—whether they are music magazines or manufactured goods—goes back further, to other forms of sports and music zines in Britain and to countercultural publications like the 1960s publication Whole Earth Catalog, subtitled “Access to Tools.”
In a recent essay in The New Yorker, Evgeny Morozov makes an insightful critique of the contemporary celebration of “makers” and “hackers,” which borrows rhetorically from the rebellious posture and community-mindedness of punk DIY. He traces it further back, to Whole Earth and to the turn-of-the-century Arts and Crafts movement. (To me, punk DIY, as a specifically media movement, seems different, since punk zines never pretended to be reforming the industrial labor system, and therefore had less of the apolitical hubris that for Morozov fatally compromises the Arts and Crafts and ’60s “maker” movements.) Arts and Crafts, as Jackson Lears has also written, responded to regimentation and inequality in modern industry by reviving old methods of craft production. By restoring to the worker the autonomy the factory had taken away, the movement would also provide consumers with the beauty they were missing. Yet without structural reforms of the economic system, critics pointed out, Arts and Crafts, which aimed to liberate workers, just became a niche market for middle-class consumers. Morozov levels the same charge at so-called “makers” today, who see “ingrained traits of technology where others might see a cascade of decisions made by businessmen and policymakers.”
“Workers of the world, disperse.”
Supplement to the Whole Earth Catalog, January 1971 (via moma.org)
Morozov quotes one of the maker movement’s apostles, Kevin Kelly, who writes in his book, Cool Tools: “The skills for this accelerated era lean toward the agile and decentralized.” This technophilic rhetoric of speed, nimbleness, and decentralization in the individual parallels the celebration in the corporate world of the same values for capital. As in Merkel’s DIY fix for public housing, which imagines the collective of public housing residents as an assemblage of atomized, vulnerable “yourselves,” the DIY celebration of autonomy can be easily colonized by a corporate zeal for individualism. (To make the link with government austerity even clearer, Merkel ends her column by saying that public-housing residents could “take pride in his or her home…and save the city millions.”)
If, as some have argued, the abuses of the “sharing economy” fall hardest on women and in female-identified professions, then it is no surprise that “DIY,” once the male preserve of Popular Mechanics and This Old House reruns, now markets itself mostly to women. And of course, we should applaud fewer Tim Allens, fewer macho tool commercials, fewer uses of the phrase “man cave.” Yet the shift in the gendering of DIY also confirms Gelber’s argument that “doing-it-yourself” was a form of productive leisure that also reproduces gender roles in the home. Search Twitter for “DIY” and you will find women’s magazines offering plans for DIY jewelry, Martha Stewart’s DIY pumpkin spice latte, even something called a “DIY chicken coop chandelier.” Much of this usage, which seems to want the anti-establishment posture of Whole Earth or Kill Rock Stars, drains the phrase of the particular meanings it once had (there’s no solidarity in a pumpkin spice latte) or even any meaning at all (didn’t “DIY dinner” used to be called “cooking”?).
And then there is this: “Drone it yourself,” a military-style drone you can assemble and launch all by yourself.
On the other hand, as an Assistant Professor of English, I know only too well the dangers of failing to innovate. For example, I am often forced to talk to human students sitting in bounded classrooms wired for multimedia applications I am unable or simply unwilling to use. Paper books are an obsolete technology barely worthy of the word, and poetry, despite its promising shortness, takes far too long to understand. These hardships have granted me an acute understanding of the innovation deficit your department so bravely seeks to overcome.
In spite of English Literature’s disciplinary hostility to “innovation,” change agency, and both entre- and intra-preneurship, my training as a literature scholar would offer immediate benefits to your department’s offerings in Social Innovation. For example, I would be pleased to proofread your job advertisements, in order to innovate their presently sub-optimal levels of intelligibility.
The professorship is open to both distinguished practitioners, especially those with a deep understanding of social entrepreneurship, and to tenure-level scholars in fields related to social innovation, including social entrepreneurs, social intrapreneurs and, more broadly, social change makers.
“Social entrepreneurs” are not a field, as the sentence’s syntax suggests, and that final clause could be made nimbler by using the adjective “social” only once, as here: “social entrepreneurs, intrapreneurs, and change makers.” In addition, it’s not clear that “change makers” constitutes a broader category than “entrepreneurs,” yet neither is it obviously more specific. Given my exposure to creative industries like literature, I would be excited to invent more terminology to make this list of synonyms for “businessman” even longer.
But innovating new ways of saying “entrepreneur” isn’t the only thought-leadership I would exercise within the field of Innovation Studies. As thinkfluencers have argued persuasively, disruption must occur not only within fields and businesses but within institutions and organizations. My first intrapreneurial initiative, therefore, would be to fatally disrupt your (hopefully soon to be our) department. Moving our courses entirely online and replacing all department faculty other than myself with low-wage adjuncts armed with xeroxes of Joseph Schumpeter quotations would improve efficiency, reach even more students, and ultimately make a bigger difference.
To paraphrase a great disruptor: We must destroy the Professorship of Social Innovation in order to save it. I am available for immediate Skype interviews.
Earlier this month, the University of Illinois at Urbana-Champaign took the unprecedented step of rescinding a job offer to the Palestinian American scholar Steven Salaita, who was set to begin classes there this week. It was a unilateral move by the upper administration, apparently taken in response to a series of tweets in which Salaita condemned the Israeli bombardment of Gaza. Others have already written on the case and its implications for academic freedom—see especially Corey Robin’s blog and this op-ed by many Illinois faculty, for example. (Also check out @FakeCaryNelson on Twitter, for all the latest from a fictional version of the former advocate of academic freedom.)
In the spirit of this blog, I want to focus on the two official statements on the case from Illinois’ Chancellor, Phyllis Wise, and its Board of Trustees. As efforts at damage control, they are on the one hand singular in their ineloquence and ineptitude. Yet on the other hand they are familiar in their abuse of notions like “civility,” “debate,” and “discourse”—especially when the latter are “robust,” a keyword forthcoming on this blog.
As others have already observed, the letters from the Chancellor and the Board make a mockery of important scholarly concepts like academic freedom, constitutionality, and English syntax. In a key section of her letter, published as a blog post on her office’s website, Chancellor Wise reaches a cannot-and-will-not crescendo that is meant to signal to you that this is a Robust Leader speaking. It ends with an illogical mess that signals to me that this is instead a rather desperate manager (without a copy editor) grasping at rhetorical straws:
What we cannot and will not tolerate at the University of Illinois are personal and disrespectful words or actions that demean and abuse either viewpoints themselves or those who express them.
Viewpoints, of course, can’t be demeaned—nor is there any attempt to explain what constitutes “personal,” “disrespectful,” demeaning, or abusive words, much less the combination of all four, much less still the relationship between viewpoints and those who express them.
Among these other sins, though, Wise’s short letter is also rather redundant: it uses “diverse” or “diversity” four times, “discourse” three times, and “civil” or “civility” three times. To quote her again at length:
Some of our faculty are critical of Israel, while others are strong supporters. These debates make us stronger as an institution and force advocates of all viewpoints to confront the arguments and perspectives offered by others. We are a university built on precisely this type of dialogue, discourse and debate.
Note the redundant use of “dialogue, discourse and debate” here, in which all three are treated as identical concepts, their differences elided in the banal, alliterative evocation of intellectual life as imagined by bureaucrats—a sing-songy pantomime of actual thinking.
The follow-up letter from the Board of Trustees doubles down on Wise’s careless invocation of “civility” as the highest virtue of intellectual life. They use it as part of a grander claim about the university’s social and political mission:
Our campuses must be safe harbors where students and faculty from all backgrounds and cultures feel valued, respected and comfortable expressing their views…The University of Illinois must shape men and women who will contribute as citizens in a diverse and multicultural democracy. To succeed in this mission, we must constantly reinforce our expectation of a university community that values civility as much as scholarship.
Disrespectful and demeaning speech that promotes malice is not an acceptable form of civil argument if we wish to ensure that students, faculty and staff are comfortable in a place of scholarship and education. If we educate a generation of students to believe otherwise, we will have jeopardized the very system that so many have made such great sacrifices to defend.
(Please note, just as an aside, the allusion to American military casualties, and the consequent suggestion that the war dead gave all for the Illinois Board of Trustees.)
The Board’s combination of scholarly “civility” and democratic citizenship brings together two threads in the use of this vague, popular term. Besides the above, think of the “Civility Caucus” in Congress, or the regular lamentations in the press at election time that inter-party squabbling is too “coarse” and hostile. In all these cases, the celebration of “civility” conflates the tone of disagreement with disagreement itself, and ultimately suppresses both. As I wrote in a longer essay on the subject in Guernica:
The desire for civil discourse in mainstream politics conceals a deeper desire for a politics of consensus, with no major points of either ideological or practical disagreement. In this view, politics becomes simply a process of managing government bureaucracy; fundamental social conflicts do not exist, only rhetorical ones do.
The other trouble with “civility” is that it is unclear what it means, or if it means anything. In the Salaita case, if his offense is anti-Semitism—a demonstrably untrue charge—then it should be enough for Wise to denounce him for that alone. Instead, as Brian Leiter writes in a piece on the Salaita affair, “incivility” seems here to mean simply bad manners—something nobody should want university administrators adjudicating, nor people losing their livelihoods over.
Of course, these notions of civility (and again, Wise’s related four D’s—debate, discourse, diversity, and dialogue) as the glue holding campuses together are always summoned by administrators as rhetorical weapons against particularly troublesome campus dissenters. So on the simplest level, “civility” is merely an invention to discredit your opponent’s point of view as irrational. Given the word’s etymological links with “civilize” and “civilization,” this is a mode of attack with which Palestinians like Salaita are likely quite familiar.
A photo of Cary Nelson (at left), uncivilly blocking traffic at the NYU library in 2005, during the graduate assistant strike (via Mondoweiss)
As a graduate student at NYU during the 2005-06 strike by the graduate employee union, I heard a lot of civility talk from university administrators who were hostile to graduate assistant unionization but unwilling to say honestly why. NYU loved to intimate that our parent union, the UAW, would try to rewrite syllabi, that unionization would forever sully ties between faculty and students, that it was hostile to undergraduates.
As with so many keywords beloved by university administrators— “innovation,” “entrepreneurship,” and so on—there is an opportunistic element of the sacred, or at least the sacrosanct, in these treatments of the university. Once administrators feel threatened, campuses become halls of peaceful contemplation, “safe harbors,” as the Illinois Board of Trustees puts it, from the tumult of the world outside.
For academic workers, via Corey Robin: If you want to join a specific pledge from a discipline or wish to sign the general statement, here are the critical links:
As the militarized police occupation of Ferguson, MO, drew comparisons between the midwestern suburb and a “foreign authoritarian country,” the town’s police chief affected a different sort of vocabulary in one of his press conferences. [Put aside, for a moment, the deep naiveté of a writer, like this one for Vox.com, so stymied by violent repression in the United States, God’s country and freest land on earth, that he must invoke “Middle East dictatorships” as the only available comparison for the images on his TV screen.] The Ferguson PD released the name of the uniformed killer of young Mike Brown, the Boston Globe reported, after consultation with “stakeholders”:
Obviously the decision was taken at the highest levels of the local police brass; likely Missouri’s governor and the Department of Justice had a role in it as well. Nothing this police department has done yet smacks of consultation or transparency, so the trained recourse to the discourse of “stakeholders” is laughable here. Stakeholder, as I argued in an earlier post, is an austerity keyword that started in business schools and has migrated into the world of municipal government, non-profits, and organizations of all types. The word has financial origins, but it aims to reassure audiences that what they are witnessing is an egalitarian partnership, not a hierarchical enterprise, at work. As I wrote then:
Like other phrases derived from gambling and finance that have migrated into democratic politics—the appropriately gruesome phrase “skin in the game” comes to mind—stakeholder conflates access with rights, obscuring hierarchies of power under the veneer of cooperation.
A determined group of citizens in Ferguson, however, seems undeceived by the laughably thin veneer of cooperation on display there.