
Among the Nonbelievers

Posted on: December 15th, 2020 by Rebecca Tuhus-Dubrow

Last May, the writer Tara Isabella Burton published a piece in the New York Times Sunday Review about a nascent faith community. A growing number of spiritually hungry young Americans, she wrote, were “finding solace in a decidedly anti-modern vision of faith”—attending Latin Mass, belting out Gregorian chants, even wearing veils to church. Some of them, including the author, had begun to call themselves “Weird Christians.”

Burton described the alluring aspects of traditional liturgy, from its intrinsic beauty to its historical role in providing succor throughout the ages. She wrote of the power of religious aesthetics—of grand ecclesiastical architecture, pungent incense, and haunting melodies—and reported that for some young Christians, their faith was wedded to progressive political commitments. Christianity, they insisted, could be “a bulwark against the worst of modernity,” offering an alternative to the dehumanizing gig work and ruthless Tinder swipes of 21st-century capitalism.

The article offered a seductive glimmer of transcendence in an era that seems distinctly hostile to it. One might assume that Burton’s new book, Strange Rites: New Religions for a Godless World, would expand on this intriguing subculture. In addition to a tour of Weird Christianity, a reader might expect excursions into Weird Judaism, Weird Islam, Weird Hinduism, and other idiosyncratic expressions of traditional religion. But it turns out that Strange Rites is not a study of how people are adapting long-standing faith traditions to the current moment. Instead it mostly explores something like the opposite—that is, how people are infusing new cultural forms with quasi-religious meaning. To tell this story, Burton covers, among other familiar milieus, fan fiction, the wellness industry, polyamory, social activism, and techno-utopianism. In short, she has written a book that takes contemporary American culture and views its most prominent movements through a religious lens. In doing so, she offers an insightful but limited overview of how these movements are providing four pillars (meaning, purpose, community, and ritual) that Americans were at one time more likely to seek at their local house of worship.

*

Burton starts her book by establishing the ostensible faithlessness of the contemporary United States. About a quarter of American adults, she notes, say they are religiously unaffiliated, and for those born after 1990, that number climbs to almost 40 percent. “In fact,” she writes, “the religious Nones, as they are often known, are the single biggest religious demographic in America, as well as the fastest-growing one.” Yet she argues that if we look deeper, we will see a more nuanced picture of these supposedly godless Americans. Almost three-quarters of the Nones profess belief in some sort of higher power, and many report engaging in practices that might be broadly considered religious. She dubs this group the “Faithful Nones.”

There are also those who consider themselves spiritual but not religious (an identity selected by 27 percent of Americans); many of them pray or meditate and might even identify with a particular faith tradition but do not regularly participate in organized religion. Finally, the “religious hybrids,” as Burton calls them, formulate their own syncretism. A representative member of this subset might, for example, occasionally attend services at a Presbyterian church, light Shabbat candles, and download the Headspace app for mindfulness meditation. Burton collects all of these modes of 21st-century faith and spirituality under the umbrella of the “Religiously Remixed.” For her, what the Remixed have in common is a resistance to institutions and the rules and demands that come with them, paired with an enduring attraction to the idea of divinity or the prospect of transcendence.

Burton finds the roots of this approach to religion—which she calls “intuitionalist,” as opposed to “institutionalist”—deep in American history. In the 1800s, Transcendentalism extolled the primacy of the self against the strictures of a society that was, in Ralph Waldo Emerson’s words, a “conspiracy against the manhood of every one of its members.” Less-remembered movements such as New Thought (with its focus on positive thinking) and Spiritualism (centered on communication with the dead) likewise elevated personal experience above the authority and infrastructure of organized faiths.

At the same time, for all the historical continuity, Burton identifies two new factors that reinforce the intuitionalist bent: the rise of consumer capitalism and the advent of the Internet. Together, these juggernauts have instilled in Americans, especially younger generations, the expectation that we should be able to meet our need for the sacred in the same ultrapersonalized, digitized way that we seek romance or find tonight’s entertainment on Netflix. “Why force our beliefs into a narrow category of organized religion, with its doctrines and creeds, when we can cobble together a metaphysical system that demands of us no moral, ethical, spiritual, or aesthetic compromises?” she asks. “Why not combine meditation with sage cleansing with the odd Christmas service and its aesthetically pleasing carols?”

*

Having sketched the contours of the Remixed and the context from which they emerged, Burton goes on to explore some of the specific expressions of their improvisatory new “religions”—the ways in which they instill meaning and purpose into their communities and practices.

A few of the arenas Burton surveys have unmistakable spiritual overtones. Witchcraft, she notes, has become remarkably popular. She cites 2014 data from the Pew Research Center showing that 1 million Americans identify their primary religious affiliation as New Age, Neo-Pagan, or Wiccan. (Witchcraft falls into her purview because, as a “diffuse and decentered practice and spiritual system,” it “lends itself easily to contemporary intuitional eclecticism.”) Then there’s the wellness industry—a $4.2 trillion market, according to Burton. Companies hawking wellness often vaguely invoke spirituality and foster a cultish vibe. SoulCycle instructors shout out, “You were created by a purpose, for a purpose”; Goop sells a $185 Nepalese singing bowl and $40 tarot cards. Wellness is, if not a full-fledged religion, at least explicitly religion-adjacent.

But Burton also considers several other worldviews that would appear to be entirely secular. One is what she calls “social justice culture,” which she says adapts “the personal and individualistic tenets of New Thought (a repressive society warps our sense of cognition and ability to be our truest self) and gives it a firmly political cast: the Goliaths of society that must be struck down are racism, sexism, and other forms of bigotry and injustice.” Another belief system is the techno-utopianism of Silicon Valley, which she writes is devoted to the self-optimization enabled by technological and biological advances; for techno-utopians, transcendence means breaking free from the limits of the human body and supplementing the meager abilities of the human mind. A third is “reactionary atavism,” the domain of alt-right trolls and Alex Jones acolytes, who yearn to reinstate the old hierarchies destabilized by feminism and other forces of modernity.

Burton’s argument is that while the people who join these subcultures clearly differ from one another and are not consciously seeking substitutes for religion, their involvement is driven by some of the same motives that have always drawn people to religion: They want community, a sense of purpose, and a coherent narrative to make sense of the world. And although none of these subcultures necessarily meets all of the criteria for a religion, “they no longer have to: today’s mix-and-match culture means that the Remixed can get their sense of community from one place (an intense fandom, say) and their sense of meaning from another (social justice activism, or techno-utopianism).” And for the ritual component, they might turn somewhere else altogether, like yoga classes or a witchy subscription box.

Burton is a sharp thinker and a lucid writer. At her best, her aperçus are spot-on, as in her gloss on the “theology” of wellness: “We are born good, but we are tricked, by Big Pharma, by processed food, by civilization itself, into living something that falls short of our best life.” She adds, with perhaps just a touch of exaggeration, “The implicit mantra of wellness is equal parts Ayn Rand and John Calvin: you’re not just allowed but in fact obligated to focus on yourself—but, no matter how much you do, it will never be good enough.”

At other times, the religious framework is more of a stretch. She writes that the adherents of both techno-utopianism and social justice culture “seek not salvation out there, but a purification down here, a kingdom of heaven that can be realized fully on this earth, rather than in a world to come…. To transcend biology or to transcend deep-rooted prejudices is to achieve a kind of earthly divinity.” While these parallels are provocative, they don’t entirely hold up under scrutiny.

Burton’s take on the world of progressive activism is sympathetic though not uncritical. She is not wrong that many who are committed to social justice find community and a sense of purpose in activism and shared values—and, in some cases, exhibit a zeal equal to that of a devout believer. But much of today’s social activism grows out of the imperative to change unacceptable conditions and address collective problems, not from a desire to fill a spiritual absence or achieve “earthly divinity,” and her portrayal at times elides that distinction.

In her discussion of techno-utopianism, Burton draws several illuminating connections between that ideology and common religious precepts. The belief that individuals can be liberated from biological constraints, including death, has clear analogues in the religious belief in a soul and the afterlife, and the expectation of a coming Singularity, in which humans and technology fully merge, resembles eschatological visions. Yet here, too, the religious analogies occasionally seem forced, as when Burton argues that techno-utopians see the body as the source of “moral evil”—“those mortal meat sacks and shifty synapses that keep us from achieving our full and fully rational potential.” It’s a clever line, but the religious concept of evil in this context feels misplaced.

Burton’s attitude toward the Remixed landscape can be a bit hard to gauge. She sometimes appears to celebrate its freedoms and ingenuity, but at other times she lets loose quips that reveal a more disdainful attitude. (“Meaning, purpose, community, and ritual can all—separately or together—be purchased on Amazon Prime.”) She presents herself mostly as a neutral guide, although she does, unsurprisingly, express disapproval of reactionary atavism, which she calls the “most dangerous” of these movements.

She acknowledges that there is good reason for the decline of trust in institutions. After all, although she doesn’t delve into their history, old-school religions have been responsible for untold repression, violence, and misery over the centuries; many would see their withering as a sign of progress. But she also emphasizes what would be lost without new spiritual institutions. She is critical of the valorization of the self that she finds in much of Remixed culture. This is particularly true in the realms of wellness and self-care, which are interwoven with most of the other subcultures—even Jones sells his own version of wellness with products like Super Male Vitality supplements and Wake Up America Immune Support Blend 100% Organic Coffee. Burton cites blog posts and Instagram hashtags that encourage self-care even when it means ditching friends and shirking responsibilities. She worries that in some cases a spiritual patina could “lend legitimacy to behavior that might, in other moral systems, be considered merely selfish.”

*

In a way, while not nearly as ambitious, Strange Rites can be seen as an effort to update William James’s canonical 1902 book The Varieties of Religious Experience. He also distinguished between institutional and what he called “personal” religion and focused on the latter. While acknowledging that religion was nearly impossible to define, he offered this definition for the purpose of the book: “the feelings, acts, and experiences of individual men in their solitude, so far as they apprehend themselves to stand in relation to whatever they may consider the divine.” (He granted that the “divine” might not include the supernatural, citing Buddhism as an example of a nontheistic religion.) His account drew chiefly on the testimony of these “individual men”—and a few women—about their interior spiritual states.

James took pains to put into words what he termed “religious feeling.” “There must be something solemn, serious, and tender about any attitude which we denominate religious. If glad, it must not grin or snicker; if sad, it must not scream or curse,” he wrote. Although it was “personal,” it was not about asserting or gratifying the self; on the contrary, in these experiences, the sense of self tended to dissolve. “The essence of religious experiences, the thing by which we finally must judge them,” he added, “must be that element or quality in them which we can meet nowhere else.”

Surprisingly, even though Burton’s subject is also a more personal style of religion, religious feeling is all but entirely absent from her account. Her four pillars are admittedly invaluable, and perhaps their confluence has the potential to produce religious feeling. But without that feeling, those other ingredients may not add up to anything greater than the sum of their parts.

One reason for the omission may be that Burton does not seem to have spoken extensively with her subjects. Only a handful of quotes are attributed to interviews, and most of her observations seem based on evidence gleaned from books and surveys, the media and social media. Her analysis centers on broad cultural movements rather than the individuals who make them up. As a result, Strange Rites does not examine the interior experiences of the Remixed in any depth—and interior experience is, of course, central to any authentic engagement with religion.

That said, there is another possible reason for the omission: The “solemn, serious, and tender” mood that James described may be increasingly elusive in today’s world. Burton repeatedly stresses the centrality of the Internet to the Remixed scene. Much as the rise of Protestantism is considered “inseparable from the invention of the printing press and the spread of mass literacy,” so, too, does technology shape our current religions. “Anti-institutional, intuitional self-divinization is, at heart, the natural spirituality of Internet and smartphone culture,” she asserts. The Internet is ideal for finding like-minded souls, for learning about new belief systems, for purchasing the perfect herb-infused crystals. But if I wanted to invent a mechanism for thwarting the emergence of that reverent, self-forgetting mood, I doubt I could do much better than a smartphone: beckoning with temptations, by turns inflating and deflating the ego, all while buzzing with alerts of (usually bad) news from the outside world.

This is not to say that religious feeling is impossible to achieve in our high-tech society. From Burton’s piece on Weird Christianity, one gets the sense that her church’s evening prayer services are suffused with religious feeling—even though, because of the Covid-19 pandemic, they were taking place online. In Strange Rites, however, Burton refers just once in passing to her own faith before quickly adding, “But that’s a story for a different book.”

Heading in One Direction

Posted on: January 26th, 2020 by Rebecca Tuhus-Dubrow

Anyone can be a fan, and almost everyone is. Perhaps this low bar to entry explains why fans don’t get much respect: a fan is a follower, a hanger-on, one in a crowd of interchangeable masses. Then there’s the curious behaviour of the most fervent enthusiasts – from screaming Beatlemaniacs to today’s online superfans, whose defence of their idols against detractors can turn vicious. In our culture, the archetypal fan tends to be mildly pathetic at best, and at worst, downright scary.

Hannah Ewens wants to change that reputation. In Fangirls: Scenes from modern music culture, she sets out to interrogate these caricatures and to pay tribute to the richness of the fan experience. She also seeks to complicate the usual hierarchy, pointing out the dependence of artists on their fans. “I wanted to look (and with care) away from the stars themselves towards the people who gave them any luminescent quality”, she writes. In particular, she focuses on teenage girls, partly because they make up such a large portion of many fanbases – their devotion and money are indispensable to the system – and in part because they bear the brunt of public ridicule.

To research the book, Ewens embarked on a kind of tour of contemporary fandom. She camped out on London pavements in front of music venues with girls determined to secure a spot in the front row; fans, she notes, have made waiting “a special experience that’s as fun, often, as the thing itself”. She watched Crazy About One Direction, an infamous documentary about obsessive One Direction fans, more than ten times, and didn’t see “anything extraordinary or necessarily unhealthy in the girls’ behaviour”. She interviewed “Little Monsters”, Lady Gaga’s superfans, some of whom were lucky enough to meet their idol in person: “They are overwhelmed by her physical form, the fleshy reality of her. It’s the one part they haven’t had access to yet, because Gaga puts so much else out there for her fans”.

Among these dispatches, Ewens intersperses historical context and insights from academic literature on fandom. One of her achievements is to reveal what has remained constant over the decades and what, as a result of various technological and cultural shifts, has changed. In one passage, she examines the “sexual pack behaviour” that teenage girls have often performed. Citing work by Barbara Ehrenreich and others on Beatlemania, Ewens observes that when girls in the 1960s fetishized boy bands “they could vocalise any sexual desire they had in ways they never could before. It just took a crowd of them to feel normal”. This phenomenon takes another form in the twenty-first century: girls typing out lewd comments to their idols on Twitter or Instagram. “On an individual level they don’t want to reveal themselves as a fan who would ‘cross the line,’ but as part of a stream of comments they’ve all made together it’s clearly acceptable”, Ewens writes. “Taken as a mass, the sexual braying feels benign.”

The author is an energetic reporter, and she was wise to recognize fandom as a fertile area of inquiry. Yet while she interviewed hundreds of women and girls, and her reporting took her as far as Japan, Ewens rarely includes vivid descriptions of either people or places. What does come through strongly is the nature of fandom itself: it is a potent blend of identity, community, fantasy and the intense responses people have to music they love. “To be a fan is to scream alone together”, Ewens concludes. “To go on a collective journey of self-definition. It means pulling on threads of your own narrative and doing so with friends and strangers who feel like friends.”

Ewens’s resolutely celebratory approach does not leave her much room to explore “toxic fandom”. But if she declines to judge unhealthy behaviour, she seems not to partake in it either. She comes across as introspective and restrained. At one point, it looks like she might have the opportunity to meet her own idol, Courtney Love, but she realizes that she doesn’t want to. “My adoration had almost transcended any need to meet her … I had to protect everything that she had been for me.” Reflecting on her adolescence, she recognizes the psychological usefulness of her fantasies about Pete Wentz of Fall Out Boy and other male musicians: “They – crucially – didn’t know me, so couldn’t appear to ruin my mythology of them, disrespect me or worse, reject me”.

With these astute perceptions, Hannah Ewens reminds us that fans are not passive and predictable recipients of an artist’s greatness. Our idols give us their music and their personas; we bring to these our own desires and projections, and we take what we need. Sometimes the stars themselves are almost beside the point – sometimes they are the ones who could almost be interchangeable.

Equipment for Living

Posted on: June 7th, 2018 by Rebecca Tuhus-Dubrow

In December of 1934, an unemployed stockbroker named Bill Wilson checked himself into Towns Hospital in Manhattan. He had a habit of consuming more than two quarts of whiskey per day, and his wife had implored him to get help. The doctor gave Wilson an extract of belladonna, a plant with hallucinogenic properties, which at the time was an experimental treatment for alcoholism. That afternoon, the “room blazed with an indescribably white light,” Wilson later wrote. A vision of a mountain came to him. “I stood upon its summit where a great wind blew…. Then came the blazing thought, ‘you are a free man.’”

Bill Wilson never drank again. He went on to found Alcoholics Anonymous, the grassroots organization that has helped millions of people achieve and sustain sobriety. The story of Wilson’s spiritual awakening figures prominently in AA mythology. The part about the preceding drug dose does not.

Wilson’s dabbling in psychedelics—including later experiments with LSD—comes up in two new books: Leslie Jamison’s The Recovering, a memoir of drinking and quitting intertwined with literary and cultural criticism, and Michael Pollan’s How to Change Your Mind, an exploration of the awesome powers of psychedelics to enrich human consciousness. Many other authors have covered similar ground, but Pollan and Jamison bring to bear singular gifts. They are, in some ways, very different writers: Pollan is at heart a journalist oriented toward the world; Jamison, trained as a fiction writer, is drawn to her own psyche for material. But both deftly synthesize research and their own experiences into finely crafted narratives that give new life to these familiar themes.

These authors approach mind-altering substances from apparently opposite perspectives. Jamison shows how they can destroy lives and how to escape their thrall; Pollan focuses on their potential to transform lives for the better. As the story of Bill Wilson suggests, however, unexpected connections arise between the two books. Taking drugs and recovering are not always as incompatible as they seem.

+++++

Leslie Jamison came to widespread attention in 2014 with the publication of her essay collection The Empathy Exams. The essays mined episodes from Jamison’s life—her abortion, her heart surgery, the time she was punched in the nose by a guy on the street in Nicaragua—for insights into suffering and what it means to try to feel the pain of others. The collection also included journalistic pieces; for one, she spent time with sufferers of a disease called Morgellons, which causes its victims to believe they have mysterious fibers emerging from their skin. Jamison sought not simply to extol empathy but to grapple with its limits and vanities. As one representative sentence put it, “empathy is always perched precariously between gift and invasion.”

Jamison, who had previously published a novel, The Gin Closet, was hailed as an heir to Susan Sontag and Joan Didion, and The Empathy Exams is undeniably impressive. Nearly every page is dense with insight, expressed in tautly constructed sentences. Sometimes reading it feels like sitting in on a therapy session with a hyper-introspective, hyper-articulate patient—an appraisal some might read as pejorative, though I don’t intend it that way. But it has shortcomings common to many essay collections. While the volume is ostensibly knit together by the themes of pain and empathy, some pieces, such as a brief account of a trip to a writers’ conference, feel like filler. The book can also come across as overly performative. It is easier to admire than to enjoy.

The Recovering covers some of the same autobiographical territory as The Empathy Exams. We hear again about Jamison’s abortion, her heart surgery, the time she was punched in the nose by a guy on the street in Nicaragua. We’re back on the couch with the consummate analysand. (“I wanted to be loved because I deserved it. Except I was scared to be loved like this, because what if I stopped deserving it? Unconditional love was insulting, but conditional love was terrifying.”) Yet this time, the vignettes and self-scrutiny are presented in a more straightforward memoir, fleshed out with context and with the narrative propulsion that chronology bestows. Her prose, meanwhile, has become looser, freer, and funnier.

The humor often comes at her own expense. While working a post-college job at a bed-and-breakfast, she sneaks wine intended for the guests. “I never thought of this as drinking on the job, although strictly speaking—or really any way of speaking—it was,” she writes. Elsewhere, she recounts anticipating reactions the first time she told her story at an AA meeting. “People would compliment my story or the way I’d told it, and I’d demur, Well, I’m a writer, shrugging, trying not to make too big a deal out of it.” Instead, midway through her earnest account, a half-senile old-timer interrupts, “This is boring!”

Perhaps part of the reason he found her story boring is that there was no obvious trauma or other hardship that led her to the balm of booze. Her alcoholism was almost tautological: She needed to drink because she needed to drink. Starting at the University of Iowa, where she enrolled in the MFA program; then in New Haven, where she moved for a PhD program in English at Yale; and then back in Iowa, with some travel to Central America along the way, “intoxication,” she writes, “had become the feeling I was most interested in having.”

Nor were there any catastrophic consequences for Jamison. She maintained loving relationships with her family. She published a novel and continued to amass fancy credentials. During this time, she had a long-term relationship with a fellow grad student, and while her drinking caused tension, their relationship was relatively stable for years before they amicably separated. A recent profile of her in New York magazine was titled “Where’s the Train Wreck?”

Why, then, did Jamison need to quit? It was, she suggests, a matter of sovereignty over herself. “My shame about drinking wasn’t mainly about embarrassment at what I did when I was drunk,” she writes; “it was about how much I wanted to get drunk in the first place.” She drank because she needed to drink; she quit for the same reason.

Into the story of her addiction and recovery, Jamison weaves those of others, especially writers like Charles Jackson, John Berryman, Denis Johnson, and Jean Rhys. She also pays attention to noncelebrities: women convicted on drug charges in Arizona and forced to work on a chain gang in the extreme heat; alcoholics who spent time in a ragtag recovery center called Seneca House. The idea, she explains, is that all of these stories will collectively bear some resemblance to an AA meeting.

Through these stories, Jamison explores how addiction gets refracted through race and gender. White male alcoholic writers have often been lauded as tortured geniuses. White women are typically denied that status, but their substance use does often get them cast as wounded and interesting. People of color with substance-use issues, by contrast, are more likely to be depicted as criminals than as victims. These general observations are not new, but Jamison’s critique adds depth and nuance: “The crack mother was the negative image of the addict genius: She wasn’t someone whose dependence fueled her creative powers. She was someone whose dependence meant she’d failed to create the way she was supposed to.”

While this taxonomy shows how our culture divides addicts, AA meetings, in Jamison’s account, work a reverse alchemy: They bring together people of different demographics and classes. As a graduate student at Yale, Jamison finds herself at meetings with homeless men and sorority girls. In AA, social background seems to lose some salience, as does individuality. In the stories that make up the heart of the meetings, the parallels stand out to fellow AA members much more than the differences.

As a writer who had always been taught to prize originality, Jamison initially chafes against this emphasis on sameness. She wants her contribution to shine. She also cringes at the frequently invoked catchphrases: “Feelings aren’t facts” or “Sometimes the solution has nothing to do with the problem.” Ultimately, however, she comes to see the value of both of these aspects of AA. Clichés, she realizes, can serve the same purpose as mantras or prayers; their familiarity is a source of solace. They point to another way in which the individual can recede. “You weren’t responsible for what got said, because you were all parts of a machine bigger than any one of you…. Clichés were the dialect of that machine, its ancient tongue.” As for the repetitiveness of the testimony, Jamison begins to cherish the resemblances between her story and those of others who shared the same struggles and overcame them. “Our stories were valuable because of this redundancy, not despite it.”

In writing The Recovering, Jamison reveals, she wrestled with these challenges: not only how to tell a story that has been told many times before, but how to reconcile her literary impulse for originality with her newfound appreciation for the virtues of clichés and redundancy. Part of her answer is to incorporate this conundrum into her inquiry. She salutes the value of unoriginality but does not embody it. Her analytical sharpness and assiduous attention to words are the very reverse of reaching for the nearest truism.

+++++

A running theme throughout The Recovering is the relationship of alcohol to truth. “In vino veritas was one of the most appealing promises of drinking: that it wasn’t degradation but illumination, that it wasn’t obscuring truth but unveiling it,” Jamison writes. For her, at least, that promise proved illusory. But as Pollan’s book argues, psychedelics really can deliver illumination. While they have acquired associations with visual hallucinations, users overwhelmingly report that they don’t distort reality so much as reveal it for the first time. The other, related hallmark of the psychedelic experience is the dissolution of the ego, the melting of boundaries between the self and the world. These two features make psychedelic trips revelatory, sometimes mystical experiences that can affect their beneficiaries for years.

Pollan’s oeuvre is usually associated with food, but his subject, really, is broader: the intersection of humanity and nature. His second book, A Place of My Own (1997), took on building and architecture: how people convert the planet’s materials into shelter. And now his latest book explores how certain earthly substances can change our consciousness in astonishing ways. (LSD, which we think of as the most “synthetic,” originates in a fungus known as ergot, Pollan reports.)

Pollan always researches his subjects exhaustively and doesn’t shy away from getting his hands dirty, often literally. For his book on architecture, he built a hut in his backyard; for The Omnivore’s Dilemma, he shot a pig. For this book, it was probably inevitable that he would seek to acquire firsthand knowledge of the wonders of psychedelics, although doing so pushed him outside his comfort zone. “I generally prefer to leave my psychic depths undisturbed, assuming they exist,” he writes. Still, he overcame his trepidation to embark on several psychedelic “journeys.”

Pollan has also long demonstrated an enchanting facility with the English language and a knack for conjuring the offbeat characters he encounters in his research and reporting, from Johnny Appleseed to the entrepreneurs of organic farming. As opposed to Jamison’s quotable one-liners, his gifts manifest in a playfulness whose magic accretes over paragraphs. This virtuosity and charisma are less evident in How to Change Your Mind (starting with the title, which is no Omnivore’s Dilemma). Perhaps ironically, given the topic, the writing is more, well, sober. But it is always lucid, and there are parts—such as his portrayal of an eccentric mycologist who considers mushrooms to be a virtual panacea for the world’s ills—where his old mischievous charm reappears.

Pollan starts by reviewing what he calls a “renaissance” in the study of psychedelics. A rich body of research was conducted by scientists in the mid–20th century. But after Timothy Leary famously urged an entire generation to drop acid in the 1960s, and the drug escaped the bounds of the lab, panic ensued. Before long, federal funding dried up for research on these substances, which were now seen as unacceptably subversive.

Starting in the early years of this century, however, the US government began to quietly sanction new research into these drugs. The new studies have corroborated the findings of past work and extended them, revealing the power of psychedelics to ease the fear of dying, to break addictions, to overcome depression, and to occasion spiritual experiences in that part of the population known as “healthy normals.” Crucially, the subjects in these experiments take the drugs under controlled conditions intended to maximize the likelihood of a “good trip.” They do so in comfortable rooms, with vaguely New Age interior design, often lying on couches, wearing eyeshades, and listening to music. Most important, their trips are overseen by trained guides who gently give instructions such as “Trust the trajectory” and “TLO—Trust, Let Go, Be Open.”

Pollan interviews a number of subjects and researchers, and they unanimously rhapsodize about their drug-induced odysseys. A researcher named Bill Richards tells Pollan: “‘Awe,’ ‘glory,’ and ‘gratitude’ were the only words that remained relevant.” Like Jamison, Pollan sometimes winces at the clichés he encounters, though he recognizes that the problem lies partly in the inadequacy of language to capture these ineffable experiences. Sometimes the people providing the reports are themselves self-conscious about this. One researcher wrote: “I have at times been almost embarrassed by them, as if they give voice to a cosmic vision of the triumph of love that one associates derisively with the platitudes of Hallmark cards…. Love conquers all.”

Pollan came of age in the 1970s, in the midst of the LSD backlash, and his exposure to psychedelics was limited to a couple of mild trips on mushrooms. Now approaching 60, he takes a series of trips, all but one under the supervision of underground guides. (He had hoped to participate in a study, but a suspension of research in “healthy normals” eliminated that option.) While apprehensive, he is reassured by his research: Psychedelics are actually very safe; there is no known fatal dose, nor are they addictive.

As we might expect from a writer of Pollan’s caliber, his accounts of his trips largely avoid the generalities and platitudes that characterize the typical descriptions. He tries valiantly to chronicle his experiences with fidelity and specificity. “The word and sense of ‘poignance’ flooded over me during the walk through the garden,” he writes of a mushroom trip. “[O]ne’s usual sense of oneself as a subject observing objects in space—objects that have been thrown into relief and rendered discrete by the apparent void that surrounds them—gave way to a sense of being deep inside and fully implicated in this scene, one more being in relation to the myriad other beings and to the whole.”

Pollan goes on to have more intense experiences at higher doses. He is flooded with love for his family, compassion for various people from his past (his beleaguered fourth-grade music teacher makes an appearance), and gratitude not just for his life but “for the very fact of being, that there is anything whatsoever. Rather than being necessarily the case, this now seemed quite the miracle, and something I resolved never again to take for granted.”

Beyond the insights commonly delivered by psychedelics, Pollan arrives at several conclusions of his own. He learns that one unusual aspect of the effects of these drugs is their durability. It’s not the chemical reaction that matters; it’s the resulting experience, which, afterward, remarkably, continues to seem legitimate. Users consistently believe, after the chemical has worn off, that they’ve been granted access to great truths, and often the revelations stay with them and change their lives in profound ways. To increase the odds of such outcomes, Pollan comes to believe in the critical importance of a set of rituals, guidelines, and authorities to direct the powerful experiences unleashed by these molecules. Indeed, other societies that sanction the use of psychedelics have typically put such protocols into place. The imperative to do so, he realizes, might have been the key lesson of the 1960s.

Pollan also revises his understanding of the word “spiritual.” He had always associated it with a belief in the supernatural, which he didn’t, and still doesn’t, possess. But his psychedelic excursions showed him the possibility of transcendence that required no faith; it was a matter of seeing and feeling more deeply and of loosening the grip of the ego. “The usual antonym for the word ‘spiritual’ is ‘material,’” he writes. “Now I’m inclined to think a much better and certainly more useful antonym for ‘spiritual’ might be ‘egotistical.’”

Finally, another peculiarity of psychedelics, Pollan shows, is that they often lead their enthusiasts to become evangelical about their potential usefulness for all of humanity. This impulse makes sense, and not just from an altruistic perspective. After all, people convinced of the unity of all beings and the supreme importance of love don’t typically become terrorists or Twitter trolls. “I believe this could revolutionize mental health care,” one researcher tells Pollan, an opinion prevalent among psychedelic researchers.

+++++

For many who are familiar with psychedelics, it is intuitive that they could help ease anxiety about dying and lift depression. Less intuitive is the notion that a drug might hold the key to surmounting addiction. And yet psychedelics seem to hold great promise in that regard as well. The mechanism appears to be a kind of “reboot of the system—a biological control-alt-delete,” one researcher says. A potent experience can shake addicts out of ingrained mental patterns and grant them new flexibility, while putting the cravings of the self into a larger perspective.

Pollan speaks to several participants in a smoking-cessation study, which offered cognitive-behavior therapy followed by the administration of psilocybin (the active ingredient in “magic mushrooms”). It was a small study, but the results were striking. Six months after their trips, 80 percent of the participants had not resumed the habit. (A year later, this figure had dropped to 67 percent—still better than the results obtained by established methods.) One participant told Pollan: “It put smoking in a whole new context. Smoking seemed very unimportant; it seemed kind of stupid, to be honest.” As for alcoholism, the evidence is similarly intriguing, although more research is needed. In the 1950s through the ’70s, thousands of alcoholics received psychedelic treatment, but many of the studies had flawed designs. A 2012 meta-analysis of the best studies, however, did find a “significant beneficial effect on alcohol misuse” from one dose of LSD, lasting up to six months.

Here we return to the parallels between psychedelics and AA, some of which Pollan notes. Both involve a recognition of a power beyond the self (not necessarily supernatural); both encourage a diminution of the ego and an embrace of connection with others. An integral part of AA is helping others to achieve sobriety, just as the evangelists for psychedelics seek to promote the benefits of these extraordinary molecules.

But, of course, psychedelic trips and the work of a 12-step program are also very different. A trip on mushrooms or LSD is passive: You feel that “the doors of perception,” as Aldous Huxley famously put it (borrowing a line from William Blake), are opening for you. And this state of mind is not sustainable; even if the insights can stay with us, it would not be practical to cry with joy all day as we floss our teeth and drive to work and help our kids with math homework. AA, by contrast, is mundane and involves effort—sometimes very painful effort. It’s about showing up even when you don’t want to. It’s about drinking bad coffee in unpleasantly lit church basements. It’s about going through the motions on the days when you’d really rather knock back a martini or six. It’s about realizing that external actions are sometimes more important than your internal mind-set—and that the former can change the latter. As Jamison beautifully puts it, “Action could coax belief rather than testifying to it.”

A distinction is frequently drawn between religion and spirituality, two different but overlapping spheres. In this context, it seems to me that psychedelics embody a certain form of spirituality—direct access to revelation, a realm where words are both inadequate and unnecessary—while AA typifies religion, meaning a set of rules and rituals performed in the context of a supportive community. In a religion, words are essential: the text of the sacred scriptures (AA’s Big Book, the 12 steps, the sayings) as well as the primary means of communicating with co-religionists (recovery stories).

Perhaps that’s the lesson we can derive from both of these books: We need the epiphanies and the rites, the inward reflection and the community, and perhaps part of the problem with modern life is that these are so often missing. The paths of these two authors may differ, but both offer us some equipment for living in a fuller and more authentic fashion. Psychedelics are not the only route to mystical experience, but they provide an unusually reliable introduction to that state of mind. AA tells its members to acknowledge the limits of their autonomy, to commit to unsparing honesty with themselves, to dedicate their lives to helping others. We could all do worse than to live by these principles—even those of us who can enjoy a single glass of Pinot Grigio and call it a night.

An Ad Hoc Affair

Posted on: February 6th, 2017 by Rebecca Tuhus-Dubrow

In 1956, Jane Jacobs was 39 years old, working as a staff writer at Architectural Forum. Her boss, unable to attend a conference at Harvard, asked her to go in his stead and give a talk on land banking. Jacobs, skittish about public speaking, reluctantly agreed, on one condition: that she could speak on a subject of her choice.

That subject, it turned out, was the utter wrongheadedness of many of the ideas cherished by her audience, the era’s luminaries of urban planning. The prevailing wisdom at the time held that “urban renewal” required clearing “slums” and starting over. The rebuilt cities would tidily disentangle residential and commercial areas and include plenty of open space. These ideas may have looked good in architectural drawings, but in real life, Jacobs had come to believe, they were a formula for lifeless monotony.

In East Harlem, she noted, 1,110 stores had been razed to make way for housing projects. Jacobs argued that these little shops couldn’t simply be replaced by supermarkets. “A store is also a storekeeper,” she said. Stores were not just commercial spaces; they were also social centers that “help make an urban neighborhood a community instead of a mere dormitory.” Even empty storefronts had a function, often sprouting into clubs, churches, or hubs for other civic activities. When these spaces were destroyed, the community was gravely wounded.

This speech, a turning point in Jacobs’s career, appears in Vital Little Plans, a new collection of her short works; Robert Kanigel’s new biography of Jacobs, Eyes on the Street, fills in the context. The men in the room—including the mayor of Pittsburgh, the head of the New York City Housing Authority, and The New Yorker’s architecture critic, Lewis Mumford—took the rebuke remarkably well, and Jacobs won some distinguished admirers. The speech was also the germ of what became her masterpiece, The Death and Life of Great American Cities (1961), one of the seminal books of the 20th century.

The talk demonstrated Jacobs’s trademark strengths: her clear-eyed vision, her pungent language, her sheer gutsiness (she may have dreaded public speaking, but she never hesitated to tell her betters what she thought). It also reflected her slippery political orientation. Jacobs was celebrating commerce and condemning government overreach in the form of public housing, and thereby showing some sympathy with the values of the right. Yet she was doing so on behalf of low-income people who, she believed, had been ill served. Like any good leftist, she was defending the underdogs: the mom-and-pop stores as well as the residents of these projects, many of whom hated their bleak housing as much as she did.

Jacobs’s unconventional politics grew out of her temperament. She was allergic to dogma; she followed not an ideology but a methodology. She did not assume, or imagine, or take things on faith; she observed. But she didn’t stop there: She accumulated observations and distilled them into general principles. For her, empiricism and theory were not opposites but complements.

Jacobs hinted at this approach near the end of her talk: “the least we can do is to respect—in the deepest sense—strips of chaos that have a weird wisdom of their own not yet encompassed in our concept of urban order.” Her great accomplishment would be to translate that “weird wisdom” into terms we could all understand.

* * *

Jane Jacobs was born in 1916 with a decidedly less euphonious name—Jane Butzner—in the coal town of Scranton, Pennsylvania. The third child of a doctor and a teacher, she was delivered by her own father. In her childhood home, her parents encouraged her inquisitive mind and accepted her rebellious streak. Jacobs read widely, wrote poems, and held imaginary conversations with interlocutors like Thomas Jefferson and Benjamin Franklin.

Despite her precocity—or more likely because of it—the young Jane was never a good fit for school. She barely made it through high school and, instead of college, took a course in stenography. Her parents had instilled in her the importance of both learning a practical trade and pursuing her calling, which she determined early on would be writing.

In 1934, during the depths of the Depression, Jacobs moved to New York, where she lived with her elder sister Betty in Brooklyn Heights. In the mornings, she took the subway to Manhattan to interview for secretarial work; in the afternoons, she wandered around the city. On one outing, she discovered Christopher Street in Greenwich Village and promptly informed Betty that they’d be relocating to that neighborhood.

After working briefly as a secretary, Jacobs enrolled in Columbia University’s extension school. Liberated to pursue her interests, she excelled in her classes—geology, economics, chemistry, constitutional law—though she never earned a degree. Around this time, Jacobs got her first break as a writer: a series of freelance assignments for Vogue. Each of these four pieces, collected in Vital Little Plans, explored a different New York industry—fur, leather, flowers, diamonds—and the neighborhood in which it thrived.

This period was followed by a succession of staff jobs: at Iron Age, a trade journal where she covered such scintillating topics as the role of nonferrous metals in the war effort; at Amerika, a propaganda magazine published by the US government and distributed in the Soviet Union, for which she started to cover urban planning; and finally at Architectural Forum.

Her personal life, meanwhile, replicated the happy stability of her childhood. In 1944, she married Bob Jacobs, a handsome architect with whom she’d have three children and enjoy a long, devoted partnership. In 1947, the pair bought a fixer-upper on Hudson Street for about $7,000. When Jacobs took the job at Architectural Forum, her husband helped her learn how to read drawings and blueprints.

Covering urban renewal for the magazine, Jacobs was initially supportive (or, as she would come to believe, insufficiently skeptical) of the ideas on which it was based. In the abstract, they had a certain internal logic. But as she reported more deeply, she began speaking with people who questioned that logic, and she noticed more disjunctions between the claims and the reality. She gradually formed her own strong opinions, and as she started to express them, she emerged fully as a writer. By the time she published her most celebrated book, she was a masterful polemicist.

Death and Life’s first sentence didn’t mince words: “This book is an attack on current city planning and rebuilding.” The heart of the work was arguably “The Conditions for City Diversity,” the second of the book’s four parts, in which she laid out the arguments that would become canonical. Against the planners who sought to neatly divide a city’s neighborhoods by use—residential, commercial, industrial, and so on—Jacobs advocated the opposite: “mixed primary uses,” or homes and stores and restaurants and offices all in close proximity. In such areas, different people were on the street for different reasons at different times of day, contributing to the vitality of the neighborhood, attracting new enterprises in a virtuous circle, and providing a continuous stream of “eyes on the street” to keep it safe.

Jacobs also advocated short blocks, with frequent opportunities for turning corners, as opposed to the “superblocks” loved by planners; buildings of different ages and conditions, in order to support a variety of ventures, including more experimental ones; and residential density, which had heretofore been seen as unwholesome congestion. All of this was conveyed in prose that was sometimes caustic, sometimes aphoristic, and always exceptionally lucid and vigorous.

After this section, Death and Life continued for another 10 chapters, discussing the “self-destruction of diversity” (what we would now call “gentrification”); “unslumming and slumming,” her pithy terms for neighborhood regeneration and decline; proposals for subsidized housing; and recommendations to “salvage” the housing projects she abhorred. It was an astoundingly ambitious book, laying down general laws drawn from sharp observation, with a healthy dose of detailed policy suggestions revealing an alertness to the practical challenges.

If Jacobs’s Harvard speech made her name in planning circles, the book launched her into broader renown. As Kanigel chronicles, The New York Times proclaimed it “a huge, a fascinating, a dogmatically controversial book.” The Wall Street Journal declared: “In another age, the author’s enormous intellectual temerity would have ensured her destruction as a witch.” But the praise wasn’t unanimous. The sociologist Herbert Gans, as well as other reviewers, pointed out several blind spots. Jacobs championed a very particular urban ideal—defined by vitality and diversity—that had no room for other virtues, such as tranquility and natural beauty. Gans also charged her with succumbing to the “physical fallacy”: overestimating the importance of factors like block length and store locations, and underestimating those like ethnicity and culture. Meanwhile, her erstwhile admirer, Lewis Mumford, wrote a New Yorker piece offering some restrained praise but also some acerbic criticism, calling the book “a mingling of sense and sentimentality, of mature judgments and schoolgirl howlers.” (His hostility may have had something to do with the fact that, despite his earlier support, Jacobs—a stranger to sycophancy—had in Death and Life called one of his books “morbid and biased.”)

* * *

During this time, Jacobs not only elucidated her vision of the good city; she invested prodigious energy in making it a reality—or, rather, in staving off its annihilation. Her Greenwich Village neighborhood found itself repeatedly embattled: threatened first by Robert Moses’s plan to ram a highway through Washington Square Park; then by a proposal to designate the West Village a slum and clear it for an urban-renewal project; and then by Moses’s proposed Lower Manhattan Expressway, which would have destroyed nearby neighborhoods.

Jacobs participated in and sometimes led the protests that succeeded in halting each of these plans. She and her neighbors did all the classic grunt work of community activism: making calls, writing letters, drawing up petitions, organizing rallies. Sometimes they dispatched their children to hang posters and gather signatures. Jacobs recalled that during the fight to save the West Village, “we just disconnected the doorbell and left the door open at night so we could work and people could come and go.”

From these skirmishes, Jacobs learned crucial strategic lessons. One was that the neighborhood had to fight even the early exploration of a “slum” designation, because the label would scare off businesses and home buyers, becoming a self-fulfilling prophecy. Another lesson, imparted by a sympathetic federal-housing official, was that the community should never express any affirmative desires, which would allow city officials to claim they’d received the residents’ input, but rather should focus single-mindedly on blocking the plans it opposed.

Though Jacobs proved herself a skillful activist, she never wanted to become one, seeing it as a distraction from her real work: writing. “I resented that I had to stop and devote myself to fighting what was basically an absurdity that had been foisted on me and my neighbors,” she once told an interviewer.

In 1968, Jacobs and her family moved to Toronto, primarily so that her two sons could escape the draft for the Vietnam War. But there, too, some of the same misguided views about urban planning were ascendant. No sooner had she settled in Toronto than she was fighting another expressway slated to be carved through her new neighborhood. Using the chops she’d acquired in New York, Jacobs helped to defeat the plan. Her presence was welcomed in her adopted city, where she became an éminence grise of urban planning.

Though Jacobs never repeated the sensation caused by her debut, Kanigel is at pains to stress that she was no “one-book wonder.” The Economy of Cities (1969) and Cities and the Wealth of Nations (1984) further developed some of the themes of Death and Life, with greater historical breadth. They advanced arguments that seem startlingly relevant, if almost clichéd, today: Successful cities are hubs of innovation, the real economic engines of nations; and places that fail to innovate—whether declining cities or rural towns—risk stagnancy and alienation. Her last book, Dark Age Ahead (2004), foresaw the disastrous bursting of the housing bubble that was swelling at the time.

To generate her ideas, Jacobs read deeply and idiosyncratically. As Samuel Zipp and Nathan Storring write in their excellent introduction to Vital Little Plans, “Jacobs tended to look at history the way she did a cityscape. She scouted around for promising examples of individual phenomena, situations in which city or economic life seemed to have been working, and then sought to understand the processes that organized these data into constructive systems.” At times, her bold hypotheses got a bit too far ahead of the evidence. A provocative theory that agriculture had its origins in cities risked appearing to reflect her pro-urban outlook as much as the historical record. (Her faith in cities was perhaps the closest she came to an ideology.) This sort of work strayed from her core strength: proceeding from close observation to general principle to practical application. On the other hand, one of her ideas—that the agglomeration of diverse industries could facilitate innovation—was plucked from relative obscurity and endorsed by the economist Robert Lucas, who would later win the Nobel Prize. Economists now call this phenomenon “Jacobs externalities.”

* * *

In the years since her death in 2006, at age 89, Jacobs has been routinely hailed as a heroine, both for her unmatched influence on our understanding of cities and for her role in helping to defeat many of Moses’s plans for New York. But her work has also been the target of criticism—partly because of her political heterodoxy, and partly as a reaction to her canonical status.

The criticisms often boil down to claims that Jacobs failed to sufficiently take into account the need for affordable housing or the threat of gentrification. Some even suggest that her work is partly responsible for gentrification, thanks to her celebration of neighborhoods like the West Village (where her old home was sold for $3.3 million in 2009).

But before accusing Jacobs of these oversights, one is advised to read her very thoroughly. A chapter midway through Death and Life, “The Self-Destruction of Diversity,” offers as good a description as any of the process by which quirky neighborhood enterprises get priced out and replaced by banks and insurance offices. Jacobs advances several policy solutions, including “zoning for diversity”: that is, limiting the proliferation of the same kinds of businesses and buildings—the reverse of the more usual purpose of zoning.

As for affordable housing, Death and Life explored in exhaustive detail a plan that Jacobs called the “guaranteed-rent method.” She proposed that housing would be created by private developers and landlords, and that the government would subsidize rents. For new construction, an “Office of Dwelling Subsidies” would guarantee the necessary financing to the builders. Tenants would be selected from applicants from a designated area. If their incomes rose, the subsidies would drop. But no level of income would be disqualifying, because Jacobs recognized the psychological implications of income-restricted housing: It caused a residence to become stigmatized and synonymous with failure. If nobody wanted to stay—if success meant getting out—it would struggle to become a vibrant or desirable place to live. This chapter is wonky and dense, and because it comes toward the end of a very long book, it’s easy to overlook, but it belies the caricature of Jacobs as an elitist libertarian.

Her attitude toward government, based on what she saw at the time, was that it could stifle as often as nourish the organic flourishing of cities and their residents. But Jacobs wasn’t against government intervention per se; nor was she against planning. She favored policies that she believed would foster life in cities, and she favored certain kinds of plans—namely, little ones. She acknowledged that some plans had to be far-reaching—a subway system, for example—but, as a rule, “Little plans are more appropriate for city renewal than big plans.” In a 1981 speech in Germany, she argued that big plans by their nature tend to be boring and lacking in visual diversity because they’re “the product of too few minds.” What’s more, big plans tend to foreclose possibilities: They’re inflexible, and “when the plans are very big the mistakes can be very big also.” After all, she concluded, “Life is an ad hoc affair.”

Today, Jacobs has not only bequeathed us a legacy of great ideas; she can also serve as an exemplar of how to approach our own formidable problems, in urban planning and beyond. To follow her lead is to look closely to determine what works and what doesn’t. It is to nurture a multitude of little plans and, not least, to do all we can to stop big plans based on bad ideas.

The Annie Dillard Show

Posted on: May 26th, 2016 by Rebecca Tuhus-Dubrow

For an epoch defined by mass attention-deficit disorder, Annie Dillard would seem to be the perfect antidote. Dillard, the author of the Pulitzer Prize–winning Pilgrim at Tinker Creek (1974), is devoted to patience and to presence. “It’s all a matter of keeping my eyes open,” she has declared. She is thoroughly and ecstatically attuned to her surroundings, willing to wait hours for a glimpse of a muskrat. Her words are painstakingly selected and arranged. The contrasts with our screen-tethered, logorrheic selves hardly need to be belabored.

A chapter in Teaching a Stone to Talk (1982), called “Living Like Weasels,” is characteristic of her approach. At a pond near her home in Virginia, Dillard finds herself face to face with a wild weasel. “He was ten inches long, thin as a curve, a muscled ribbon, brown as fruitwood, soft-furred, alert. His face was fierce, small and pointed as a lizard’s; he would have made a good arrowhead.” Such descriptive sentences—as sinewy and vigorous as that weasel—are interspersed with whimsical anecdotes (sometimes seemingly apocryphal). Once, she reports, a man shot an eagle and, examining its carcass, “found the dry skull of a weasel fixed by the jaws to the bird’s throat.” The assumption, she writes, is that “the eagle had pounced on the weasel and the weasel swiveled and bit as instinct taught him, tooth to neck, and nearly won.” Then, every so often, she slips in a more philosophical musing: “The weasel lives in necessity and we live in choice, hating necessity and dying at the last ignobly in its talons.”

Dillard hasn’t written a book since the appearance of her novel The Maytrees in 2007, but now she has curated a new compilation, The Abundance, which includes the weasel essay. Though the book is subtitled Narrative Essays Old and New, its most recent contribution (and apparently the only one not previously published in book form) is a 2002 essay from the literary journal Image. The Abundance is really a kind of greatest-hits collection culled from Dillard’s most famous books: excerpts from Pilgrim and Teaching a Stone to Talk, An American Childhood (1987), and The Writing Life (1989), among others. The pieces range from a brief, powerful account of a total solar eclipse to a long, somewhat cryptic essay that seems to compare attending church with visiting the Arctic. Cynically, the book could be seen as an attempt on the publisher’s part to squeeze some sales out of Dillard’s literary reputation. But it could also be read more generously: as a welcome occasion to discern the themes common to her work over time and to take stock of her legacy.

* * *

Annie Dillard (née Doak) was born in 1945 in Pittsburgh, then a thriving steel town ruled by a WASP elite to which her family belonged. She was the eldest of three girls, and her parents, for whom she demonstrates great affection in her childhood memoir, were wholesome bon vivants; they liked to tell elaborate jokes and throw parties where the entertainment consisted of bringing out a one-man percussion band. Hers was a world in which people knew and cared whether you were Protestant or Catholic, Italian or Irish. The neighborhood belonged to the children. “My mother had given me the freedom of the streets as soon as I could say our telephone number,” Dillard writes in An American Childhood.

In Pilgrim, she recalls an impulse she occasionally indulged at age 6 or 7: She would hide a penny along a certain stretch of sidewalk, at the bottom of a sycamore or in a hole in the concrete, then draw arrows with chalk leading to the coin from both directions. “I was greatly excited,” she writes, “at the thought of the first lucky passer-by who would receive in this way, regardless of merit, a free gift from the universe.”

When she was in her 20s, Dillard lived in Virginia’s Roanoke Valley, whose waterways and wildlife she would vigilantly observe in Pilgrim. She made it plain that she took her inspiration from another pilgrim’s sojourn, at Walden Pond: “I propose to keep here what Thoreau called ‘a meteorological journal of the mind,’” she wrote. But in contrast to Walden, Dillard’s book is starkly free of context. She shares no memories of her past, no statement of her motives, no description of the house she inhabits. (This silence is explained in part by her circumstances. According to a recent article in The Atlantic, she was living with her husband at the time, but the book is a portrait of her exploration of solitude.) Unlike Thoreau, who holds forth on topics ranging from patched pantaloons to the spread of the railroad, Dillard offers no opinions about society.

Instead, we are privy strictly to her perceptions of the natural world and the reflections they provoke. It’s a realm where coins are everywhere discoverable, hidden only to those who willfully disregard the arrows. At Tinker Creek, “There are lots of things to see, unwrapped gifts and free surprises. The world is fairly studded and strewn with pennies cast broadside from some generous hand.” For Dillard, the world is crowded with marvels: sand grains, spiders, mist at dusk. As she repeatedly stresses, the marvels are unmerited, as opposed to rewards that we somehow earn. They are, in the religious diction she often favors, a matter of grace.

Dillard’s great talent is converting those moments into language. Her debut, Tickets for a Prayer Wheel (1974), was a book of poetry, and those origins show in her prose. Throughout her work, the sentences are sharp and vivid, peppered with unexpected adjectives and with obscure, lovely nouns that sound like they come from some extinct Anglo-Saxon dialect: alpenglow, loess, chert. Her prose is almost completely purged of familiar phrases and clichés. When one does appear, she often twists it, taking it literally and bringing the dead language back to life. During the solar eclipse, she writes, it was “as dark as night, and eerie as hell,” which I take as an actual invocation of the netherworld. In a similar vein: “I’m blind as a bat, sensing from every direction only the echo of my own thin cries.”

In his incisive foreword to this new collection, Geoff Dyer praises Dillard for avoiding the “three ills of which nature writers should live in permanent dread: preciousness, reverence, and earnestness.” In my view, Dillard doesn’t avoid reverence, nor should she. But it’s true that she is often irreverent, in a way that can be surprising, even somewhat shocking.

Refusing to limit herself to passive observation, Dillard is not averse to disturbing the life she encounters. She wouldn’t, you suspect, heed the pleas of the signs we see at state parks and on hiking trails to leave only footprints and take only memories. She collects praying-mantis egg cases and takes them home. She tries to scare frogs. After reading that ancient Romans believed that echoes could kill bees, she goes for a walk and tries (without success) to test this theory.

Dillard displays a scientist’s penchant for meddlesome experimentation, and it can seem scandalous. Then again, in the course of daily life—driving, flying, shopping—most of us inflict far more damage on the animal kingdom. The harm is merely more indirect than Dillard’s micro-interventions, and we don’t see the consequences, or own up to them. On the rare occasions that we go camping or hiking, we may obey the rules, but our shyness with nature stems from our lack of intimacy with it. In this way, Dillard reveals her kinship with Thoreau. When chopping wood for his house, he writes in Walden, “I was more the friend than the foe of the pine tree, though I had cut down some of them, having become better acquainted with it.”

Similarly, Dillard sometimes comes across as strikingly blasé about suffering. On a trip to a South American village, she relates, she and her party saw a deer struggling to escape from a rope trap. “Its skin looked virtually hairless, in fact, and almost translucent, like a membrane. Its neck was no thicker than my wrist; it had been rubbed open on the rope, and gashed…. The raw underside of its neck showed red stripes and some bruises bleeding inside the skin.” Later, she eats meat from a deer that had been caught in the same manner the previous day. “It was good,” she notes. “I was surprised at its tenderness. But it is a fact that high levels of lactic acid, which builds up in muscle tissues during exertion, tenderizes.” She seems intent on communicating her lack of faintheartedness, her lack of illusion about pain and the satisfactions that depend on it.

* * *

Broadly speaking, essayists tend to fall into one of two camps. The first is epitomized by Montaigne, who is said to have invented the essay form. This kind of essayist arrives at conclusions (if at all) through the course of writing; inquires and probes; invites the reader on an internal, sometimes circuitous journey. As Phillip Lopate—a disciple of this tradition—recently put it in The New York Times Book Review, “The great promise of essays is the freedom they offer to explore, digress, acknowledge uncertainty; to evade dogmatism and embrace ambivalence and contradiction; to engage in intimate conversation with one’s readers and literary forebears; and to uncover some unexpected truth.”

Another kind of essayist already holds a clear and firm belief system, and the essay’s act of discovery comes from applying that outlook to various subjects. Thoreau is one such essayist. The late Ellen Willis is another; her posthumous 2014 collection showcased her enduring commitment to a specific strand of feminism, defined by freedom and pleasure. Dillard, too, is of this second camp—for better and for worse.

The virtue is that her philosophy has a lot to recommend it. We could all stand to be reminded that the world is full of freely given beauty, that rapture is available, and that suffering is nonetheless real. She is wise to insist that we ought to pay attention and “discover at least where it is that we have been so startlingly set down, if we can’t learn why.”

There are, however, troubles with Dillard’s approach. You start to feel a little pummeled by all the exhortations to be astonished. While, for the most part, she represents the opposite of our narcissistic culture, her continual announcements of rapture can seem rather preening. “I am a fugitive and a vagabond, a sojourner seeking signs,” Dillard writes in Pilgrim. “I live in tranquility and trembling.”

Her bravado is at once impressive and off-putting, and it erects a barrier between her and the reader. Put flatly, Dillard is not “relatable” or “likable.” To be sure, likability doesn’t equal literary merit, and she deserves respect for eschewing the tricks that essayists and memoirists use to ingratiate themselves with the reader. But she exudes a whiff of arrogance that can hinder both genuine inquiry and genuine intimacy, as when she casually dismisses the kind of novel that “aims to fasten down the spirit of its time, to make a heightened simulacrum of our recognizable world in order to present it shaped and analyzed.” She goes on, “This has never seemed to me worth doing, but it is certainly one thing literature has always done.”

Dillard is perhaps most interesting, then, when she seems to contradict herself. Her inconsistencies come most often in her meditations on pain, usually intertwined with her reflections on religion. She is not always nonchalant about suffering; at times, she expresses precisely the opposite attitude. In An American Childhood, she recounts that, as an adolescent, she quit the church by sending a letter to the minister, dismaying her parents. The reason for her disillusionment was the utter failure of religious texts to explain suffering: “They offered a choice of fancy language saying, ‘Forget it,’ or serenely worded, logical-sounding answers that so strained credibility (pain is God’s megaphone) that ‘Forget it’ seemed in comparison a fine answer.”

Certain episodes of suffering haunt her. When she was a child, one of her schoolteachers put the cocoon of a huge Polyphemus moth in a Mason jar so that the students could watch it emerge. But something went wrong: The moth didn’t have enough room to spread its wings and was permanently crippled. “He was a monster in a Mason jar. Those huge wings stuck on his back in a torture of random pleats and folds, wrinkled as a dirty tissue, rigid as leather.” Later, at recess, the young Annie saw that it had been let out, onto the driveway next to the playground. “Someone had given the Polyphemus moth his freedom, and he was walking away.” There is something devastating about her word choice here: Insects don’t walk; they fly or they crawl. The image of the walking moth lets us see its disfigurement, while also making it (or, in Dillard’s pronoun, “him”) feel poignantly human.

While this vignette could be read as a parable about humans meddling with nature, Dillard doesn’t dwell on that lesson. Indeed, unlike many contemporary environmental writers, she typically focuses more on the cruelty of nature than the cruelty done to it. She once saw a giant water bug eat a frog alive—sucking its body through a puncture and leaving its skin “formless as a pricked balloon.” This scene also obsesses her, but she eventually reaches a sort of reconciliation:

Cruelty is a mystery, and a waste of pain. But if we describe a world to encompass these things, a world that is a long, brute game, then we bump against another mystery: the inrush of power and light, the canary that sings on the skull…. [T]here seems to be such a thing as beauty, a grace wholly gratuitous.

It is telling that she singles out not love or compassion as the compensation for suffering, but rather beauty. This emphasis dovetails with her conception of the world, and of nature, as a “show,” another word that recurs frequently in her work: “We watch television and miss the show.” In the Roanoke Valley, “New shows roll in from over the mountains and the magician reappears unannounced from a fold in the curtain you never dreamed was an opening.”

Dillard’s writing is also much like a show, often a beautiful one. In felicitous language, she enables us to see the world afresh. But there is always a distance, a sense of performance, and this feeling is reinforced by the curious paucity of people and relationships in most of her work. Pilgrim features only a handful of neighbors, essentially as extras; even the affectionate portraits of her parents in An American Childhood feel like sketches.

But Dillard’s aloofness may be inseparable from what makes her extraordinary: her capacity for solitude and elation, her borderline arrogant detachment from ordinary life and society. As she writes of Tinker Creek, “There are the waters of beauty and mystery…. And these are also the waters of separation. They purify, acrid and laving, and they cut me off.” Alone in her Eden, she ranges “wild-eyed, flying over fields and plundering the woods, no longer quite fit for company.”

Impurity

Posted on: December 7th, 2015 by Rebecca Tuhus-Dubrow

THEY WARNED US about this. In California, the future has arrived in the form of desiccated land, 100-degree autumn days, and freakish fires that burned more than 300,000 acres in 2015. In Oklahoma and Texas, this year brought record deluges of rain, while severe drought in the Middle East has fueled the refugee crisis. As Al Gore is fond of saying, “Every night on the television news is like a nature hike through the Book of Revelation.” And yet, for those of us watching from the comfort of our climate-controlled living rooms, these extremes and calamities coexist surreally with ordinary life. That’s what we didn’t, perhaps, understand: that there would be no before and after to catastrophic climate change, that the dystopian could be so cozy with the quotidian.

This disorienting era has become known as the “Anthropocene,” a term you may have noticed popping up with increasing frequency of late. First popularized in 2000 by atmospheric chemist Paul Crutzen, it refers to the age of human influence over nature on a planetary scale. The term is a little too fashionable these days, but it is useful because it assigns a name to something we all uneasily recognize: a world whose rules are fundamentally different from the ones we were taught. To begin to understand this world, we need more than climate models and glacier measurements. To adapt to it, we need more than seawalls and desalination plants. We need to come to grips with it intellectually, philosophically, emotionally. To do so, we need help. We need books.

Fortunately, a number of authors have stepped up to oblige. The bibliography on life during climate change has swelled in recent years, encompassing work by novelists, philosophers, and literary critics. This fall brings two new entries: After Nature by Jedediah Purdy and Learning to Die in the Anthropocene by Roy Scranton. They are very different, in certain ways opposite. But each offers a powerful reckoning with our bewildering present.

¤

The explicit thesis of Purdy’s After Nature, subtitled A Politics for the Anthropocene, is that we must fortify democracy to survive and thrive in the world we have made. The heart of the book, though, is not political but philosophical. Its great value lies in its sophisticated, lucid study of the evolving American environmental imagination. Purdy, a Duke law professor best known for his 1999 debut For Common Things, brings impressive intellectual and literary chops to bear on a history of American attitudes toward nature, and how those attitudes have manifested in tangible modifications of the air, land, and water. “Law is a circuit between imagination and the material world,” he writes, in one of the book’s many beautiful sentences. The book aims to show how our shared philosophical premises inform our laws, our behavior, and ultimately our world.

Purdy divides the nation’s past relationship to nature into four phases. The first phase he names is the providential, characterized by the belief that God bequeathed us land to settle and transform. Thus in the 19th century, as settlers carved out the frontier, nature on its own was seen as “incomplete”; it took human labor to make it useful and law to turn it into private property. (Native Americans’ notions of collaborating with the land were judged invalid.) Next came the Romantic perspective, exemplified by John Muir, which inverted the providential view. Romantics found the greatest beauty in untouched land, and their efforts led to the preservation of many wild spaces. Purdy is highly critical of this school of thought, faulting it as too remote from both ordinary people and ordinary life: in short, he writes, it translates too easily into an ethos of “vacation and consumption.” A third, utilitarian credo, championed by Theodore Roosevelt among others, sought to manage the country’s forests and other resources responsibly. The goal was worthy, but utilitarians framed this management as a technocratic task that fell to experts: they believed in “administration rather than democracy.” More troubling still, they saw human populations too as resources to be managed, and many of them were ardent eugenicists. Finally, the ecological doctrine, emerging in the early 1960s, stressed the oneness and interdependence of all life, epitomized by Rachel Carson’s revelation in Silent Spring that pesticides permeated far beyond their intended targets. The resulting popular concern precipitated major legislation such as the Clean Air and Clean Water Acts, representing a new kind of environmental law, one that focused more on limiting pollution than regulating land use.

And now? Purdy declines to coin a name for the contemporary American attitude toward nature, possibly because there is no single overriding one. But he does offer a thoughtful inquiry into what an Anthropocene attitude might optimally look like. Today, Purdy writes, we are (or should be) in the process of shifting to a postecological way of thinking:

[P]ristineness and pollution, ecological connection and technological alienation, are blended and are matters of degree … Paradox, partiality, and the mixed-up character of everything have come after the grasp at wholeness that began the ecological age.

We might say (though Purdy doesn’t) that the postecological attitude stands to the ecological as postmodernism stands to modernism: a discourse that continues along the same lines, but without the belief in purity and coherence assumed by its predecessor.

How can we reconcile ourselves to a world where nature is everywhere contaminated? For inspiration, Purdy turns to a surprising source: Henry David Thoreau, who, he points out, has been claimed by Romantics in the past, but who ultimately eludes their constraints. Certain passages from Thoreau, he argues, can be profitably reread “with Anthropocene eyes”:

His Concord is full of the artifacts of old and new settlement, down to the soil itself, seeded with stone tools and potsherds that tinkle against the hoe as he works his bean-field. There is nothing pristine in this place, no basis for a fantasy of original and permanent nature. There is only a choice among relationships with and attitudes toward ever-changed places. These do not just accommodate the damage and ruptures of the landscape: they begin from and depend on them.

Part of what Purdy is up to here is the rejection of dichotomous thinking — natural versus unnatural, beautiful versus ugly — in favor of a vision of the world as impure, blemished, dynamic, alive. Reading After Nature, I was reminded of a gorgeous poem by Adam Zagajewski (translated by Clare Cavanagh) titled “Try to Praise the Mutilated World.” The themes of Zagajewski’s poem are not explicitly or exclusively environmental (The New Yorker ran it in its post-September 11 issue), but upon rereading, I was struck by how much of it resonates with Purdy’s evocation of a nature that is broken but tenacious:

Remember June’s long days,
and wild strawberries, drops of wine, the dew.
The nettles that methodically overgrow
the abandoned homesteads of exiles.
You must praise the mutilated world […]
You gathered acorns in the park in autumn
and leaves eddied over the earth’s scars.
Praise the mutilated world
and the gray feather a thrush lost,
and the gentle light that strays and vanishes
and returns.

To praise the mutilated world — to embrace, in some sense, nature’s desecration — is not defeatism or complacency. The difference is subtle, but Purdy’s preferred attitude toward the Anthropocene promotes equanimity about the inevitable wreckage — the potsherds, the scars — combined with an ability to savor and nurture the beauty and vitality that emerge from it. Purdy quotes another poet, Wallace Stevens, to illustrate the point: “The imperfect is our only paradise.”

In broad strokes, some of Purdy’s analysis will be familiar to readers who are well versed in environmental literature, but his uncommon erudition and richly supple language give the ideas fresh force. Still, a reader might wonder, what should we actually do about the immense challenges of the Anthropocene? Given Purdy’s status as a law professor and the presence of the word “politics” in the book’s subtitle, After Nature is surprisingly light on specifics. (One of his few concrete suggestions is to establish a right to know the conditions in which animals are raised for meat.)

Throughout After Nature, Purdy is more focused on intellectual frameworks than policy proposals. Yet the book is firmly anchored in political conviction. Purdy calls for recognition of the unavoidably political nature of the questions we face. It can be tempting to look for panaceas in market solutions such as carbon pricing or in technologies such as renewable energy or geoengineering. But he emphasizes that these remedies, and the details of their deployment, will always embody, and sometimes obscure, conflicts over values and interests. Inequality in the Anthropocene is not the product of luck or of fate but of human political decisions. Extrapolating from the economist Amartya Sen’s famous observation that famines don’t occur in democracies, Purdy writes that in the Anthropocene, “the world of scarcity and plenty, comfort and desperation, is not just where we live; it is also what we make.” In the past, political decisions may have determined who had adequate shelter from storms; now they may help determine the ferocity of the storms themselves. Accordingly, Purdy insists that we must make the Anthropocene universally democratic. “[I]f Anthropocene ecologies are a political question,” he writes, “then no one should be left out of the decisions that shape them.”

Purdy also critiques the dominant framing of climate change: the assumption that we can “solve” or “prevent” it and that failing to do so will result in apocalypse. He writes,

We should ask, of efforts to address climate change, not just whether they are likely to “succeed” at solving the problem, but whether they are promising experiments — workable approaches to valuing a world that we have everywhere changed, and to thinking about how we will change it next.

There will be many solutions, all of which will generate their own set of problems. We cannot hope to “save” the planet. Talk of “saving” or “solving” sets the standard unreachably high and misleadingly suggests a mission with a single objective — “Operation: Anthropocene,” we might call it — rather than ongoing ways of life that must persist, and adapt, for as long as we remain here.

¤

Purdy, while not necessarily an optimist, is impatient with doomsayers. “Unfortunately,” he writes in the prologue of After Nature,

talk of the Anthropocene has attracted such self-important pronouncements as “this civilization is already dead” … and “if we want to learn to live in the Anthropocene, we must first learn how to die.” This is just the sort of suggestive but, upon scrutiny, meaningless gesture that makes talk of “responsibility” feel self-important and ineffective.

The target of this surprisingly harsh judgment — out of character for the typically gracious Purdy — is a column that Roy Scranton wrote for The New York Times in November 2013, which he has since expanded into a book titled Learning to Die in the Anthropocene: Reflections on the End of a Civilization. Already, you can see why Scranton might have provoked Purdy’s ire: his talk of death and ends represents exactly the sort of apocalyptic overstatement that Purdy makes a habit of interrogating.

Nevertheless, Scranton’s book has its own kind of power. Purdy’s strength is his nuance; Scranton’s is his bluntness. There is something cathartic about his refusal to shy away from the full scope of our predicament. “Human civilization has thrived in what has been the most stable climate interval in 650,000 years,” he writes. “Thanks to carbon-fueled industrial civilization, that interval is over.” Learning to Die is a 144-page slip of a book, part jeremiad, part manifesto, rendered in taut, stern prose. Scranton’s brief chapters do not always proceed in an obviously logical sequence; each is a meditation on a different dimension of human life on this planet (“Human Ecologies,” “The Compulsion of Strife”), and the subject matter ranges from a fascinating exploration of early coal politics to a retelling of the epic of Gilgamesh.

Scranton’s peculiar authority stems in part from his biography, which does not fit the standard profile of a climate writer: he served in Iraq as a private in the US Army from 2003 to 2004. Common sense and social science both suggest that, in order to communicate effectively about the issue of climate change, the right messenger — someone who is perceived to share values with the audience — is critical. Yet we are much more likely to hear about the perils of climate change from journalists, scientists, or professors, all regarded by much of the general US public as members of a liberal elite. (Scranton’s current position as a Princeton graduate student may tarnish his populist credentials but can’t entirely erase them.) As Scranton himself reminds us, it’s hardly unprecedented for a member of the Armed Forces to sound the alarm about climate change: Admiral Samuel J. Locklear III, head of the US Pacific Command, has called global climate change “the greatest threat the United States faces.” But to hear a military veteran speaking so forcefully about environmental issues is refreshing. In Iraq, Scranton had personal experience of the hell that he sees spreading in the Anthropocene, as “water, power, traffic, markets, and security fell to anarchy and local rule.” He has witnessed the worst-case scenario, or something close to it, and that lends his voice credibility.

So, in a sense, does his fatalism. He is not asking us to do anything. He has no political agenda. He is pessimistic about grassroots activism and holds out little hope for international agreements. He doubts that renewable energy is even theoretically capable of replacing carbon-based energy. If he has an “ask,” it’s not for a carbon tax or a humbler lifestyle; it’s that we make an effort to save what we can of our cultural heritage, to salvage the hard-won wisdom of the dead, from the Greeks to the Buddha, from the Torah to the Federalist Papers. “The comparative study of human cultures across the world and through time helps us see that our particular way of doing things, right here, right now, is a contingent adaptation to particular circumstances,” he writes, “yet at the same time an adaptation built with universal human templates of meaning-making and symbolic reasoning, with tools and technologies we have inherited from the past.” In a sense, Scranton’s book (unlike Purdy’s) is not directly about nature at all. His concern is with humanity, and with the humanities. He wants us to rescue the fruits of our collective creativity from the blight of our collective destructiveness. The poet his book brings to mind is not Wallace Stevens but T.S. Eliot: “These fragments I have shored against my ruins …”

In his discussion of cultural heritage, Scranton refers in passing to the Greek concept of fate, and he himself seems to take a tragic view, in the classical sense, of climate change. In other words, our fate as a species follows inevitably from who we are; we are as doomed to wreck our planet as Oedipus was to kill his father. “The problem with our response to climate change isn’t a problem with passing the right laws or finding the right price for carbon or changing people’s minds or raising awareness,” he writes. “The problem is that the problem is too big … The problem is that the problem is us.”

¤

Although it’s easy to miss if you skim, there is a hidden spark of brightness in Scranton’s aggressively dour book. True, “[c]arbon-fueled capitalism is a zombie system,” but fortunately, this “is not the only way humans can organize their lives together.” Alternatives are imaginable, if improbable. Furthermore, it turns out that the end of civilization as we know it might not be such bad news. “Learning to die as a civilization,” he tells us, “means letting go of this particular way of life and its ideas of identity, freedom, success, and progress.” Scranton’s use of the word “civilization,” here and elsewhere, is somewhat ambiguous: it is not always clear whether he’s referring to the human culture he prizes, which stretches back thousands of years, or only to the current technocapitalist system. He confidently foresees the demise of the latter. He fears that it will take down the former with it, but holds out hope that it won’t.

This distinction between two kinds of civilization brings to mind a question at the heart of a longstanding debate about climate change and about the term “Anthropocene.” Is it humanity per se that has brought about these massive disruptions, or is it a very specific economic and political system, benefiting a very small subset of people, that is responsible? The first perspective is represented by Elizabeth Kolbert, who argues in The Sixth Extinction that Homo sapiens has always lived in unique disharmony with the environment. The most prominent spokesperson for the second viewpoint is Naomi Klein, whose This Changes Everything: Capitalism vs. the Climate makes the case that global capitalism, and not the human species itself, is the real culprit.

As it happens, both Kolbert and Klein provide blurbs for Scranton’s book, and while he seems to tilt more toward the Kolbertian view, he has moments, as in the passage quoted above, where he sounds more Kleinian. He does seem to believe it’s possible that by changing our economy, social structure, and means of production, we can successfully adapt to the Anthropocene — with the caveat that, given human nature and the damage already done, this is not very likely.

Purdy, too, falls between the Kolbertian and Kleinian poles. He focuses less on indictments, whether of the human species or the capitalist system, and more on the potential for choosing among a variety of futures. He acknowledges that both capitalism and, before its fall, communism inflicted dreadful ecological damage. But he believes that a robust democratic politics is the only way of making a flawed but habitable world together. Purdy puts more stock in human agency and in politics than Scranton does. But both authors agree on the need for major changes to the capitalist system, and to some extent their differences are more tonal than substantive.

In the end, whatever the precise apportionment of blame, it’s clear that humanity is capable of colossal destruction. The corollary, of course, is that we are capable of wisdom, kindness, art, and thought, too, as both these books amply demonstrate. The question is how the balance between our virtues and our vices will play out: whether the human story will turn out to be a tragedy, as Scranton suspects, or the less dramatic, potentially rewarding struggle that Purdy envisions. Either way, we must face the Anthropocene. One of our consolations is to have minds like these two to guide us through it.

A Safe Haven for Whom?

Posted on: November 3rd, 2015 by Rebecca Tuhus-Dubrow

On Christmas Eve, 2007, a blond woman in her late 30s arrived at St. Francis Hospital in Hartford, Connecticut, with a crying newborn in her arms. The woman had given birth alone at home and tied off the umbilical cord with a rubber band. Now she wanted to leave the baby, wrapped in a T-shirt and towel, at the hospital. After answering a few questions—no prenatal care, unclear paternity—the woman, in the words of a Hartford Courant article, vanished.

This striking scene, recounted in Laury Oaks’ new book, Giving Up Baby: Safe Haven Laws, Motherhood, and Reproductive Justice, was made possible by Connecticut’s safe haven law, versions of which were passed in all 50 states between 1999 and 2009. Enacted in response to a perceived crisis of infant abandonment in dumpsters and restrooms, the laws allow women to relinquish babies in designated locations while avoiding criminal charges. The details of the laws vary from state to state, but they share two characteristics: the intent to save babies from a dreadful fate and the guarantee of anonymity for the mother.

At first blush, it is easy to see why these laws passed so readily: They promised a benefit that no politician could decline to champion. According to Oaks, advocates also strategically ensured that the laws had no budgetary strings attached, in order to speed passage. They received an unusual amount of bipartisan support. But they also attracted an eclectic line-up of opponents. Conservatives feared the laws could promote promiscuity; men’s rights groups fretted about the rights of fathers; adoption advocates protested that the children would be denied access to their genetic histories.

Feminist critiques, meanwhile, could fill a book, and Oaks, a feminist studies professor at the University of California–Santa Barbara, has written it. From the outset, she aligns herself with the so-called reproductive justice movement. The term pro-choice, coined by white feminists, is bound tightly to the narrow question of abortion rights and, to some ears, suffers from connotations of consumerism, as though deciding whether to bear a child is no more serious than choosing a brand of nail polish. The more expansive concept of reproductive justice was first advanced in the 1990s by women of color, whose reproductive lives were as likely to be shaped by poverty and coercive sterilization as by lack of abortion access. Quoting another scholar, Oaks defines the philosophy as “the right to have an abortion, the right to have children, and the right to parent those children.” Using this framework, Oaks argues for reforming safe haven laws, on the grounds that they ignore the circumstances that lead to abandonment while reinforcing inequities and stereotypes about who is fit to be a mother.

But Giving Up Baby is not a work of social science so much as cultural criticism. Oaks’ chief method is analysis of the laws’ press coverage and publicity campaigns. She evidently conducted no interviews with advocates or with safe-haven moms. Instead, she offers close readings of newspaper articles, websites, and low-budget promotional videos. Sometimes this narrowly targeted approach leaves conspicuous gaps that cry out for more data or further investigation. To take the most glaring example, she provides few figures on the extent to which the laws have actually been used. But taken on its own terms—Oaks bills her project as a feminist analysis drawing upon “advocacy and media discourses”—the book is largely successful, consistently absorbing, and usually persuasive, at least to a reader who shares her basic political sympathies.

Oaks begins with some fascinating history, reviewing the ways that infant abandonment has always been enmeshed with sexual protocol and social norms. In 19th-century Italy, for example, owing largely to the stigma of illegitimacy, babies were discarded at epidemic levels, their lives jeopardized by wild dogs in cities and pigs in rural areas. During that century, Italy as well as other parts of Europe and the United States established asylums for these unfortunates, who were, with some poetic poignancy, referred to as foundlings. Mortality rates inside these institutions were so high that one anthropologist described them as “highly effective agents of infanticide.”

In modern America, where Oaks focuses her attention, abandonment seems to be rare. Though the numbers are not comprehensively tracked, Oaks cites one 2001 study, which estimated that 105 babies were left in public places in 1998. The same year, 3.94 million American babies were born.

But even a single case of abandonment is difficult to shrug off. The earliest safe haven laws grew from the advocacy efforts of a handful of passionate individuals who were haunted by stories of dead babies. In Yucaipa, California, in 1996, Debi Faris founded the organization Garden of Angels to provide funerals for abandoned babies. “This is what we’ve become as a society,” she told Time. “It is easy to throw our children away.” Everyone can agree that a discarded baby is a sign of grave social ills, but different worldviews emphasize starkly disparate maladies. For a progressive feminist, the problem lies at the intersection of unevenly distributed resources and a lack of access to contraception and abortion; conservative Christians find the root causes in sex outside of marriage and disrespect for innocent life.

Starting with Texas in 1999, safe haven laws swiftly developed into an irresistible trend among state legislatures. Each state stipulates a different window, ranging from three days to one year after birth, during which a baby can be surrendered at a safe haven site, such as a hospital, police department, or fire station. The baby must be unharmed and handed directly to a person. In some states the person receiving the baby is required to ask a few questions for the baby’s health records, though the mother is not required to answer. The mother is not offered any services or support. Oaks includes very little data—and, to be fair, reporting is spotty—but the National Safe Haven Alliance claims that more than 1,000 infants have been turned over to safe havens since their inception.

In essence, safe haven surrenders are a quick path to adoption, without the planning and paperwork, and without the “open” process that has become standard in the U.S. The convenience and anonymity are selling points: A woman without the foresight to plan an adoption, or who may be keeping her pregnancy a secret, might choose a safe haven instead of abandonment. But critics argue that there are good reasons for the red tape and safeguards of formal adoption: to promote thoughtful decision-making, and to provide adoptees with essential self-knowledge. Oaks quotes the radical adoptee advocacy group Bastard Nation, which describes the laws as “anti-adoptee, anti-adoption, anti-child, anti-woman, and anti-family.”

Of course, proponents believe that safe havens are a lesser evil than abandonment. The trouble is that we don’t know who is using the laws or why. Women who “dump” their babies are typically panicked and may suffer from mental-health problems. It is a considerable stretch to assume that these women would both be aware of safe havens and have the presence of mind to use them. Instead, it’s possible that women who resort to safe havens would have otherwise kept the child, opted for traditional adoption, or even had an abortion.

Though safe haven laws have no official connection to abortion, Oaks contends that the rhetoric surrounding them muddles the distinction between the fetus and the baby, and privileges both over the experience of the mother. Scrutinizing various publicity materials, Oaks finds that advocates present safe havens as the only acceptable outcome for an unplanned pregnancy. The propaganda not only skirts the possibility of abortion; it also places no confidence in women (usually portrayed as young and single) who may want to raise their own children despite difficult circumstances. These advocacy efforts, Oaks argues, send an ironic message about maternal love: “Paradoxically, safe haven laws and advocacy urge women to anonymously and permanently relinquish their motherhood status as a way to demonstrate maternal responsibility.”

Oaks bristles at the way the laws treat women. Safe haven babies are prized on the adoption market, in part because healthy American newborns are scarce, and in part, Oaks suggests, because some adoptive parents prefer to have no connection to or knowledge of the birth mother. Thus, safe haven moms, in her view, are treated as vessels for the precious infants, not as people in their own right with their own needs. Oaks denounces a system that assigns the mother “a positive social value only upon relinquishment of a healthy newborn, fail[s] to ensure that she has mental health and postpartum care, and leave[s] her with a stigmatized, secret identity.”

Safe haven moms, by definition, are anonymous, so it’s rare to hear their stories. Oaks did turn up a few online posts purportedly authored by such women. As “Melissa” tells it on the Baby Love Child blog, she gave birth alone at home at age 16 and called an adoption agency, which referred her to a hospital’s safe haven site. She relinquished her son without fully understanding the law. “This law is full of holes,” she wrote. “I have custody of my son now … but the battle of getting my newborn back was horrible.”

Oaks skillfully navigates the complex web of issues, from class politics to notions of maternal love, that intersect with safe haven laws. But her analytical vigor is at times misplaced: She spends pages dissecting the details of relatively obscure publicity materials, occasionally divining overwrought connections. She might have profitably devoted less energy to interpreting YouTube videos and more to grounding her analysis in empirical research. Another sort of author might have hunted for more data at the state level about trends in safe haven use and abandonment, or tried to track down moms to interview.

In any case, she does not promote repealing safe haven laws, but proposes commonsense, modest supplementation. She advocates greater support for reproductive health services “within and beyond safe haven laws,” such as offering the women who relinquish babies postpartum care, mental-health services, and referrals to other resources.

And yet even an ideal re-design of safe haven laws would be a somewhat hollow achievement, given that the policy directly affects so few people. Indeed, just as abandoned babies reveal a much larger constellation of underlying issues, safe haven laws highlight the inadequacies of our response. Reforming them could be a starting point for a more humane approach, but it would still leave many more crucial hurdles to reproductive justice unaddressed. In Texas, where Governor George W. Bush signed the first safe haven bill, state legislators recently passed a law that (if upheld in court) could result in the closure of about half of the state’s abortion clinics.

 

‘Leaving Orbit’ cheers spaceflight’s feats, mourns its fading

Posted on: November 3rd, 2015 by Rebecca Tuhus-Dubrow

In 1969, the moon landing brought together a nation, millions slack-jawed in the glow of their televisions. In 1986, the explosion of the Challenger became a defining moment for a generation of schoolchildren. But how many of us remember the blastoff of Atlantis in 2011 — the final flight of NASA’s manned space shuttle program?

Margaret Lazarus Dean, an English professor at the University of Tennessee, remembers it vividly. In her new book “Leaving Orbit,” she traces the trajectory from the first event to the third — from electrifying ambition to widespread indifference. By the time Atlantis took off, few Americans even noticed that the program was ending.

Dean wrote a 2008 novel, “The Time It Takes to Fall,” centered on the Challenger, but after it was published, her interest in spaceflight grew more consuming. Pondering the end of the space shuttle program, she was gripped by a compulsion to attend the last three launches in Cape Canaveral, Fla. The story of the dawn of the Space Age had been told by some of the giants of narrative nonfiction: Tom Wolfe, Norman Mailer, Oriana Fallaci. Dean — betraying her own grand ambition — wanted to be the one to document its twilight.

“No one has yet tried to grapple with the end that is now in sight,” she writes. “Only when something ends can we understand what it has meant.”

The result is an exuberant, wistful account of the author’s repeated schleps from her home in Tennessee to swampy Florida, an account interspersed with the history of American spaceflight and quotes from its great chroniclers. As for what it has meant, Dean never quite arrives at a sweeping conclusion, but the book is peppered with insights that collectively provide a kind of impressionistic answer.

Consider, for example, this striking image from the launch of Endeavour: “the crowd lifts their voices, sounding a little incredulous, then a little frantic, as though it is their enthusiasm and nothing else keeping Endeavour on its straight path.” Often, Dean finds that our attitudes toward spaceflight embody feelings about other big themes: risk, the future, government. Noting the receptiveness of her students to moon-hoax conspiracy theories, she perceives in these youngsters “an assumption that a government agency can’t have accomplished something so awesome.”

Dean reminds us that spaceflight was a key element of the 1960s’ seemingly limitless optimism and novelty. She summons that time, known as the “heroic era” in space parlance, then ushers us to the more humdrum “shuttle era,” in which the vehicles remained within “Earth orbit.” The heroic era had been spurred by fear of Soviet supremacy, and when the Cold War receded, the ambitions of the program fell prey to competing concerns about funding and safety — especially after the explosion of the Challenger and the less-noticed disintegration of Columbia in 2003. Now, the future is uncertain; NASA still dreams of manned missions to Mars, but Dean is pessimistic that the money will materialize.

Writing about space tends to either glorify the taciturn daring of a few or muse on humanity’s insignificance in the context of the vast universe. Dean’s book, by contrast, celebrates human ingenuity and commitment but dwells less on the astronauts than on NASA’s thousands of anonymous employees who take pride in their minuscule but essential contributions to a majestic project. In fact, the main character, other than the author, is her friend Omar, an “orbiter integrity clerk” charged with guarding the vehicle Discovery. Dean’s only conversation with a shuttle-era astronaut is a phone interview with a woman who has never traveled to space. While this might represent in part a lapse in reporting, it comes across more as a choice to emphasize the less sexy, typically ignored members of an immense beehive-like operation.

Dean’s space obsession, one we tend to consider the province of 8-year-old boys, is an asset, enabling her to convey the emotional allure of the program. But at times her deep immersion in the subject makes the book feel slightly blinkered. She throws around jargon such as “demating” and “payload” without explanations. She expresses anger about the death of the program but advances no cogent case for devoting massive funding to extraterrestrial excursions rather than to pressing needs on this planet. (In one of her finer moments, she writes, “I feel the built-in pointlessness at the heart of Apollo as much as I fiercely admire it — it’s the same pointlessness shared by any artistic gesture.”) Nor does she explore the intriguing question of whether the aspects of space travel she finds most moving — the sense of purpose, the sheer scale of the accomplishment, the unifying effect — need involve space travel at all.

By invoking what might be called the heroic era of nonfiction — also known as new journalism — Dean at once bravely invites comparison to those masters and implicitly casts herself as a humbler, shuttle-era version of Mailer and Fallaci. At her best she is a worthy successor in their common undertaking. “We all feel Norman’s masculine envy at being left behind, but our envy is beside the point,” she writes. “We know that someone needs to stay behind and write about what it feels like to watch it from the ground.”

Tube-Tied

Posted on: November 3rd, 2015 by Rebecca Tuhus-Dubrow

Not long ago, I arrived at an Amtrak station near my home in Southern California and was struck by its pleasant atmosphere: high ceilings, a spare design, a clean and airy waiting room with light streaming in the windows. But as I settled in to wait for my train, I saw it: an enormous television looming on the back wall. Pundits blathered above a ticker of the latest headlines. And a sad trombone sound went off in my head.

The unwelcome TV was not, of course, an anomaly. It was but one manifestation of what I’ve come to think of as TV pollution. In restaurants, airports, office lobbies, they wait for us: televisions, big or small, one or many, playing CNN or The Bachelorette, luring our eyes and in some cases droning into our ears. Indeed, TVs are so omnipresent that—I’m told—many people hardly notice them. I am not so lucky. I find them equal parts seductive and annoying, and based on extensive anecdotal evidence, I know I’m not alone.

We often hear that we are in a golden age of television. Curled up on the couch, we can select from a bounty of first-rate programming; not only that, we now have control over when we watch shows and more power to avoid ads. Outside our homes, however, we encounter the opposite: We’re assailed by commercials and by shows not of our choosing, which tend to be not exactly Mad Men-caliber. Our only choice—at best—is to walk out the door. In places like the airport, the only refuge may be the bathroom.

There’s a dearth of research on longitudinal trends, but TV pollution seems to me to have grown more pervasive in recent years. In 2009, Nielsen reported that “out-of-home” exposure provided a 2.6 percent ratings “lift” for shows; another Nielsen study, from 2014, found a 7 percent to 9 percent lift. I asked a few people who think about TV for a living, and they agreed that public TVs have multiplied. “I know because my own annoyance at them has increased,” said Joy Fuqua, a professor of media studies at Queens College. “And I love television.”

Granted, this nuisance doesn’t rank among today’s more serious woes. But it’s not a frivolous First World problem, either—it’s a question of the way our society treats public space. And like the secondhand smoke to which it’s been compared, TV pollution may have insidious effects we don’t appreciate.

Television is full of stuff—violence, alarming news, Botox overdoses—we may not always be in the mood to see. We may be still less eager for our kids to see it. Over the years abundant research has examined the effects of television viewing, and the studies tend to arrive at results we could have guessed without the benefit of Ph.D.s and labs. Watching lots of violence on TV, it turns out, is associated with aggressive behavior. Kids who see lots of ads for junk food? More likely to eat junk food. One recent survey on Americans’ fears found that “high frequency of television viewing” was one of the most consistent predictors of fear. Television is not benign; as with sun and smoke, we should have some say over our exposure to it.

Encounters with TVs in shared spaces, especially ones showing news shows, usually make me feel agitated and anxious. But what troubles me even more is the assumption underlying them: that we need to be amused or informed by media at every moment. My dinner companions and I may not be the most scintillating conversationalists, but I hope we can keep each other entertained while we share a pizza.

Though it may seem very much a product of our time, television in public places has a long lineage. (The definition of public is squishy—a privately owned café is not the same as a city park—but I’m referring to what the industry calls “out-of-home.”) In fact, as New York University cinema studies professor Anna McCarthy writes in her 2001 book Ambient Television, public TV actually preceded the rise of the household set. In the mid-1940s, taverns began to acquire them. They would advertise this new attraction with neon or cardboard signs; inside, men were asked to remove their hats so as not to obstruct the view. Big boxing matches would draw scores of customers.

Even at the time, the sets provoked hand-wringing; people fretted that they were ruining the lively tavern banter. But at least they made sense: They were clearly providing a desirable service. Their modern-day successors—sports bars, or bars that offer viewings of Game of Thrones or the State of the Union—are doing the same. In fact, today’s incarnations may be more social: whereas the postwar barflies came expressly for the TV, today’s customers come primarily for the communal experience. TVs in these bars don’t count as TV pollution.

The problem is that TVs have spread well beyond such convivial settings. The TVs that rankle me—McCarthy’s use of ambient is apt—are the ones that nobody seeks out and nobody can avoid. They’re in places you go for other reasons—to eat lunch, to see the dentist, to do laundry. These TVs don’t encourage social interaction; they elicit mute stares or annoyed sighs. They violate current notions of what makes great public spaces: places that foster community and embody local character. TV is an equalizer, diluting any local uniqueness with the same talking heads and football games and ads you’d see anywhere else.

So why are the TVs so common? Until recently I naively assumed that they were all misguided attempts at an amenity. In some cases, I’ve learned, the motive is more rational, if also more sinister. Think the inevitable news at the airport is intended to helpfully stave off boredom while you wait? Not so much. You’re probably watching the CNN Airport Network, which launched in the early 1990s. Having a captive audience of about 250 million travelers per year reportedly brings CNN more than $10 million annually, and airports share in the ad revenue. The airport isn’t doing you a favor; you’re doing the airport one.

In other cases, proprietors simply believe their customers want TVs. Perhaps their very ubiquity has made them seem like a necessity. But as they’ve proliferated, they’ve also become more redundant. Because most of us now carry a personalized entertainment system in our pocket, we have less appetite than ever for a random reality show playing in the corner; we’re too busy tweeting or playing Peggle. Indeed, TV pollution has a strange relationship to our cultural moment. On the one hand, it fits seamlessly into an era defined by screens and overstimulation. On the other, in an age of hyperpersonalized media consumption, it seems oddly retrograde.

The most compelling defense I’ve heard of public TVs is that they’re one of the last common threads connecting an increasingly fragmented country. They’re the only way we have any clue what other Americans are watching these days. Soon we’ll probably all be wearing holographic glasses, and maybe I’ll be nostalgic for the days when I at least shared an experience—however vexing—with the people around me.

Still, there must be better ways to use public space. Among the friends, acquaintances, and strangers I’ve surveyed, the most common response to this supposed perk is vehement irritation. Entrepreneur and hacker Mitch Altman hates the phenomenon so much that he invented TV-B-Gone, a “universal remote control” capable of turning off most televisions.

It’s certainly possible that I’m overestimating the extent of TV aversion. Plenty of Americans keep the TV on at home, and I know some feel more receptive to its public incarnation than I do. But Altman has talked to many people about this topic over the years, and though he’s obviously not unbiased, he told me the annoyance cuts across subcultures and viewing habits. After all, when you’re at a restaurant, you have no remote-control privileges. Is it really self-evident that the inclination to watch TV should trump the prerogative to be free of it?

While in my own utopia I would abolish most TVs outside the home, I now offer a humbler plea. Business owners, don’t assume that your customers want a television or eight. Know that you may in fact be losing business. I have walked out of many sandwich joints after catching a glimpse of a TV. My friend’s barber showed violent movies, including one in which a throat was slit while the barber was shaving my friend’s neck. My friend switched barbers.

And my message to my fellow TV haters: Don’t be shy; don’t just grumble under your breath. Ask politely if it’s possible to turn the TV off or down. Mention courteously that you’d rather not keep up with the Kardashians right now.

I have taken this approach, and I admit it hasn’t always yielded the desired result. The manager at that Amtrak station finds me as exasperating as I find the TV. But if more of us say something, I hope he’ll realize that I’m not a curmudgeonly freak—or at least that I’m not the only curmudgeonly freak.

For those without the patience to win hearts and minds, well, there’s now a TV-B-Gone-inspired app, whose commands include mute, volume adjustment—and, of course, off.

Utopian Visions on a Wraparound Screen

Posted on: June 11th, 2015 by Rebecca Tuhus-Dubrow

At the 2010 Shanghai World Expo, audience members strapped themselves into moving seats to watch a 360-degree screen. The scene on display was a quixotic vision of the city in 2030. “Congestion, pollution, accidents have all been eliminated,” writes Anna Greenspan in the preface to her new book Shanghai Future. “Instead, future citizens float through a vast sci-fi cityscape of spectacular skyscrapers and highway overpasses in intelligent, battery-powered pods. […] Outside the view is of green parks, blue skies, clean beaches and giant wind farms.”

The Expo, modeled on the American and European World’s Fairs of the 1850s through the 1930s, generated frenzied excitement in China and cost billions of yuan. In fact, it was the most expensive and most widely attended planned event in human history, and it was intended, Greenspan writes, to establish Shanghai as the great metropolis of the 21st century. The irony was that to the few Western observers who bothered to notice, the event instead signaled China’s status as a latecomer to industrial modernity. Not only do we in the West now see World’s Fairs as objects of kitschy nostalgia; “it is the spirit of futurism itself,” Greenspan notes, “that seems so remarkably out of date.”

Greenspan, a Canadian philosopher who has been based in Shanghai for most of this century, begins her book with a compelling examination of this episode. She is right to note that we in the West have lost our optimistic futurism. If we think of the future at all, we probably imagine a world like our own, but more so: more disasters, more spectacular violence; more technological innovation, certainly, but of the kind that seems to estrange us from each other and from nature. Is the giddy optimism of our past — the optimism of the GM Futurama exhibit at the 1939 New York World’s Fair — alive in the city once known as the “Whore of the Orient”? If so, is this because China hasn’t yet caught up to our cynicism? Do we know something they don’t? Or is it the other way around?

Greenspan argues that China is not simply repeating the trajectory of the West, eighty years later. Instead, she believes something more interesting is going on, and she ties it to a distinctive relationship with time. The traditional Chinese conception of time is less linear than the West’s, she writes, resembling a “spiral” more than a line. Shanghai’s peculiar history also plays a role. The city enjoyed a florescence of modernity in the 1920s and 1930s, a heyday ended abruptly by the Japanese invasion and then by the Maoist Revolution. It then went through a long period of neglect, until the 1990s, when it returned to a kind of modernity on steroids. (Greenspan says little about Shanghai’s 19th-century division into a patchwork of Western concessions, in which foreigners enjoyed immunity from Chinese law, to China’s enduring humiliation — a period that seems relevant, as it points to profound Western influence on the city from the start of its modern era. This period, which has been the subject of a great deal of scholarship, gets considerable attention in A History of Future Cities, a 2013 Norton book by Daniel Brook that also covers modernity and urban transformation.)

When Shanghai returned to the international stage late in the 20th century, Greenspan writes, it “embraced a futurism that the city felt had passed it by.” The city is steeped in nostalgia for a lost “golden age,” an attitude that sometimes conflates building the future with reviving the past. “China’s great city is not only influencing what will take place in the future,” Greenspan concludes, “it is also transforming the very idea of the future itself.”

Greenspan then turns to other aspects of 21st-century Shanghai. She explores the relationship between the city’s hypermodern facade and the hidden “micro-commercial” activity happening in the shadows of the glassy skyscrapers. She examines the efforts underway to foster a more creative culture — from “made in China” to “created in China” — and the ways the Communist government both encourages and thwarts such efforts. She visits suburbs such as “Thames Town” that attempt to clone European villages. The book weaves together the author’s own observations of Shanghai with discussions of urbanist figures such as Le Corbusier, Haussmann, and Robert Moses, as well as quotes from contemporary theorists.

Shanghai Future is strongest when integrating the more theoretical insights with colorful details that bring the city to life. Greenspan describes the tacit accommodations between the chengguan (city inspectors) and street vendors. When the inspectors are on duty, the vendors disappear. “Their ‘face’ preserved and their duty done, the inspectors happily go off to eat or rest. As soon as they leave, the vendors return to their spots.” Thames Town features British phone booths and statues of Florence Nightingale and Harry Potter; busloads of brides arrive in white dresses with their grooms to be photographed against this backdrop. Far from inauthentic, the flair for inspired mimicry is, Greenspan argues, an element of Chinese culture in its own right. The book is an illuminating primer on Shanghai.

That said, the book loses momentum in places. In part, this is because the intriguing points Greenspan raises at the outset — about the future and temporality — are never satisfyingly elaborated or resolved. How does this sense of time play out in everyday lived experience in the city? How will Shanghai shape the future, or the idea of the future? More Chinese voices might have helped answer these questions; there is scarcely a quote from an ordinary Shanghai citizen.

The book is also symptomatic of a larger issue: it’s surprisingly difficult to make writing about cities engaging. (I say this as someone who writes about cities, and I fully implicate myself.) The reason, I think, is that such writing tends to be analytical rather than narrative, and to be about systems rather than characters. The best way to get readers to turn pages is to make them wonder: what happens next? Especially, what happens next to this person who has captured my interest? (It’s not an accident that one of the most popular books about urbanism of all time is a biography: Robert Caro’s life of Robert Moses.) Of course, plenty of other writing also lacks narrative and characters: policy analysis, biology textbooks. But the difference is that cities are inherently interesting and accessible. We know cities and love cities, and they are full of people and stories. So the disjunction can be jarring. Greenspan is by no means uniquely guilty, then, but nor is she immune from this problem.

And recent years have brought an explosion of writing about cities, much of it valuable even if it isn’t compulsively readable. Which brings me back to the question of attitudes toward the future. While we muddle through our “end-of-history torpor,” as New York Times columnist Ross Douthat recently dubbed our malaise, a growing consensus seems to hold that if there is any hope for us, it lies in cities. An upbeat urbanism is flourishing in the US.

The movement is based on several premises: our national government is broken, and we must turn to local government for problem-solving; since more and more people are urban dwellers, cities must be the locus for addressing major issues, notably climate change; and cities, properly designed and planned, have the potential to offer the most sustainable way of living. Finally, there are murmurs that, notwithstanding the enormous investments needed to transform our own cities, the directions taken by the megalopolises of Asia and the Global South will prove decisive in shaping the future of the planet.

With all this in mind, the world of 2030 depicted at the Shanghai Expo — the blue skies, the giant wind farms — is a vital ideal, very much in tune with the goals of American urbanism. The difference is that in the US we are more likely to see blog posts on individual small-bore innovations than a utopian vision on a wraparound screen. But there’s something to be said for that wraparound screen — or at least for the big-picture thinking it represents.