
Utopian Visions on a Wraparound Screen

Posted on: June 11th, 2015 by Rebecca Tuhus-Dubrow

AT THE 2010 Shanghai World Expo, audience members strapped themselves into moving seats to watch a 360-degree screen. The scene on display was a quixotic vision of the city in 2030. “Congestion, pollution, accidents have all been eliminated,” writes Anna Greenspan in the preface to her new book Shanghai Future. “Instead, future citizens float through a vast sci-fi cityscape of spectacular skyscrapers and highway overpasses in intelligent, battery-powered pods. […] Outside the view is of green parks, blue skies, clean beaches and giant wind farms.”

The Expo, modeled on the American and European World Fairs of the 1850s through the 1930s, generated frenzied excitement in China and cost billions of yuan. In fact, it was the most expensive and widely attended planned event in human history, and it was intended, Greenspan writes, to establish Shanghai as the great metropolis of the 21st century. The irony was that to the few Western observers who bothered to notice, the event instead signaled China’s latecomer status to industrial modernity. Not only do we in the West now see World Fairs as objects of kitschy nostalgia; “it is the spirit of futurism itself,” Greenspan notes, “that seems so remarkably out of date.”

Greenspan, a Canadian philosopher who has been based in Shanghai for most of this century, begins her book with a compelling examination of this episode. She is right to note that we in the West have lost our optimistic futurism. If we think of the future at all, we probably imagine a world like our own, but more so: more disasters, more spectacular violence; more technological innovation, certainly, but of the kind that seems to estrange us from each other and from nature. Is the giddy optimism of our past — of the GM Futurama exhibit at the 1939 New York World’s Fair — alive in the city once known as the “Whore of the Orient”? If so, is this because China hasn’t caught up yet to our cynicism? Do we know something they don’t? Or is it the other way around?

Greenspan argues that China is not simply repeating the trajectory of the West, eighty years later. Instead, she believes something more interesting is going on, and she ties it to a distinctive relationship with time. The traditional Chinese conception of time is less linear than the West’s, she writes, resembling a “spiral” more than a line. Shanghai’s peculiar history also plays a role. The city’s modernity flowered in the 1920s and 1930s, but this heyday was ended abruptly by the Japanese invasion, followed by the Maoist Revolution. The city then went through a long period of neglect, until the 1990s, when it returned to a kind of modernity on steroids. (Greenspan says little about Shanghai’s 19th-century division into a patchwork of Western concessions, in which foreigners enjoyed immunity from Chinese law, to China’s enduring humiliation — a period that seems relevant as it points to profound Western influence on the city from the start of its modern era. This period, which has been the subject of a great deal of scholarship, gets considerable attention in A History of Future Cities, a 2013 Norton book by Daniel Brook that also covers modernity and urban transformation.)

When Shanghai returned to the international stage late in the 20th century, Greenspan writes, it “embraced a futurism that the city felt had passed it by.” The city is steeped in nostalgia for a lost “golden age,” an attitude that sometimes conflates building the future with reviving the past. “China’s great city is not only influencing what will take place in the future,” Greenspan concludes, “it is also transforming the very idea of the future itself.”

Greenspan then turns to other aspects of 21st-century Shanghai. She explores the relationship between the city’s hypermodern facade and the hidden “micro-commercial” activity happening in the shadows of the glassy skyscrapers. We take a look at the efforts underway to foster a more creative culture — from “made in China” to “created in China” — and the ways the Communist government both fosters and thwarts such efforts. We visit suburbs such as “Thames Town” that attempt to clone European villages. The book weaves together the author’s own observations of Shanghai with discussions of urbanist figures such as Le Corbusier, Haussmann, and Robert Moses, as well as quotes from contemporary theorists.

Shanghai Future is strongest when integrating the more theoretical insights with colorful details that bring the city to life. Greenspan describes the tacit accommodations between the chengguan (city inspectors) and street vendors. When the inspectors are on duty, the vendors disappear. “Their ‘face’ preserved and their duty done, the inspectors happily go off to eat or rest. As soon as they leave, the vendors return to their spots.” Thames Town features British phone booths and statues of Florence Nightingale and Harry Potter; busloads of brides arrive in white dresses with their grooms to be photographed against this backdrop. Far from inauthentic, the flair for inspired mimicry is, Greenspan argues, an element of Chinese culture in its own right. The book is an illuminating primer on Shanghai.

That said, the book loses momentum in places. This is partly because the intriguing points Greenspan raises at the outset — about the future and temporality — are not satisfyingly elaborated or resolved. How does this sense of time play out in everyday lived experience in the city? How will Shanghai shape the future or the idea of the future? More Chinese voices might have helped answer these questions. There is scarcely a quote from an ordinary Shanghai citizen.

The book is also symptomatic of a larger issue: it’s surprisingly difficult to make writing about cities engaging. (I say this as someone who writes about cities, and I fully implicate myself.) The reason, I think, is that such writing tends to be analytical rather than narrative, and to be about systems rather than characters. The best way to get readers to turn pages is to make them wonder: what happens next? Especially, what happens next to this person who has captured my interest? (It’s not an accident that one of the most popular books about urbanism of all time is a biography: Robert Caro’s life of Robert Moses.) Of course, plenty of other writing also lacks narrative and characters: policy analysis, biology textbooks. But the difference is that cities are inherently interesting and accessible. We know cities and love cities, and they are full of people and stories. So the disjunction can be jarring. Greenspan is by no means uniquely guilty, then, but nor is she immune from this problem.

And recent years have brought an explosion of writing about cities, much of it valuable even if it isn’t compulsively readable. Which brings me back to the question of attitudes toward the future. While we muddle through our “end-of-history torpor,” as New York Times columnist Ross Douthat recently dubbed our malaise, a growing consensus seems to hold that if there is any hope for us, it lies in cities. An upbeat urbanism is flourishing in the US.

The movement is based on several premises: our national government is broken, and we must turn to local government for problem-solving; since more and more people are urban-dwellers, cities must be the locus of addressing major issues, notably climate change; and cities, properly designed and planned, have the potential to offer the most sustainable way of living. Finally, there are murmurs that notwithstanding the enormous investments needed to transform our own cities, the directions taken by the megalopolises of Asia and the Global South will prove decisive in shaping the future of the planet.

With all this in mind, the world of 2030 depicted at the Shanghai Expo — the blue skies, the giant wind farms — is a vital ideal, very much in tune with the goals of American urbanism. The difference is that in the US we are more likely to see blog posts on individual small-bore innovations than a utopian vision on a wraparound screen. But there’s something to be said for that wraparound screen — or at least for the big-picture thinking it represents.

Speed Kills

Posted on: April 26th, 2015 by Rebecca Tuhus-Dubrow

Not long ago, while crashing with my parents for a few days, I had the opportunity to sift through a wicker box stuffed with memorabilia from my youth: cards, letters, notes scribbled furtively in class. I’m not that old—the historical period in question was the 1990s—but the exercise felt like stepping back into an ancient era. There were letters from old flames, grandparents and, disproportionately, my childhood best friend; creased paper and smudged ink, the occasional drawing, inside jokes that still made me laugh. An erstwhile sentimentalist, I had saved paper fragments with the scrawled names of people I could no longer place (who was Sharon?) and their phone numbers (most of which, quaintly, had no area codes). For several evenings, until I’d touched and pored over every last scrap, the box seduced me away from my usual nighttime dabbling in work and diversion.

Throughout the hours that I spent with this paper archive, my experience of time seemed to change. I was fully absorbed in each old document I encountered. I was also reminded of a period of my life when I had a very different relationship with time: I would kill entire afternoons scribbling bad poems in my journal, or napping to the tunes of obsessively curated mix tapes.

Now, I can nod along to the refrains about how quickly time passes, how busy life is. What changed? Was it just that then I was a kid, and now I have a kid? Time no longer seems unlimited: then I had possibilities; now I have responsibilities. Is it because I was blessed (or so it seems to me) to be in the last generation to grow up without the intrusions and distractions of the Internet and cellphones? Or maybe I’m just misremembering it all through a haze of nostalgia. (It may even be possible that the ’90s were not objectively the greatest decade for music and culture in my lifetime, but merely the time when I happened to be 16. Unlikely, but theoretically possible.)

On a larger social scale, we have been observing similar patterns and asking similar questions for some time. Why are we so busy, and why do we have so little time? Is it because our gadgets and gizmos have accelerated the pace of life? Much as I have had to assume the obligations of adulthood, do we all shoulder greater burdens now than we did before hyper-globalization and the shriveling of the welfare state? Or are we just engaging in collective nostalgia for a simpler time that never really existed and that, in fact, we don’t even really want to inhabit?

* * *

These are some of the questions Judy Wajcman takes up in her bracing, if not altogether convincing, new book, Pressed for Time. The title is somewhat misleading, for Wajcman’s argument complicates the prevailing idea that we have less time than ever and that technology is to blame. She is impatient with unsupported assumptions about how people spend time. She points out that people’s experience of time is heterogeneous, and that those who dominate public discussion of the issue tend to generalize based on their own experience. (The underemployed, for instance, may not be so busy; the notion of acceleration may not resonate for care workers, because the pace of care is inherently slow.) She also points out that theorists on the subject have seldom drawn on empirical research regarding “time practices.”

In her slender but dense book, Wajcman aims to remedy all of these failings. She tries to talk about time in more precise and rigorous language, some of which is jargony but often useful nonetheless. For instance, we all have twenty-four hours in a day; what we really want is not more time but “temporal sovereignty,” the ability to choose how we spend our time. Shunning abstract formulations, she aims to take a closer look at the texture of our experience of time.

Wajcman, a native of Australia and a sociology professor at the London School of Economics, is an important figure in the field known as science and technology studies (STS). Its presiding conviction is that “all technologies are inherently social in that they are designed, produced, used and governed by people.” Much of this book summarizes, and sometimes critiques, the insights from her field. Though Wajcman frowns upon muddled thinking and unempirical claims, her bête noire is what she calls “techno-determinism”: the notion that technology operates as an independent force influencing society from the outside. Instead, she sees a “dialectical process of promise, resistance, improvisation, and accommodation.”

In addition to her suspicion of oversimplification, Wajcman delights in paradox and complexity. She points out that for all the focus on hypermobility, human bodies are increasingly stationary, sitting in front of screens and steering wheels. Similarly, “increase in speed increases the potential for gridlock.” Cars promise liberation and exhilaration, but the more cars there are on the road, the less they can fulfill that promise: “The irony is that a horse and buggy could cross downtown Los Angeles or London almost as fast in 1900 as an automobile can make this trip at 5 p.m. today.”

More such examples pepper the book. Her points aren’t jaw-droppingly counterintuitive; rather, they tend to crystallize nuances or contradictions of which we are vaguely aware from our own daily life but that rarely get articulated. Many of these observations come from her colleagues, whom she cites. The ideas might be familiar to specialists in STS, but fresher to lay readers.

* * *

Wajcman approaches her subject rather like a mystery: Why do we sense that life has gotten faster and that we have less time? In fact, several surveys indicate that working hours have not increased for most people and that we have more leisure time than before. And yet, survey data also support the notion that our subjective experience of time is more harried than in the past. From 1965 to 2005, the percentage of Americans reporting that they always feel rushed climbed from 25 percent to 35 percent, and nearly half now say they almost never have enough time on their hands.

What’s going on? Wajcman identifies several intertwining changes that affect not necessarily the quantity but rather the quality of available time, making it feel more disorganized and fragmented. One factor is the dissolution of standard schedules. The 9-to-5 office or factory job, as the most obvious example, has given way to irregular hours, extended hours, working from home and so on. As Wajcman puts it, “collective social practices, derived from institutionally stable temporal rhythms, have been eroded.” This shift makes it challenging “to mesh work schedules with the social activities of friends and family…. More negotiations, more decisions, and more effort are required to perform the necessities of daily life.”

At the same time, we have the rise of the dual-earner household. Dad isn’t coming home at 5:30 anymore, and in any case Mom wouldn’t be there waiting to serve him a cocktail. (This may never have been true for most families; now the image is more dusty relic than sitcom cliché.) Both partners are likely to have schedules that vary from the old standard in different ways. We’ve also seen a concurrent change in culture for children and adolescents, although Wajcman doesn’t delve into the well-worn territory of the “over-scheduled child.” Thus, coordinating activities becomes an activity in its own right. This is to say nothing of the rise in single-parent families, which, for obvious reasons, suffer from their own distinctive time pressures.

And then there’s technology, which Wajcman acknowledges has wrought changes. But she insists they are not as simple as we sometimes claim. When examining the effect of technology on the pace of life, she finds another paradox, another mystery. “But hang on a moment. Weren’t modern machines supposed to save, and thereby free up, more time?” It is by no means inevitable that technological advances should accelerate the pace of life; on the contrary, once you think about it, the opposite result seems more intuitive.

And yet the connection is not necessarily off the mark. What happens, of course, is that the introduction of technology into our lives changes our expectations. Consider the washing machine. Wajcman begins one chapter with a provocative epigraph, a quote from economist Ha-Joon Chang: “The washing machine has changed the world more than the Internet has.” This assertion appeals to Wajcman, because it suggests the importance of an ordinary household artifact and of housework, both of which are typically overlooked. Still, she also questions how much time the washing machine and comparable appliances really save. Instead, she finds, the introduction of the washing machine changed the culture so that people had higher standards for cleanliness. At the same time, household labor (for women) began to be valorized. “In other words, it may be that appliances are being used to increase output rather than to reduce the time spent on housework.”

Similarly, with the advent of e-mail, texting and social media, our expectations of response time change. Because it is possible to communicate instantaneously, we expect immediate responses. Instead of (or in addition to) using the devices to save time, we use them to communicate more—to increase our “output” of communication.

Wajcman finds some truth in the charge that technology is playing a role in the felt acceleration of life. But her analysis of the empirical research on time use leads her to challenge other assumptions often taken for granted. We fear that constant connectivity means the encroachment of work into the home and family life. While this issue certainly exists, Wajcman finds that cellphones are primarily social tools, and they allow people to communicate with friends and family during work hours. In part, this communication enables people to make plans and coordinate schedules, thereby saving time: “by allowing some of the concerns of family and personal life to be handled during the working day, they might even be deployed to reduce time pressure.”

She agrees with the popular perception that boundaries between work and home are blurring. But she doesn’t exactly lament this development. Indeed, she points out that the division between work and home, public and private, was a creation of industrialization, which perhaps reached its apex in 1950s American suburbia and is not necessarily an ideal to be preserved.

Likewise, she discusses the dissolution of other boundaries in her chapter on texting. She conducted her own study on the texting practices of teens in Australia. In contrast to the common complaint that teens are not fully present during family time, she proposes: “Perhaps, in a distinctive manner, young people are now able to concurrently experience family time and time with friends.” In the same vein, Wajcman recalls seeing, at a nursing home, a daughter with one arm slung around her elderly mother, the other tapping on her smartphone. Though Wajcman acknowledges an initial negative judgment of this scene, she quickly reconsidered. The elderly mother was clearly not very aware of her surroundings and was likely comforted by her daughter’s presence. The daughter was able to provide this solace while engaging in other activities. (She could also have been reading a book or magazine.) Is this really to be condemned?

* * *

My guess is that many of us instinctively disapprove of the multitasking daughter, as well as the teens texting during dinner time—and also that most of us have done similar things ourselves. As our constant lamentations about our love of speed suggest, what we really feel is profound ambivalence. One of the most valuable contributions of Wajcman’s book is to explore that ambivalence.

One intriguing theory posits a connection between speed and social progress. “Our common sense notion of ‘modern’ denotes a historical process of steady advance and improvement in human material well-being, occasioned by technological innovation,” Wajcman writes. She cites one of her colleagues as arguing that progress depends on “impatience with the way things are…. Change thus comes to be valorized over continuity…the speed of change becomes a self-evident good.” As experienced in everyday life, sometimes speed itself may be alluring. But not always: it may be, instead, a matter of a “cultural ‘bargain’ with modernity.” We may love speed or hate it (or both), but either way, we see it as intimately related to the benefits of modern life that we are reluctant to relinquish.

Wajcman devotes less attention to why we might rue speed. But by the same token, modernity and technology have always been associated not just with social progress but also with destruction and violence. Speed is arguably synonymous with modernity, and as Marshall Berman argues in his classic book All That Is Solid Melts Into Air, ambivalence has always been the dominant—and appropriate—response to modernity.

There is also a more specific class dimension to the pace of daily life. As opposed to the “leisure class” of yore, intense work and what Barbara Ehrenreich has dubbed “conspicuous busyness” are associated with high status. (Fat, of course, was once a status marker, until ordinary people got fat; perhaps as the proles won shorter working hours, something analogous happened to leisure.) It may be that some of us claim to be busier than we are—or that, if we find ourselves with time, we rush to fill it. Thus, unscheduled time may itself lead to feelings of stress rather than relaxation. If you aren’t very busy—or, preferably, “crazy busy”—you must not be very important. When we complain about how busy we are, it may be sincere, but it is also a kind of humblebrag.

Yet Wajcman recognizes that fast-paced, busy lives may provide us with genuine satisfactions, even as they leave us frazzled: “action-packed lives,” she writes, can be “both stressful and affirming.” Similarly, our current time practices stem from certain societal changes that we can’t or wouldn’t want to roll back. At their root, we find some feminist triumphs: more women pursue professional success, while more men invest more time in parenthood. Though these victories are partial (Wajcman notes that women tend to feel more harried and more compelled to multitask than men), this is a sign of progress. But both professional achievement and parenting take time—lots of time. (We should also keep in mind the single working mother for whom a hectic schedule is a necessity, not a lifestyle choice.)

This ambivalence and the reasons for it—the fact that we welcome progress, increased convenience and the sheer excitement of speed in different forms—are so deeply entangled with detrimental effects (the stress and feelings of disaffection that come with living a mediated life, the actual physical danger of speed in some cases) that the phenomenon is interesting to analyze but difficult to address. Accordingly, the prescriptive portion of Wajcman’s book is considerably weaker than the descriptive part. She makes a few gestures at drawing out the implications of her analysis: “the process of technical innovation and design needs to be opened up to reflect a broader range of societal realities and concerns.” But she does not elaborate on how we might carry out this rather vague idea. And in fact, much of her analysis seems to imply that we are doing just fine. In her zeal to challenge dogmatic condemnations, she sometimes errs too much on the side of uncritical celebration. She shows that we are not passive victims of technological innovation, but rather use our gadgets creatively to maintain intimacy. She demonstrates fairly convincingly that we aren’t doing as badly as we think, but she doesn’t leave us with a sense of how we might do better still.

This lacuna is all the more striking because her discussion of ICTs (information and communications technologies) is almost exclusively about the C. She discusses e-mail, cellphones and texting, but says almost nothing about our consumption of information, in particular media and social media, which combines elements of I and C, and which has become such a dominant part of the landscape. In both the news and social media, speed has had costs: as news outlets rush to churn out content, that content becomes increasingly sloppy and shallow. She also doesn’t acknowledge that our gadgets and social-media sites are operated by corporations that stand to profit from our addictions to them.

* * *

We don’t have less time than ever; on the contrary, life expectancy has steadily increased. What we have, at this late point in human history, is more of so much else—more people, more books, more cultural products of every kind, in addition to the staggering volume of online content. We feel ever more acutely the mismatch between available time and all the possible ways we could spend it. Population growth has its own overlooked effects: even if Steven Pinker is right that per capita violence has declined, something horrible is always happening to someone, and thanks to our ICTs, we’re going to hear about it in “real time.” This fosters a sense of relentless drama, of the world spiraling out of control, and chronic low-grade anxiety.

And yet, despite the ostensible constant novelty—new information, new communication, new techno-toys—there is a numbing sameness to the experience of daily life for many of us. Too much of life is spent in the same essential way: clicking and typing and scrolling, liking and tweeting, assimilating the latest horrors from the news. And this relates back to the speed of time’s passage. True experiential variety, the social scientists tell us, is what gives life the feeling of passing more slowly—getting out of our routines, having adventures. It’s when the days pass by in a barely distinguishable blur that we look back and think, “Where did the time go?”

When Wajcman critiques technological determinism, she emphasizes that social practices shape the use of technology. This sounds empowering, but it can also be oppressive. As a late adopter, I can attest to the social pressure that eventually makes it an effective necessity to buy the big, new tech product or participate in the latest hot social-media site. Yet with every social act, and however infinitesimally, we either buttress or erode a social norm, or begin to establish a new one. A universal slowdown or a global unplugging is neither desirable nor achievable; but, as Wajcman stresses, a more reflective relationship with time and technology is both. Last week, I sent dozens of e-mails, but I also wrote a letter to my best friend, in pencil, on paper. I texted my husband at work, but I responded to a text message from my downstairs neighbor by ringing her doorbell. These things take more time, but they also give us memories that enrich it. Some of us are lucky to have more temporal sovereignty than we think.

The Eco-Optimists

Posted on: February 4th, 2015 by Rebecca Tuhus-Dubrow

An optimistic environmentalist may sound like an oxymoron (or perhaps just a moron). In recent months alone, headlines have spotlighted the irreversible melting of the West Antarctica Ice Sheet and the latest report from the UN IPCC, which noted that the effects of global warming are already worse than previously predicted. Daily extinctions of species do not even make the news. And the U.S. midterm elections in November handed power to some of environmentalism’s most hostile foes.

Yet expressions of optimism have been popping up in various green quarters. In June Al Gore published an article in Rolling Stone titled “The Turning Point: New Hope for the Climate,” hailing “surprising—even shocking—good news” about a shift toward a solar-powered future. “[I]t is now clear that we will ultimately prevail,” he declared. September’s climate march in New York exceeded expectations, attracting some 400,000 people and spurring pronouncements that a mass movement had finally arrived. Longtime New York Times environmental reporter (now blogger) Andy Revkin has also attracted attention for his relatively upbeat outlook. “We are going to do OK,” he told an audience of environmental science researchers last summer.

Of course, different optimisms have different sources and different implications. Gore’s is relatively narrow: it’s based on diffusion of a particular technology, and the triumph he predicts (while somewhat ambiguous) is presumably that human civilization will survive. A more expansive vision, coming from the left wing of the climate movement, is found in Naomi Klein’s new book This Changes Everything. Her professed optimism derives, in a sense, from horror at the status quo, which she feels is becoming so intolerable for so many that we might actually do something about it. Klein proposes that the devastation of climate change can serve as a catalyst for a broader social justice movement that will deliver us to a world better than the one we now inhabit—less exploitative of the vulnerable of all species, human and otherwise.

But perhaps most provocative are the worldviews that ground their optimism in a reconsideration of our relationship to the natural world. A couple of emerging sub-movements share certain familiar green principles but challenge others. They highlight the value and the pitfalls of optimism for social movements generally, but also the unique challenges for environmentalism. And they raise questions about what it means to be an environmentalist when the environment is rapidly changing.


It’s not just the relentless bad news in recent years that makes environmental optimism difficult. Environmentalism as a school of thought is prone to pessimism. It couldn’t exist until humans had begun to destroy nature on an appreciable scale. In reaction to this destruction, early environmentalism—with its conservationist roots—often depicted man as nature’s enemy. In his 1905 book Man and the Earth, for example, Harvard geologist Nathaniel Southgate Shaler wrote, “as he mounts toward civilization, man becomes a spoiler.” This ethos, while only one strand of environmentalism, holds powerful sway. Indeed, the advent of climate change has seemed to vindicate it and push it to the extreme: human actions are wreaking not only local damage but planet-wide havoc. By this logic, the ideal outcome for the planet would be our immediate disappearance. As long as humans are present, the best we can hope to do is restrain ourselves, do less harm, shrink our footprint.

The justifications for this viewpoint are obvious—everywhere we look, we see evidence that man is indeed a spoiler—but it has its drawbacks. From this perspective, it’s hard to imagine a healthy way for people to relate to the planet. It’s even harder to envision a future we can get excited about fighting for—let alone see a way we can win. As British environmental writer George Monbiot argued in the Guardian in June, we need a framework “that proposes a better world, rather than (if we work really hard for it), just a slightly-less-shitty-one-than-there-would-otherwise-have-been.”

But under the circumstances, how can we conceive of a better world?

One possibility is to change our perspective and redefine nature. This is the path taken by a loosely affiliated group of scientists, writers, and activists sometimes called “New Conservationists” or “eco-pragmatists.” While acknowledging the crises our planet faces, they take a less value-laden view of our relationship with nature than their more traditional counterparts. The crux of their philosophical departure is that they do not see human impact as bad by definition. Indeed, they explore the ways that human influence could even be positive. In essence, they are able to be optimistic by altering—critics would say lowering—their standards.

Among the New Conservationists is science writer Emma Marris, who makes an eloquent case for this point of view in her 2011 book Rambunctious Garden: Saving Nature in a Post-Wild World. Marris’s notion of a “rambunctious garden” captures two ideas. One is that pristine wilderness no longer exists; we must accept that the entire planet has been affected by humans and therefore embrace our role as “gardeners” or responsible, fond stewards. But the garden need not be tidy and tame; it can be beautifully, vitally “rambunctious.” The other idea is that spaces we think of as human, or ugly, or unnatural can be places of hidden nature—the wildflowers on the highway median, the cardinals in your backyard. We can cherish nature wherever it appears—even as a product of human intervention.

“The rambunctious garden is everywhere,” Marris writes. “Rambunctious gardening is proactive and optimistic; it creates more and more nature as it goes, rather than just building walls around the nature we have left.”

Marris, following some of the scientists she interviews, uses unorthodox criteria to evaluate natural spaces. For example, Hawaii is full of species that have been imported by humans, often displacing native species. The resulting areas are sometimes dubbed “trash ecosystems” by conservation biologists. But if the species are finding ways to thrive together—if the ecosystems are vital and lush and green—their human provenance doesn’t strike Marris as inherently problematic. In fact, these novel ecosystems can appear more vibrant—wilder—than the elder ecosystems that conservationists try valiantly to preserve. One paradox of our current situation is that due to climate change, the most “pristine” environments, the ones that most resemble their former states, are by necessity the ones most intensively managed by humans.

To some extent, the arguments here are academic—they’re between professional conservation experts. Most laypeople don’t need permission to enjoy the trees in their urban backyard or to find beauty in Queen Anne’s lace, an exotic wildflower. Few would even realize that those Hawaiian jungles were deemed deficient; indeed, that’s part of Marris’s point. Still, she’s right to identify a deep-seated and widespread feeling that if humans have influenced a landscape, it is not fully “natural.” According to that logic, very little, if any, nature remains, and as climate change continues to exert its pervasive effects, nowhere will be truly natural. Even if we humans manage to fend off civilizational collapse, that loss will be a tragedy.

Bill McKibben, now a leading climate activist, first articulated this view twenty-five years ago. In The End of Nature, his 1989 book that is considered the first about global warming for a popular audience, McKibben defined nature by “its separation from human society.” When he was hiking, the sound of a distant chainsaw in the woods could spoil his experience, because it would “drive away the feeling that you are in another, separate, timeless, wild sphere. . . . Now that we have changed the most basic forces around us, the noise of that chainsaw will always be in the woods.”

For Marris, this outlook is too purist. This conception of nature, she argues, was an idea created by humans in a specific context, and it can and should evolve. Once we shift our perspective, the possibility arises of making “more nature. We can make things on Earth better, not just less bad.”

Laura J. Martin, a doctoral candidate in Natural Resources at Cornell, conveys this idea succinctly in a Scientific American blog post. In place of the frequently invoked metaphor of a “footprint,” she suggests that we imagine our impact on the planet as a “handprint.” “A footprint is a mark one never meant to leave,” Martin writes. “A handprint, as opposed to a footprint, is deliberate, skilled and artful. It evokes human agency and the human ability to shape the world by choosing among many possible natures.”

To some staunch conservationists, this new ethos is perverse. If we can just redefine nature as we choose, what’s to keep us from our destructive path? Yet Marris expresses commitment to many of the same goals as mainstream environmentalists: protecting the rights of other species; slowing the rate of extinctions; and protecting the spiritual and aesthetic experience of nature. She is sometimes misinterpreted as giving up on wilderness areas, but her point is that even those areas are now influenced by humans (due to climate change, if not more direct intervention), and yet we can still enjoy them as nature.

Indeed, in the practical details, New Conservationists largely overlap with their more mainstream counterparts. The essential difference is making the “gestalt switch,” in Marris’s words, that allows us to see the whole environment in a new way, to see nature everywhere, to embrace our potential to make more of it.

But there’s a serious caveat: Marris’s book does not take into account the very real possibility of catastrophic climate change. It is addressed to the planet today, or to a recognizably similar version. If we are unable to forestall runaway warming, the rhetoric of handprints and gestalt switches will seem increasingly irrelevant.


How then can we avoid that fate? Another emerging group of eco-optimists claims to offer at least a partial answer.

While conservationism has historically advocated a hands-off approach to nature, another strand of environmentalism has been decidedly hands-on. Environmental heroes such as Wendell Berry and Michael Pollan have focused on agriculture: on how to make our interactions with the land sustainable and respectful. This second sub-movement grows out of that lineage but takes it to the next level, contending that we can not only reduce the negative ecological consequences of agriculture but turn it into a net positive.

This group argues that we’ve neglected a major opportunity to address the climate crisis. That forgotten factor is soil. To oversimplify a bit: carbon is bad in the air, but good in the soil. Global soils have lost a huge proportion of the carbon that fortified them; deforestation and tilling have released it into the atmosphere. Smart methods of farming and ranching can restore carbon to the soil through the process of photosynthesis. While agreeing that we must halt emissions, this contingent argues that by maximizing the soil’s potential for carbon sequestration, we can actually reverse global warming.

A recent report from the Rodale Institute claims that we could sequester “more than 100% of current annual CO2 emissions”—accommodating, that is, some of the increase in emissions projected for coming decades—by shifting to these practices, which they call “regenerative organic agriculture.” We can turn agriculture from a problem—currently responsible for three-quarters of global deforestation and about a quarter of total greenhouse gas emissions—into a solution. The specific techniques involve cover crops, residue mulching, composting, and crop rotation. These techniques substitute diversity for monoculture and always leave the soil covered, preventing the loss of carbon that results from bare soil.

The basic idea is far from fringe: no less an authority than James Hansen has written that “improved agricultural practices can convert agriculture from a CO2 source into a CO2 sink.” He projects that such a shift, in conjunction with steep emissions reductions and reforestation, could get us back to 350 parts per million by the end of the century.

A more controversial element of the program involves raising livestock according to a method known as Holistic Planned Grazing, advocated by a Rhodesian-born septuagenarian named Allan Savory. Through their stimulation of plant growth, cows can help restore carbon to the soil. In stark contrast to conventional environmental wisdom—which holds livestock responsible for a significant share of global greenhouse gas emissions and tells us to eat less meat—proponents of Holistic Planned Grazing argue that we should repopulate deserts with cattle. The idea has drawn sharp criticism, including from environmentalists wary of methane and of diverting attention from emissions cuts.

Unresolved questions remain about the effectiveness of different methods and the magnitude of the possible results. But philosophically, the movement’s underlying challenge to green orthodoxies dovetails with that of the New Conservationists: as Michael Pollan, a prominent convert, has written, this soil-focused agricultural ethos “asks us to reconsider our pessimism about the human engagement with the rest of nature. The bedrock of that pessimism is our assumption that human transactions with nature are necessarily zero-sum: for us to wrest whatever we need or want from nature—food, energy, pleasure—means nature must be diminished. . . . Yet there are counterexamples that point to a way out of that dismal math . . .”

The core tenet of ecology is that everything is connected. Usually this is a threat. If one species goes extinct, say, the other species that depend on it also suffer. The feedback loops threatened by climate change are particularly scary: if rising global temperatures cause the permafrost to melt, tons of methane will be released, thereby exacerbating the warming, and so on.

But the corollary is the possibility of virtuous circles. Specifically, various kinds of “regenerative” agriculture can purportedly sequester carbon, make land more resistant to both drought and flood, and render soil much more conducive to growing nutritious plants. The notion that “everything is connected” becomes a source of optimism.

The emphasis on carbon sinks in some ways smacks of geo-engineering. It promises to offset warming, to counteract our emissions, to absolve our sins. But unlike schemes to dump iron into the sea or shield us from the sun with mirrors, changing agricultural practices has a low risk of large-scale unintended consequences. Rather than add another layer of techno-intervention, proponents seek to harness the power of natural processes. They are proposing a more bottom-up model, which ties in with efforts to promote food sovereignty and resist agribusiness.

But this approach does pose one of the risks of geo-engineering: that is, fostering a sense of complacency, implying that we don’t need to worry about emissions because we can just suck them back into the ground. This is particularly dangerous if, as some detractors believe, the movement’s claims about the potential for sequestration are inflated. And whatever its theoretical potential may be, there are still formidable political obstacles to the rapid and complete overhaul of global food systems.

Still, the agri-optimists are wise to join the New Conservationists in proposing a new perspective. They make a compelling case that humans, rather than being either destructive or protective, can be “regenerative”; that we can go beyond stewardship to engage with the environment in a mutually beneficial way.

Of course, humans have always claimed to improve on nature: we take the earth’s raw materials and transform them in ways that are useful or pleasing to us. That is arguably the defining activity of humankind, which separates us from other species. Our heedlessness in that activity is precisely what environmentalists have rebelled against.

But what these new breeds of environmentalists seem to be saying is different. They want to respect and facilitate the earth’s nonhuman systems, rather than conquer or disrupt them. The two groups, for example, seem to share an overriding faith in photosynthesis and our ability to encourage it. Through our agricultural practices, we can add carbon to the soil, making it rich and loamy, possibly even creating new soil. We can foster nature in the middle of the city, giving weeds and birds and insects pockets in which to thrive. Instead of expecting nature to be our handmaiden, we can strive to be nature’s handmaidens, learning as much as possible about how its systems work and seeking to augment them.

When I interviewed Seth Itzkan, an activist at an organization called Biodiversity for a Livable Climate, he invoked a concept from science fiction: terraforming. It means altering a planet to make it like Earth or to make it conducive to life. By restoring depleted ecosystems, he told me, “We can terraform planet Earth.”


In the context of social movements, it’s more common to declare optimism than pessimism; it’s a necessity to attract adherents and maintain momentum. (I suspect this motive played a role in the recent assertions of hope from Al Gore and Naomi Klein.) In this sense, optimism can be self-fulfilling. Optimism has been called a moral imperative; I doubt pessimism has been characterized as such.

And yet, in the face of current realities, expressions of optimism can sound either ignorant or callous. You’re unaware of the facts or willfully blind to them—or worse, you’re condoning the outrages. The eagerness to put a positive spin on current conditions has offended a number of environmentalists. Last summer, a group including Andy Revkin held a discussion on the concept of the “good Anthropocene.” (The Anthropocene is an unofficial name for the current geological era, in which humans are the dominant force of change.) Clive Hamilton, the Australian author of books such as Requiem for a Species, assailed them for even entertaining the idea. (He also called out Marris.) The suggestion that human transformation of the planet’s most elemental systems could have an upside was insensitive, Hamilton argued, to the millions of people in poor countries who will be devastated by climate change. “In the long term,” he wrote, “this kind of thinking will prove more insidious than climate science denial. . . . [G]rasping at delusions like ‘the good Anthropocene’ is a failure of courage, courage to face the facts.”

There is certainly a risk of being too sanguine. What is often missing from the rhetoric of the eco-optimists is grief. We can and must grieve for all those who have been and will be affected by the senseless disasters brought on by climate change. We can grieve for what we’ve lost—the possibility of pristine wilderness; of the regular, grounding rhythms of the natural world; of a planet where we’re the children of nature rather than (at our best) the guardians.

But grief is not the same as despair or pessimism. Crafting an affirmative vision of the future is essential: a future that is not likely, but at least conceivable; not perfect, but worth striving for. Seeing a way to get there is equally essential. Both are difficult in any social movement, but perhaps especially so given the historical context of environmentalism and the daunting scale of climate disruption. Environmental optimism is fraught with dangers. Abandoning hope is more so.

Fearful Parenting Is Contagious

Posted on: November 7th, 2014 by Rebecca Tuhus-Dubrow

After my daughter was born, whenever I heard about parents who refused vaccines, I’d feel a flare of hostility. Not because I couldn’t relate to them—as an easily spooked new mom, I could relate all too well. No mother is thrilled to see a needle jabbed into her child. It hardly helps to know that the needle contains a substance derived from a disease-causing agent. Even leaving aside the debunked autism claims, the visceral reality of vaccination runs counter to every parental instinct.

But I had decided to trust the experts and not think about it too much. My daughter’s blue immunization book was fully up to date. Hearing about parents who opted out reminded me of my unease. Their existence was also an implicit rebuke; thinking of them put me on the defensive. They would, I imagined, deem me a bad mother, negligent and misinformed. I all but wanted to shout, “I know you are but what am I” at my hypothetical anti-vaxxer adversaries.

In On Immunity, Eula Biss’s quietly impassioned new book, the author evinces no such hostility (and considerably more maturity). She does attribute one pitch-perfect line to her father, a doctor who serves as the wry voice of reason in the book. Biss is groping for words to explain the phenomenon of chicken pox parties as alternatives to vaccinations. “I say, ‘Some people want their children to get chicken pox because,’ and pause to think of the best reason to give a doctor. ‘They’re idiots,’ my father supplies.”

Biss goes on to write, “I do not think they are idiots. But I do think they may be indulging in a variety of preindustrial nostalgia that I too find seductive.” This conclusion—tactful, discerning, self-implicating—is characteristic of her book as a whole. In her understated way, she ends up building a case against the anti-vaccination movement that is more damning than either her father’s or mine.

But the book is about much more than shots and mumps. It contains elements of memoir—Biss relates her son’s unexpectedly difficult birth, after which she lost two liters of blood and received an emergency transfusion from generous strangers. For the most part, though, it is meditative rather than narrative: paragraphs slip from science to philosophy to Greek myth to vampires. At just over 200 pages, On Immunity probes a slew of big ideas, from the fiction of purity to the failure of government. All feed into the fundamental question: how to be a humane, non-insane parent circa 2014.

In lesser hands, tackling so many themes could result in a mess. But Biss is able to pull it off, thanks to her intellectual poise and her lucid, frequently aphoristic prose. “A trust—in the sense of a valuable asset placed in the care of someone to whom it does not ultimately belong—captures, more or less, my understanding of what it is to have a child.” “Those of us who draw on collective immunity owe our health to our neighbors.” “As with other strongly held beliefs, our fears are dear to us.” “We do not know alone.”

Biss, an acclaimed Chicago-based essayist, probably shares certain traits with many of her readers: she is a highly educated, married, middle-class mother. As she notes, her demographic is the one most likely to voluntarily forgo childhood vaccinations. Though the book’s tone is far from that of a manifesto, it is intended in part to persuade her audience to vaccinate. Though unlikely to convert Jenny McCarthy disciples, it very well could win over people who have heard vague rumors and are unsure what to think.

Biss is not only reassuring her audience that vaccination is safe; she’s arguing that it is a moral imperative. Vaccines are at the crux of her inquiry because they epitomize the way our personal decisions can either help or harm each other. She exposes the unsavory element in the anti-vaccine movement, identifying a feeling in some quarters of being somehow exempt—call it too posh for shots.

While pregnant, Biss visits a doctor who assures her that the hepatitis B vaccine is not something that “people like me needed to worry about,” but rather for the children of drug users and prostitutes. Later, Biss’s research reveals that, for reasons that are poorly understood, limiting vaccinations to at-risk populations did not curb the hep B epidemic; only mass vaccination did. “The belief that public health measures are not intended for people like us is widely held by many people like me,” she writes.

The most egregious embodiment of this view is Robert Sears, or Dr. Bob, a famous physician who functions as the book’s villain. “Vaccines don’t cause autism,” he writes, “except when they do.” The hep B vaccine “is an important vaccine from a public health standpoint, but it is not as critical from an individual point of view.” Biss’s deadpan retort: “In order for this to make sense, one must believe that individuals are not part of the public.” And then, Dr. Bob’s most stunning advice: though he indulges his patients’ fears about the measles, mumps, rubella (MMR) vaccine, he writes, “I also warn them not to share their fears with their neighbors, because if too many people avoid the MMR, we’ll likely see the disease increase significantly.” One of Dr. Bob’s patients, it turns out, was a child who sparked a measles outbreak in 2008.

Anti-vaccine hysteria has its roots in a cluster of impulses, and Biss explores how those impulses play out more broadly. One is an ethos that reflexively prizes the “natural,” which, Biss notes, could only arise because of our lack of intimacy with the natural world. Among today’s mothers, suspicion of the “unnatural” often takes the form of fears of toxic chemicals. Tens of thousands of unregulated industrial chemicals are on the market—in pesticides, in plastic, in pots and pans, in furniture. In this context, for a mother, the world can become a strange kind of minefield. The most ordinary household products suddenly take on a sinister cast. During pregnancy, I started apprehensively scrutinizing shampoo labels, some of which boasted that they didn’t contain ingredients I hadn’t known I was supposed to be worried about.

This feeling is dramatized at the beginning of one chapter, when Biss calls her husband in tears, crying that they need a new mattress, because of the toxic chemicals that may be lurking in their son’s crib. As Biss implies, such gestures are attempts to exert control over an uncontrollable world. They are superstitious; you tell yourself that if you follow certain rules—buy organic produce, wooden toys, glass bottles—you can keep your child safe and pure.

As our exposure to industrial chemicals has increased, so has our exposure to information, some accurate, some not so much. Biss discusses the falsehoods that circulate online: a Salon article on vaccines was corrected, then removed, and yet the original version remained on other sites. Though misinformation is surely a major problem, the more interesting question, to me, is whether we also suffer from an overload of accurate information. Hyper-awareness of hazards can be its own kind of toxin.

Biss concludes, “We are all already polluted . . . . We are, in other words, continuous with everything here on earth. Including, and especially, each other.” This realization can be liberating. But what exactly does it imply? Should we give up even trying to shelter our children from threats in the environment? If we recognize, as Biss does, that it is a luxury “to feel threatened by the invisible,” does that mean that fretting about BPA is an elitist indulgence? Not exactly. But trying to cordon off our children from the world around them is both ethically dubious and ultimately futile. The only viable response is to try to repair that world.

• • •

When Biss calls her husband in tears about the mattress, the Deepwater Horizon oil spill is underway. “‘If our government,’ I cried to my husband, ‘can’t keep phthalates out of my baby’s bedroom and parabens out of his lotion, and 210 million gallons of crude oil and 1.84 million gallons of dispersant out of the Gulf of Mexico, for the love of God, then what is it good for?’” This bitter disappointment in government is another component of the fear that defines contemporary American parenthood.

It is not just that we can’t count on our government to protect us; we also can’t expect it to support us. Though Biss doesn’t delve into this dimension, the much-lamented lack of policies to support children and families—subsidized child care, paid family leave, and so on—contributes to a sense among parents that we are on our own. The middle class may not need these policies as desperately as the poor do, but their absence fosters a climate in which the norm is to fend for ourselves, eroding the very idea of a public.

Compared with previous generations, fewer of us can look to a higher authority, either. Biss invokes God figuratively in her outcry to her husband, but otherwise He doesn’t come up much. In this way, too, we secular liberals are on our own. We can’t expect God to protect our children; equally important, we can’t take solace from the conviction that what does happen is God’s will. Whatever happens is on us. Biss demonstrates this burden when she learns that her son has allergies; she asks the doctor what she had done to cause the allergies. “The possibility that I was not to blame did not initially occur to me,” she writes.

For all of these reasons, many Americans feel the emotional burdens of parenthood acutely. One of my most haunting fears is that in trying to protect my child, I’ll harm her: that her sunscreen, say, will turn out to be toxic. Resistance to vaccination is a dramatic manifestation of this kind of fear. We are terrified of making the wrong move, and nothing could be more painful than the belief that you did something to hurt your child. The fear is related to the notion that the natural is preferable to the unnatural. The natural just happens, which makes it easier to accept; the unnatural requires human intervention. What if that intervention goes awry?

And yet our children are in fact safer and healthier than ever by most measures, suggesting that we’ve gotten more right than wrong (and reminding us that our government, despite its flaws, has in fact accomplished a great deal). In her research, Biss discovers that one out of every ten children born in 1900 died less than a year after birth. “I would read this in a report on vaccine side effects, which concluded its brief historical overview with the observation that now ‘children are expected to survive to adulthood,’” Biss writes drily.

To be a parent today in the United States, Biss writes, is to be in a position of “empowered powerlessness.” Here she is quoting the anthropologist Emily Martin, who describes “the paradox of feeling responsible for everything and powerless at the same time.” Biss comes to the conclusion, “As mothers, we must somehow square our power with our powerlessness.”

Parenthood might also be described as “selfish selflessness.” For the first time, probably, you care about someone else more than yourself, are willing to make sacrifices you’d never consider for anyone else. But the object of this generosity is not random—it is a person who is seen as a kind of extension of you. And too often, we seem to feel that this self-abnegation absolves us of responsibility to others. “Can we fault parents for putting their own child’s health ahead of that of the kids around him?” Dr. Bob asks. “This is meant to be a rhetorical question,” Biss writes, but Dr. Bob’s implied answer is “not mine.”

Biss, by her account, has friends who chose not to vaccinate their children. Writing this book, then, strikes me as a courageous act. It is not easy to criticize the parenting choices of our friends. But parenting culture, now so crippled by fear, needs to change, and it can. Biss’s book is, in part, an attempt to effect that change. And any parent can contribute by behaving in the way her book suggests. Precisely because we are so sensitive to judgment about our parenting choices, and because we are so bewildered, we tend to follow social cues, often for the worse but sometimes for the better. Not only viruses, after all, are contagious.

Endgame?

Posted on: July 11th, 2014 by Rebecca Tuhus-Dubrow

If a single book has haunted the environmental movement, it’s The Population Bomb, by Stanford biologist Paul Ehrlich. Published in 1968 by Ballantine, the work is remembered for a handful of striking passages: its opening description of seething crowds in Delhi; its prediction that in the 1970s hundreds of millions of people would succumb to famine; its endorsement of policies, such as taxes on childbearing, that have, to say the least, gone out of style.

The sensationalism of the book’s argument was modest compared to its marketing. Gracing the cover of the paperback edition was an image of a bomb with a burning fuse and the tagline “Population Control or Race to Oblivion?” Another line added, “While you are reading these words four people will have died from starvation. Most of them children.” The book sold 2 million copies in two years. Ehrlich became a celebrity speaker and a frequent guest on popular television shows.

Ever since the famines failed to arrive on schedule, the book has been attacked with glee by conservatives and held to epitomize environmentalism’s folly. For their part, Ehrlich and his wife, Anne (who co-authored the book without attribution), stand by their conviction that population growth is wreaking horrific damage, and they take credit for raising awareness about the planet’s limited resources. They have a point: their book is, on the whole, more measured than its notorious bits and screaming cover would suggest. And the final chapter is titled “What If I’m Wrong?” Among environmentalists, the book has been not so much renounced as met with a sort of embarrassed silence—at least until recently. Environmental writer Alan Weisman, in his new book on population, Countdown, fervently defends the Ehrlichs, insisting that the Green Revolution merely bought us some time. Still, The Population Bomb is not counted as a classic along the lines of Rachel Carson’s Silent Spring. It surely didn’t help that Ehrlich later made and lost a high-profile bet with economist Julian Simon about the future price of commodities and generally remained a pugnacious public figure. By contrast, shortly after the publication of her book, Carson died.

The ambiguous legacy of The Population Bomb points to a larger issue: What’s the most compelling way to tell stories about threats to the environment? Does apocalyptic language ultimately do the environmentalist cause more harm than good, undermining the credibility of the warnings? Does it alienate readers by demanding that they think about an unbearable future? Or, by garnering more attention than mild-mannered writers, do doomsayers succeed in spurring essential conversation? Environmental writers face a host of choices: to invoke self-interest or moral responsibility; to elicit hope or sow fear and sorrow; to dwell on problems or solutions.

In A Climate of Crisis, a fascinating intellectual history of American environmentalism, Emory University historian Patrick Allitt discusses The Population Bomb and many other environmental texts. Though his account is fair-minded, it is book-ended by an argument that “the mood of crisis that surrounded a succession of environmental fears was usually disproportionate to the actual danger involved.” Our society, Allitt contends, has proved quite capable of addressing environmental problems. He highlights in particular the landmark legislation of the 1970s and the consequent, underappreciated “great cleaning” of America’s air and water; and he criticizes environmentalists for persisting in their rhetoric of doom rather than celebrating these triumphs. But Allitt’s very argument reveals another possibility: the progress he chronicles occurred not despite but in part because of the mood of crisis. Could major environmental legislation ever have passed without a pressing sense of urgency? If you warn loudly of potential disaster—whether regarding Y2K or air pollution—the price of success is to seem alarmist in retrospect. Allitt acknowledges that “the anticipation of catastrophe can often contribute to preventing it,” but he restricts that lesson to the case of nuclear weapons while downplaying the risks of climate change.

Writing about global warming and the associated ecological emergencies brings distinctive challenges. Different audiences disagree sharply on the facts. A recent survey conducted by the Yale Project on Climate Change Communication found that 23 percent of respondents believe that global warming is not happening. The project divided Americans into six groups with respect to climate change: the Alarmed (16 percent); the Concerned (27 percent); the Cautious (23 percent); the Disengaged (5 percent); the Doubtful (12 percent); and the Dismissive (15 percent). Rhetoric that will galvanize the Alarmed stands little chance of engaging the Disengaged or converting the Dismissive. Should writers choose an audience and tailor their work accordingly? How can language be exquisitely fine-tuned to prompt the desired response—to steer a course between despair and complacency?

Words alone will never halt a hurricane or stay the rising seas. But few would deny that the way we communicate about issues matters. Notwithstanding some encouraging developments, notably the Environmental Protection Agency’s new rules curbing emissions from coal-fired plants, it’s fair to say that climate-change polemicists have so far failed to achieve their goals. How might they promulgate their messages more effectively?

* * *

In the standard environmentalist worldview, humans—and especially American consumers—are destroying the earth, which is equal parts deity and victim. This view is always going to antagonize a lot of people, who see it as preachy, misanthropic and joyless. Annalee Newitz, a science writer and proud member of Homo sapiens, takes a different approach. Her book Scatter, Adapt, and Remember is a primer for long-term human survival, spinning a sci-fi vision of the future. It downplays human culpability and the earth’s indispensability, espousing instead a can-do optimism oriented toward pragmatic problem-solving.

Newitz begins with an overview of our planet’s turbulent history, putting current realities and forecasts in context. Though our climate is changing and we are probably in the midst of a mass extinction—with scores of species vanishing daily—Newitz portrays these events as far from novel. She delivers a time-lapse narrative of planetary metamorphosis: continental plates smashing into each other, forming mountains and spilling carbon dioxide into the sea; blue-green algae emerging and beginning to release oxygen into the atmosphere; the climate lurching between “greenhouse” and “icehouse”; species dying out in massive numbers and then, slowly, new life repopulating the earth.

Newitz explains that a frequent culprit in mass extinction is climate change, and that minor shifts can trigger a cascade of effects that can quickly tumble into catastrophe. This pattern is at once alarming—it confirms the warnings of contemporary climate scientists about our own possible future—and somehow reassuring. It casts today’s strange weather as unexceptional in a natural, cyclical process. Newitz fully acknowledges the human role in the current warming and stresses the need to reduce greenhouse gas emissions. But her larger point is that if human societies hadn’t fouled the environment and altered the climate, some other force would—will—eventually threaten our survival anyway.

This fatalism is not bleak. On the contrary, there’s something liberating about the idea of ineluctable global catastrophe—to know that the asteroid will strike, the sun will explode, the supervolcano will erupt. Only when we think we can avert it—by driving less, installing solar panels, buying local, growing basil on the roof, attending protest marches, chaining ourselves to coal plants, getting arrested, inundating our elected representatives with phone calls—do stress and guilt set in.

“We have ample evidence that Earth is headed for disaster, and for the first time in history we have the ability to prevent that disaster from wiping us out,” Newitz writes. “Whether the disaster is caused by humans or by nature, it is inevitable. But our doom is not.” She explains that major asteroid strikes are expected to occur roughly every 100,000 years, which means “we are long overdue for another one.” She gives the impression not that we are destroying ourselves, but rather that we have gotten away with something. To Newitz, humans are exceptional not for the destruction we’ve caused—after all, climate change and mass extinction have numerous precedents—but for our unique ability to ultimately outfox the forces of global calamity.

Newitz proposes ideas to both mitigate human damage to the environment and prepare for the eventuality of catastrophe. Her suggestions range from the familiar green-agenda items (urban agriculture) to promising but unrealized techno-fixes (algae as an energy source and carbon suck) to more controversial proposals such as geo-engineering. She also advances some goals that are, depending on your perspective, either visionary or outlandish: building underground cities and colonizing other planets.

Newitz’s approach—her eschewal of dogma, her sunny confidence—might engage those who are put off by guilt trips and sermons. Indeed, a recent social science study found that raising the possibility of geo-engineering with conservatives seemed to “offset cultural polarization” and made the study’s subjects more concerned about climate change. (The advisability of geo-engineering is another question.) Not everyone, though, will share her enthusiasm for a future of riding space elevators and uploading our brains into computer software. “Don’t worry,” she concludes on her book’s final page. “As long as we keep exploring, humanity is going to survive.”

* * *

For the first Earth Day, in 1970, the cartoonist Walt Kelly produced an iconic poster that declared, “We have met the enemy and he is us.” It’s a longstanding green sentiment, revived today by the New Yorker staff writer Elizabeth Kolbert. At the end of her book The Sixth Extinction, Kolbert quotes the upbeat send-off of Newitz’s book (identifying the source only in an endnote) and then offers a curt retort: “at the risk of sounding anti-human—some of my best friends are humans!—I will say that [human survival] is not, in the end, what’s most worth attending to.”

Newitz and Kolbert cover a striking amount of the same material, from the history of mass extinctions to the relationship of Homo sapiens with the Neanderthals. But for two advocates on the same fundamental side of the climate debate, their tones and values could hardly be more antithetical. Where Newitz praises humanity’s ability to scatter and adapt, Kolbert trains her focus on the countless victims of that enterprise.

Thoroughly researched and elegantly written, with the occasional touch of dry humor, The Sixth Extinction is an ambitious addition to Kolbert’s oeuvre. Her previous volume, Field Notes From a Catastrophe, published after serialization in The New Yorker, brought the perils of global warming to the attention of a large audience. In her new book, Kolbert tags along with scientists from all over the world and delivers dispatches on their findings. The news—whether about the fate of coral reefs, frogs or bats—is not good. She leaves no doubt about this, frequently ending sections with portentous quotes. “The extinction scenario,” one scientist tells her, “well, it starts to look apocalyptic.” At the end of another section, she quotes the journal Oceanography to the effect that if we continue along our current path, it is likely to lead to “one of the most notable, if not cataclysmic, events in the history of our planet.”

Both Kolbert and Newitz address the question of what makes our species special, and they more or less agree on the answer: the drive to explore and the ability to alter our surroundings. But rather than see this as cause for celebration or awe, Kolbert views it above all as a pox on our fellow earthlings. Her work epitomizes one classic subgenre of environmentalist writing: the catalog of human crimes against nature. In fact, compared with other members of this subgenre, such as Silent Spring and Bill McKibben’s The End of Nature, The Sixth Extinction is on the extreme end of pessimism, declining to prescribe or even exhort. The philosophical thrust is to debunk any romantic notion of an Edenic past in which noble savages frolicked peaceably among the flora and fauna. For example, Kolbert notes an ascendant theory that holds humans responsible for the extinction of megafauna such as mammoths millennia ago—an annihilation that led to a string of other ecological changes. “Though it might be nice to imagine there once was a time when man lived in harmony with nature,” she writes, “it’s not clear that he ever really did.”

To Kolbert, humankind is essentially destructive. “With the capacity to represent the world in signs and symbols comes the capacity to change it, which, as it happens, is also the capacity to destroy it,” she notes, continuing:

To argue that the current extinction event could be averted if people just cared more and were willing to make more sacrifices is not wrong, exactly; still, it misses the point. It doesn’t much matter whether people care or don’t care. What matters is that people change the world…. If you want to think about why humans are so dangerous to other species, you can picture a poacher in Africa carrying an AK-47 or a logger in the Amazon gripping an ax, or, better still, you can picture yourself, holding a book on your lap.

With this last phrase, Kolbert implicates both the reader and herself and reveals the tension at the heart of her work. She is devoted to documenting environmental devastation, clearly convinced that this effort has value; but her inquiry has led her to the conclusion that even her own project—which, after all, involved plane travel and tree pulp—is part of the problem.

All of this is true, as far as it goes. But as rhetoric, it has its limitations: this very fatalism, in a sense, lets humans off too easily. If we are destined to destroy as the dark side of our creativity, why bother trying to avert catastrophe? If we are an innately destructive force, trying to save the planet from ourselves is more pointless than trying to intercept an asteroid.

* * *

Longtime environmental writer and activist Bill McKibben is likely sympathetic to aspects of Kolbert’s worldview. Her despondency about human violence to the planet resonates with much of his work. But McKibben has recently hit upon a new way of framing the issue of climate change: demonizing a subset of bad guys—that is, identifying a “them.” In a 2012 Rolling Stone article called “Global Warming’s Terrifying New Math,” McKibben wrote that the fossil-fuel industry is planning to extract and burn more than five times as much carbon as the scientific consensus deems safe. “We have met the enemy and they is Shell,” he wrote, in a significant twist on that famous quote. (He presumably knows the original context of the quote, though his readers may not.) In his next move, he would target the enemy not just in word but in deed: a campaign, centered on college campuses, to divest from fossil-fuel companies, modeled on the campaign to divest from apartheid South Africa in the 1970s and 1980s. In a pivotal victory for the young movement, Stanford University announced plans in May to divest from coal companies.

McKibben chronicles the development of this strategy in his book Oil and Honey. As he tells it, his current role is “unlikely” because at heart he is a writer, happiest in his beloved Vermont, alternating between his desk and the woods. But his acute awareness of the climate emergency has, he writes, forced him to become the reluctant leader of a nascent movement—constantly on the road, giving speeches, sitting on panels with members of Congress. It’s too soon to say whether he will achieve his goals, but he deserves credit for helping to create a new, adversarial dynamic. If the enemy is us, only a small minority of people will ever join the fight.

What does it mean for McKibben to transform himself from a writer into an activist? As a dichotomy, it’s somewhat misleading: McKibben’s writing has always had an activist bent, and his current activism involves a great deal of writing, including this new book. As he recalls here of his first book, The End of Nature, “my initial theory (I was still in my twenties) was that people would read the book—and then change.” But as that theory proved increasingly untenable, he was compelled to think long and hard about how words, in conjunction with actions, could produce the impact he sought.

His writing often falls these days into the genre of the exhortatory tweet (“Half a million emails is a lot. I don’t know if we can do it. But we’re sure as hell going to try”). He also exhorts himself (“Back to work. On message”) or engages, to his chagrin, in some calculated posturing with political types: “they say something, we say something back, they push, we push…. It ran counter to every instinct of a writer, which is simply to say what’s true.” He is constantly communicating with a variety of constituencies, with specific intentions: to persuade, inspire or bluff, depending on his interlocutor. On one of his whirlwind tours, he spends an afternoon with the poet Gary Snyder and writes, “For an afternoon—and it was the greatest present he could possibly have given me—I felt like a writer again, the thing I most wanted to be and at least for the moment really couldn’t.”

In this conception, being a writer means dwelling on the sounds and textures of words, not on their utility; meandering in the eternal, not obsessing over the latest news cycle in Washington; savoring complexity, not dividing the world into good and evil. It means the primacy of curiosity, of irreverence; really, it means allegiance to no cause. McKibben, as an activist, needs to privilege the instrumental over the poetic, rhetoric over subtlety. He aches for his old, less strictly activist role, but he is drawn to make this sacrifice—and one of the instrumental purposes of this book is to inspire its readers to make sacrifices of their own.

* * *

Curiosity, subtlety, nuance—these are casualties of the polarized debate about climate change. The acknowledgment of uncertainty becomes ammunition for the so-called climate skeptics. But Craig Childs, although deeply concerned about anthropogenic climate change, is refreshingly indifferent to eco-etiquette. His book Apocalyptic Planet has an ingenious premise, just shy of being a gimmick: he visits a series of extreme climate locations, each of which represents a possible future for our planet, depending on how climate change and other forces evolve. He begins in the Mexican desert mid-drought, then ventures to the melting glaciers of Patagonia, the monoculture of an Iowa cornfield and so on. His ominous, lyrical chapter titles follow a pattern: “Deserts Consume,” “Ice Collapses,” “Mountains Move,” “Seas Boil.”

Childs is arguably the Ryszard Kapuscinski of environmental writing, with his daredevil adventures taking him to Arctic glaciers and treacherous rapids. He intersperses his personal narrative with history and reporting, and some of his observations might make an activist like McKibben bristle. He quotes Konrad Steffen, a prominent climate scientist with whom he travels to Greenland: “If we’ve done anything, we’ve stopped the next glacial period from happening by warming the earth.” He reports, “We do not live in a particularly impressive period in history for watching sea levels rise.” These statements may be valid, but they are not “on message.” He also gives space to foils, such as his friend Angus, a former Jehovah’s Witness who accompanies him through the “biotic dearth” of an Iowa cornfield and muses that perhaps the earth needs periodic mass extinctions to rest. (Childs takes this notion seriously and presents it to E.O. Wilson, who dismisses it.)

Above all, what sets him apart from other environmental writers is his curiosity. More than dread or hope, he seems to have a burning eagerness simply to find out what might happen: “I wondered if I could trade my own decades for a two-hundred-year life span just to see what page turns next for the earth.” He craves intimate sensory experience of our “twitching, restive planet,” submitting to dry desert heat, walking on barely dried lava, touching glaciers and hearing the explosive sounds of their collapse.

His descriptions of these experiences are evocative. In the desert: “I had sand in every part of me. My molars wouldn’t touch.” On watching melting ice: “Each teardrop shimmered for a moment and vibrated tenuously, then fell. This is how climate works, I thought. Forces push and pull, weather begins to switch back and forth, summer and winter turned upside down, and then the system jumps. The drip falls.”

His humans are neither villains nor heroes. They play a relatively minor role in his account, as do other species. It’s the earth that looms largest in Childs’s consciousness, more agent than victim. As he realized after living through an earthquake, “Humans may have a big hand in carpeting the atmosphere with heat-trapping gases and dumping every toxin we can imagine into waterways, but when the earth decides to roll, it is no longer our game.” He writes about the planet we inhabit with awestruck deference. Even while lamenting our losses, he wants to take them in up close. As Childs asks in this passage about his trip to Patagonia: “Saving the world? You can always hope. But to be alive in the last geologic moments of ice, wouldn’t you come and put your hands against it?”

Perhaps all environmental writers lie on some continuum from activist to author, their books somewhere between pamphlets and poems. They dream of saving the world; they know that the likeliest outcome is failure. But they write anyway—to bear witness, to make sense of what is happening, to say what’s true.

Back in Patagonia, Childs writes this of the last moments of ice: “As it tinkled and cracked in the sun, I snapped off a tab and crunched it in my mouth. It turned to water instantly, as if it had been waiting a hundred centuries for this moment.”

Attitude Adjustments

Posted on: January 28th, 2014 by Rebecca Tuhus-Dubrow

In Promise Land, Jessica Lamb-Shapiro recounts her efforts to conquer one of her multiple phobias by attending a support group called Freedom to Fly. The group’s course, led by a psychologist, met at the Westchester airport and culminated in a round-trip flight to Boston. Lamb-Shapiro secretly had no intention of boarding the flight, but she ultimately mustered the nerve, thanks in part to peer pressure and the charismatic leader. The decisive influence, however, was chemical rather than social. “I had often wondered if taking a pill would prevent me from thinking I was about to die on a plane or prevent me from caring,” she writes. “It was the latter.”

This qualified victory marks one of Lamb-Shapiro’s more successful forays into the world of instructional workshops and inspirational guidance. “I wanted to know why people liked self-help so much, what it meant to them, whether it worked,” she writes at the book’s outset, “and if it didn’t work, why people still craved it.” Her quest takes her to a wide and motley array of destinations: a conference on writing self-help books, headlined by the cowriter of Chicken Soup for the Soul; a seminar by a coauthor of The Rules, the retrograde ’90s guide to husband snaring; and a New Agey camp where she joins teenagers in walking on hot coals. Most chapters are anchored by Lamb-Shapiro’s first-person account of a self-help excursion, framed by cultural history as well as the author’s tragedy-tinged autobiography.

Promise Land is very much a book of the publishing zeitgeist—the gimmicky premise, the mash-up of genres—and risks coming off as clichéd. But Lamb-Shapiro’s authorial presence rescues it from that fate. Her approach to the material is skeptical but not cynical; her personal disclosures feel generous rather than exhibitionistic; and she writes in a mordant, deadpan voice with impeccable economy and timing.

She presents wry dispatches from the assorted subcultures she explores, and in her telling they range from the gauzily mystical to the shamelessly mercenary. “We don’t get burnt. We get kissed,” says the adult who initiates nervous teens into the ritual of fire walking. “Don’t hop all funny. Don’t run. Anybody with a stroke or who has difficulty walking please do not do this. Let’s sing.” After attending the Chicken Soup conference, Lamb-Shapiro received e-mails for years from the cocreator, Mark Victor Hansen. “Very few topics are not within his purview,” she reports. “‘Jessica, are you a healthepreneur?’ reads one subject line.”

Lamb-Shapiro also delivers some trenchant appraisals of the self-help industry’s allure. Here, for instance, is how she evokes the promise of the turn-your-life-around best seller:

Buying a book can make you feel better because it makes you feel like you are in control. I have started, it says. . . . Most of the self-help books I bought for research were secondhand, and were heavily underlined and annotated for the first twenty pages.

Unspooling through the book is also the thread of a memoir, centered on the long-ago suicide of Lamb-Shapiro’s mother. Her account is mostly matter-of-fact and free of self-pity, which imbues the handful of nakedly emotional moments she describes with that much more poignancy. Looking at an apparently happy picture of herself and her mother on the beach, “the photo takes on an ominous cast. Sometimes I look at that picture and I can’t help thinking, You stupid baby. You don’t even know what’s coming for you. You’d better wise up.”

Still, for all the book’s pleasures, Lamb-Shapiro’s analysis, like her shifting narrative focus, sometimes feels scattered. Promise Land also suffers from the author’s failure to convincingly define her subject. Self-help in her reckoning includes such wildly divergent phenomena as the harsh injunctions of The Rules; a “vision board” purported to turn wishes into reality; a camp that helps kids cope with grief; and a sign in her local auto-body shop reading, “We create our tomorrows by what we dream today.” Self-help seems, in short, to be everywhere and nowhere—a concept so broadly construed as to be nearly meaningless.

Of course, Lamb-Shapiro is far from alone in this imprecision. Latter-day versions of the self-improvement gospel have spiraled out in two directions. On one hand, the term refers to a certain belief system of cheerful self-reinvention and the conquest of psychological traumas—call it the Oprah ethos. On the other, it refers to any material offering personal guidance of any sort. The seeming ubiquity of self-help may owe as much to this semantic creep as to the triumph of a coherent category.

The term itself hovers around a telling ambiguity. Is the self of the prefix primarily the provider of the help in question, or its beneficiary? Some critics, assuming the former, lament an intensely individualist American culture of self-reliance, which tends to sidestep or deride support from communities or government. Others, assuming that the self under renovation is primarily the recipient of therapeutic help, deplore our rampant consumerist obsession with self-improvement.

But much of what we think of as self-help is steeped in a different sort of American tradition, reflecting neither narcissism nor rugged autonomy. This is what Lamb-Shapiro perceptively calls “the pleasures of the imperative.” And this feature, I think, goes a long way toward explaining both the appeal of self-help and the scorn it incurs. Being told what to do offers deep solace even as it infantilizes. Lamb-Shapiro notices these effects coming to the fore at the conference for aspiring self-help writers, but the same principle applies to many of the other events she attends and books she reads. “I had been told what to do, wear, and eat for the last two hours, and it brought me a kind of comfort,” she writes. “It was no accident, I thought, that the last time I had this little responsibility for my actions I had braces on my teeth, a bona fide mullet, and someone to cook me dinner every night.”

Cli-Fi

Posted on: July 7th, 2013 by Rebecca Tuhus-Dubrow

Makepeace Hatfield, the heroine of Marcel Theroux’s 2009 novel Far North, is one of the last survivors of a Siberian settlement. Her father was an early settler: an American Quaker who fled a decadent world for a frontiersman’s life. In the Siberian summer, he discovered fertile terrain, purple and brown, and water that “heaved with salmon,” as Makepeace recalls. “Nothing I’ve known in the Far North resembles the land of ice that people expected him to find here.”

We are in the future, or, at any rate, a future. The settlement has collapsed under the pressure of an influx of starving refugees. Makepeace—a stoic, androgynous woman—forges her own bullets and hunts wild pigs. When she witnesses the crash of a small plane, she sets out on her horse to find the rump of civilization that must have produced it. She is welcomed into a small religious community, then imprisoned at a work camp, and eventually makes her way to a dead metropolis.

Far North, hailed by the Washington Post as the “first great cautionary fable of climate change,” is one of the strongest of a recent crop of similar books, most of which are also post-apocalyptic or dystopian. But the novel is no straightforward eco-parable. Indeed, at one point, Theroux seems to have a little fun with green pieties. In the book, knowledge about the origins of the crisis is fuzzy, but Makepeace’s learned confidant offers an explanation:

Shamsudin said the planet had heated up. They turned off smokestacks and stopped flying…. Factories were shut down…. As it turned out, the smoke from all the furnaces had been working like a sunshade, keeping the world a few degrees cooler than it would have been otherwise. He said that in trying to do the right thing, we had sawed off the branch we were sitting on.

The novel also underscores the ironies of considering global warming an “environmentalist” cause. Today, conservationists rue human ubiquity and the loss of wilderness. Theroux reminds us that what’s finally at stake in our experiment with the climate is human achievement. In Makepeace’s time, wilderness is reclaiming cities; it’s the accumulated knowledge of millennia that verges on extinction. Aviation, for instance: “[T]o turn words and numbers into metal and make them fly—what bigger miracle can there be?” Makepeace marvels. “It’s a kind of heresy to say so, but I think our race has made forms more beautiful than what was here before us.”

As an emblem of climate change, nothing could be more apt than an airplane. One of our most triumphant inventions, it is also a prime belcher of the gases that are overheating the atmosphere. In Far North, the mourned losses—planes, civilization as a whole—were also the source of the trouble that ultimately all but destroyed both. Our species was too smart, or not quite smart enough, for its own good.

…..

Novelists were slow to take up the subject of global warming. In 2005, Robert Macfarlane wrote an article in the Guardian lamenting the dearth of art addressing the issue. “Where are the novels, the plays, the poems, the songs, the libretti, of this massive contemporary anxiety?…[A]n imaginative repertoire is urgently needed by which the causes and consequences of climate change can be debated, sensed, and communicated.”

In the past few years, that imaginative repertoire has begun to emerge. Perhaps climate change had once seemed too large-scale, or too abstract, for the minutely human landscape of fiction. But the threat seems to have become too pressing to ignore, and less abstract, thanks to a nonstop succession of mega-storms and record-shattering temperatures. In addition to Theroux, major novelists including Margaret Atwood, Ian McEwan, and Jennifer Egan have published books that touch on climate change. A 2011 anthology, I’m with the Bears: Short Stories from a Damaged Planet, collected short fiction by celebrated writers such as Helen Simpson and David Mitchell. This year, several new novels make climate change central to their plot and setting: Back to the Garden, by Canadian writer Clara Hume; The Healer, by Finnish suspense novelist Antti Tuomainen; and Odds Against Tomorrow, by Nathaniel Rich, one of the few Americans to contribute to the genre.

Most of the authors seek, at least in part, to warn, translating graphs and scientific jargon into experience and emotion. In Back to the Garden, a small group of close friends—Fran, a short-haired beauty; Elena and her partner, Daniel; and the couple’s two children—live on a mountainside in Idaho. As in Far North, the characters have reverted to a primitive way of life; they hunt with bows and arrows and weave their own clothes. Soon Leo, a former movie star, drifts onto the mountain. The friends set out on a road trip to find family, in a journey that’s part standard post-apocalyptic narrative and part Wizard of Oz. Along the way, in the lawless country, they encounter armed thugs, but also kind strangers who join their growing entourage. Finally, they return to the mountain from which they’d departed. A sliver of hope is represented by Fran’s pregnancy.

Back to the Garden is warm-hearted and unabashedly didactic. The promotional material says the novel “presents a frightening and tragic possibility for our future but doesn’t ignore our affirmative connection to nature and other people. The novel attempts to open people’s eyes to the importance of respecting limits, before it’s too late.” When we learn that Leo has lost his brother to dengue fever, the cause is no mystery. As Leo explains, “Warmer temperatures meant mosquitoes began to survive winters, tropical insects migrated northward, and public health and sanitation went away; these things, and polluted water sources, had resulted in increased diseases transported by mosquitoes and other insects.”

…..

While Hume uses fiction to bring climate change to life, other writers use climate change to bring their fiction to life. Antti Tuomainen’s The Healer unfolds in a dystopian Helsinki in which the effects of global warming—constant rain, climate refugees, social breakdown—amplify the noir atmosphere. The plot centers on the narrator’s search for his missing wife, a journalist last known to be investigating a killer who calls himself “the Healer.” The Healer targets the corporate executives (and their families) whom he holds responsible for the climate crisis. “He said he did it on behalf of ordinary people, to avenge them,” reports the narrator, “and said he was the last voice of truth in a world headed toward destruction—a healer for a sick planet.”

In this gloomy future, there’s no single catastrophe, but rather unremittingly worsening conditions. The characters talk about the “old world” with a resigned nostalgia. More than the other books, this one makes clear how the consequences of climate change will be distributed: that is, perversely. The affluent (disproportionate CO2 emitters) have fled north to more bearable temperatures, while the poor squat in the abandoned houses of the rich. (The Healer’s attacks on his wealthy victims are a pathological attempt to right this wrong.) Tuomainen may be motivated to caution readers against this projected future, but he has also seized on it as a promising setting for a cinema-ready suspense novel.

Similarly, Nathaniel Rich’s Odds Against Tomorrow uses climate change for its own literary purposes, in this case as a kind of metaphor. The book explores the risk and fragility intrinsic to life—and the anxiety that feels particularly acute in information-saturated, godless postmodern life. Climatic disruptions are one menace, but far from the only one. The novel opens with an earthquake that totals Seattle. The hero, Mitchell Zukor, is a neurotic fellow obsessed with calculating the odds of every imaginable disaster, from volcanic eruptions to elevator accidents. Early on, we meet young Elsa Bruner, who’s afflicted by an obscure heart disorder known as Brugada syndrome. Though otherwise healthy, she could die at any moment. Mitchell cannot fathom how she lives with this knowledge. But of course, we all live with essentially the same knowledge. So do we, like Elsa, move to a farm and enjoy life? Or do we, like Mitchell, analyze every action for its statistical threat?

The novel begins to refer in passing to the scorching temperatures in New York and culminates in a cataclysmic storm, presumably intensified by climate change. In the aftermath, Mitchell learns to act, rather than fret, as he starts anew in the Flatlands neighborhood of Brooklyn. He lives alone, planting snow peas, kohlrabi, and burdock. “The long stretches of quiet labor narcotized him,” writes Rich. “He’d be away from the world, yet in it more intimately than he had ever known. Doing: finally.” The disaster has happened, and it is liberating.

Most climate-change fiction is set, for obvious reasons, in the future. But Ian McEwan’s underrated 2010 novel Solar takes place in the present. The anti-hero, Michael Beard, is a Nobel Prize–winning physicist. He is an absurd figure: an overweight, short, balding philanderer. Though not malevolent, he has no control over his appetites. He coasts on his renown—holding honorary posts, giving speeches, lending his name to institutes—while only sex and food retain his interest. For his latest sinecure, he serves as the nominal head of England’s new National Center for Renewable Energy. Beard has no real interest in the issue, but through a series of farcical events, he ends up stealing the ideas of an underling. (This young man is having an affair with Beard’s wife; when Beard catches him in a bathrobe, the man slips on a bear rug and dies.) The ideas concern a new approach to solar energy, and Beard uses them to shift his focus to entrepreneurship—to save the planet and make his fortune at once.

While Beard embarks on his solar-energy scheme, his personal behavior is both literally contributing to the problem (he flies constantly and has become so fat that, on land, only an SUV can comfortably accommodate him) and symbolic of the lack of discipline that has put us in our predicament. As an extremely intelligent creature who exploits those around him and is ultimately self-destructive, he seems to stand in for the entire human species.

• • •

The fundamental story of climate change is simple. Human behavior provoked a change in the weather, unleashing, among other effects, dangerous storms. This story should sound familiar. It’s one of the oldest narratives in the human repertoire. The tale of Noah’s ark is just one variation on the ancient flood myth, in which a deity annihilates the human race for its sins.

Of course, to primitive people, fierce weather must have demanded explanation, and human wickedness supplied a readily available answer. Their more enlightened descendants knew better. They identified other causes: pressure systems and cold fronts and the like. They knew that human actions could not influence the weather.

But now we know even better. The details diverge from the classical apocalyptic narrative. God as the intermediary is absent. And the casualties are not necessarily the culpable. Indeed, as The Healer highlights, the opposite will often be true. But at heart, the story sounds like a myth, which makes it both resonant and hard to believe.

Climate change, and the emerging fiction addressing it, recalls other myths as well. The plane that is so central to Far North evokes Icarus, who flew too close to the sun and was punished with death for his hubris. The Garden of Eden also comes to mind. Our ancestors were born into a climatic Eden: the Holocene, a geological age uncommonly hospitable to human life. Now we have entered what some scientists are calling the “Anthropocene,” a less congenial epoch in which human activity is the most influential force on earth. The products of our knowledge have evicted us from our climatic paradise—what several of these novels refer to nostalgically as the “old world.”

In his Guardian piece, Macfarlane wrote that the literature of climate change should not be apocalyptic, both because such a scenario was not true to the current science and because environmentalists had been burned before by crying apocalypse. Instead, he wrote, such fiction

might require literary languages which are attentive to the creep of change; which practise a vigilance of attention and a precision of utterance (one thinks back to Thoreau, recording the day each year on which Walden Pond first froze, or of Ruskin, in his home on the shores of Coniston, making painstaking daily measurements of the blueness of the sky, to check the effects of air-pollution upon its colour, or of Gilbert White ascertaining the different keys in which owls of different woods hooted).

So far, though, much of the new fiction has gravitated toward catastrophes and end times. And, indeed, the examples Macfarlane cites as models are not novelists but naturalists and critics. In his introduction to I’m With the Bears, the anthology of environmentally themed short stories, Bill McKibben writes that the artist’s role is “to help us understand what things feel like.” But the novel may not be the most appropriate form to convey what climate change, in its subtler, everyday manifestations, feels like now. (McEwan’s book, the only one set in the present, is a satire; however accomplished, it has no emotional resonance.) The experience of seeing the climate gradually change and knowing we are implicated—collectively and in some small way individually—is not especially conducive to dramatic plot.

The writers who have captured the feeling best are nonfiction authors who simply describe it. For example, McKibben himself, back in 1989, in The End of Nature, wrote of his anguish: “I love winter best now, but I try not to love it too much, for fear of the January perhaps not so distant when the snow will fall as warm rain.” Jon Mooallem, in his new book, Wild Ones, watches a video of the final starvation shudders of a polar bear cub in its altered habitat. Witnessing this death “felt incriminating in the most paradoxical way: I felt unsettled by how much power our species is wielding on the planet, and I also felt powerless.”

But the novels discussed here do something valuable, too. They refashion myths for our age, appropriating time-honored narratives to accord with our knowledge and our fears. Climate change is unprecedented and extraordinary, forcing us to rethink our place in the world. At the same time, in looking at its causes and its repercussions, we find old themes. There have always been disasters; there has always been loss; there has always been change. The novels, as all novels must, both grapple with the particulars of their setting and use these particulars to illuminate enduring truths of the human condition. Like Michael Beard, we’ve always screwed up; and like Mitchell Zukor, we’ve always been afraid. And in the end, as ever, we survive the storm, or drown.

I Change, You Change

Posted on: January 21st, 2013 by Rebecca Tuhus-Dubrow

About halfway through her new memoir, “Data, a Love Story,” Amy Webb pauses to address the reader. Up to this point, the author’s online hunt for a husband has yielded little but farcically bad dates. In frustration she begins an analysis involving scatter plots and word clouds to discern the laws of success in online dating. “I want to reveal what I found,” she tells us, “so that you can improve your own dating profile.” (Spoiler alert: showing skin is a plus; lengthy “About Me” sections are a turnoff.) We then follow Webb as she uses her discoveries to lure Mr. Right. And, presumably, we close the book with a sense of how to do the same.

We often read autobiography to glimpse a life unlike our own. Ulysses S. Grant allows us to witness scenes from the Civil War; Patti Smith invites us into her glamorously gritty 1970s New York. A recent memoir cliché is the survivor’s tale of abuse or addiction, vying for maximum woe. We can’t or wouldn’t want to have all these experiences ourselves.

But it’s time to christen a new subgenre: the self-help memoir, a kind of long-form personal narrative fused with life coaching. I change, you change. For the authors of these books, the selling point is not that their challenges are exceptional, but that they are common. Like us, the authors are just trying to find true love or raise good kids or enjoy life more. Nobody is claiming to have braved captivity in the Colombian jungle.

They do, however, claim to bring something special to readers — some wisdom gleaned through happenstance or research — that equips them (and by extension, us) to meet life’s ordinary challenges. Their authority stems not from academic credentials, like the Ph.D.’s and M.D.’s flaunted on the covers of standard self-help books, but from personal experience. The resulting books are how-to guides written in the first person rather than the second, in the past tense rather than the imperative.

Blending autobiography with advice is not entirely new. Authors as diverse as Benjamin Franklin and Helen Gurley Brown have already done it, brilliantly. But such books could now fill an entire Barnes & Noble shelf — and most of the authors strenuously try to keep out of the (perceived second-tier) self-help section. In many cases, the self-help memoirists have actually acquired their experience in order to write about it, a reversal of sorts from the traditional memoir. (This approach also has a name its practitioners may not embrace: “stunt nonfiction.”) The resulting books cater at once to our mania for self-improvement and our gluttonous appetite for first-person narrative.

Consider the queen of the self-help memoir, Gretchen Rubin. Last fall, she published “Happier at Home,” a sequel to her 2009 best seller “The Happiness Project.” Each chronicles Rubin’s methodical efforts to inject her life with cheer. Choosing a different focus each month — marriage, parenthood, work and so on — Rubin makes resolutions like “fight right” and “tackle a nagging task.” We accompany her on a dalliance with acupuncture and on Wednesday excursions with her daughter. She urges readers to follow her example; both books include a chapter called “Your Happiness Project.”

Other authors are less overtly prescriptive. For example, in “No Cheating, No Dying” Elizabeth Weil documents her marriage improvement project, and in “Drop Dead Healthy,” A. J. Jacobs describes his program to become “maximally healthy.” The reader’s self-interest is implied rather than spelled out.

But all three writers share certain distinguishing characteristics. None had dramatic backstories. Rubin was already pretty content, Weil’s marriage was solid, Jacobs simply preferred the couch to the gym. What sets them apart is their improvement designs. They consulted the pertinent studies, carried out experiments and took notes. Weil, for instance, can report that such strategies as “skilled conversation” (taught in a marriage education class) and novel adventures (swimming with her husband from Alcatraz to San Francisco) invigorated her marriage. Though she never counsels her readers, surely most will think on their own relationships while peering in on hers, and perhaps plan their own versions of a daunting athletic escapade.

The self-help memoir hybrid boasts certain clear advantages over its alternatives. Conventional self-help books are embarrassing. Better to be seen on the subway with “No Cheating, No Dying” than “Why Marriages Succeed or Fail: And How You Can Make Yours Last,” by John Gottman, Ph.D. Not only does the latter broadcast your vulnerabilities; it may also offend your sensibilities. Some readers would no sooner pick up self-help than a Harlequin romance, even if both hold content they would appreciate in less stigmatized forms.

The first-person perspective also grants the author credibility, albeit in a different way than those advanced degrees. Like the Amazon reviewers we consult rather than Consumer Reports, like our parenting listservs and favorite cooking blogs, self-help memoirists submit the most persuasive testimony: firsthand reports from people who’ve been there. What’s more, for a writer without a big name or a scandalous past, the promise of guidance excuses the indulgence of autobiography.

Meanwhile, for readers, these multitasking books offer efficiency, combining semi-literary pleasure with practical usefulness. (By the same token, as others have pointed out, many popular science books have a distinctly self-help bent, explaining research that enables readers to become happier, more successful, more creative.) Most important, a personal narrative is a lot more fun to read than a bulleted list of tips and exercises.

The self-help memoir may owe something to a backlash against what Ben Yagoda, in his 2009 book “Memoir: A History,” called “misery memoirs.” (Think of James Frey and Augusten Burroughs.) Critics and readers alike grew weary of the exhibitionism and occasional fraudulence of books like “A Million Little Pieces.” Whereas these hell-and-back accounts traded in total candor about dark secrets, the revelations in self-help memoirs are limited. To varying degrees they mete out information about the authors and their families, but only to the extent that it’s relevant to the subject at hand. And author is subordinate to subject here, whether it’s the quest for happiness or health or a more magnetic online dating profile.

The result is sometimes what may seem an oxymoron: the discreet memoir. Jacobs, for example, writes in “Drop Dead Healthy,” “Julie doesn’t want me saying exactly how often we have sex.” (He does, however, disclose he has hemorrhoids.) Thanks to such tact, self-help memoirs are less likely to incur wrath and lawsuits. Even Burroughs, who was sued by the family depicted in his first book, has changed his ways. His most recent book, “This Is How,” purports to offer “help for the self.”

According to Yagoda, autobiographers in the 18th and 19th centuries routinely began with a preface to justify the seemingly egomaniacal enterprise. They often alluded to Horace’s dictum about poetry — that it ought to delight and instruct. Certainly we all hope to learn something when we read about other lives — whether to enrich our understanding of the world, to find role models or to put our own minor problems in perspective. The difference today is that the lessons of self-help memoirs are more explicit, and more grounded in everyday life. Horace would not likely have envisioned instructions quite as mundane as “tackle a nagging task.”

The journey from wretchedness to redemption is one of the most common narrative arcs in memoir, from St. Augustine onward. But rather than redemption, the self-help memoir culminates in improvement — fitting for a culture in which the most fashionable addictions are lattes and ellipticals. The self-help memoirist goes from suboptimal to systematically upgraded. And so can you.

Natural Woman

Posted on: May 16th, 2012 by Rebecca Tuhus-Dubrow

Breast milk is freighted with more symbolic weight than any beverage should have to bear. The stuff signals femininity, fertility, and naturalness. It is thought to possess quasi-magical properties: to protect babies from infection, to enrich their brains, and to endow them with devastating good looks. (Okay, that last one hasn’t been studied yet, but watch out for a future issue of JAMA.) To a certain subset of feminist, breast milk represents the awe-inspiring abilities—to sustain life, to nurture—that distinguish the female sex.

To another kind of feminist, the substance is considerably less worthy of reverence. It embodies the duties that biology has foisted on women, keeping them from more interesting pursuits. Simone de Beauvoir, for instance, viewed breastfeeding as “exhausting servitude.” For this kind of feminist, scientific advances, such as formula, can free women from the shackles of the nursery.

The philosophical dispute over nursing encapsulates a larger debate about how women should live: Embrace a distinctively female set of values, focused on care and connection? Or strive to join men on their turf?

Elisabeth Badinter is firmly in the second camp. A French philosophy professor, she has written a number of contrarian books about womanhood, including Mother Love: Myth and Reality (1981), which challenges assumptions about the maternal instinct. In 2010 Le Conflit: La Femme et La Mère, her most recent salvo, became a bestseller in France. She is considered one of that country’s most influential intellectuals and has also achieved an unusual degree of celebrity in the United States, with profiles in the New York Times and the New Yorker.

Now an American version of Le Conflit has been released, with the title The Conflict: How Modern Motherhood Undermines the Status of Women. In the book, Badinter lambastes the return of motherhood to the center of women’s lives, a shift she observes throughout the West. She examines a wide assortment of policy and cultural factors at play since the 1970s. But her chief culprit is an ideology she calls “naturalism,” the belief in the infallible wisdom of nature. She sees naturalism at the heart of breastfeeding absolutism, as well as other trends, such as un-medicated childbirth and cloth diapers. In the name of nature, all of these deprive women of conveniences that could ease the burdens of motherhood. “Nature has become a decisive argument for imposing laws or dispensing advice,” she writes. “It is now an ethical touchstone, hard to criticize and overwhelming all other considerations.”

Badinter is right to call out the excesses of contemporary motherhood in some quarters. She is also right that knee-jerk allegiance to the “natural” warrants interrogation. But too often, she fails to persuasively defend the particulars of her argument, and her rejection of ambiguity hurts her case. At least in an American context, Badinter’s charges inflate the importance of her favorite targets while under-emphasizing some of the guiltiest parties.

• • •

Badinter traces the advent of naturalism to a backlash against modernity, sparked by the 1973 fuel crisis. She sketches the emergence, in the 1970s and ’80s, of three schools of thought, each of which glorified the natural and which together produced an ethos investing nature with moral authority. Ecology held that instead of mastering nature for human purposes, we must defer to it. Behavioral science claimed to vindicate old beliefs about the maternal instinct, based on the study of mammals and hormones. Finally, “essentialist feminism” stressed the differences between the sexes rather than their similarities, celebrating menstruation, childbirth, and other experiences unique to womanhood. Badinter’s definition of naturalism, then, yokes together quite distinct meanings of “nature,” uniting environmentalism with belief in innate gender differences.

Her criticism of breastfeeding dogma fits this argument best. All of the factions singled out above are aligned. In behavioral science, the well-regarded anthropologist Sarah Blaffer Hrdy, among others, has praised nursing for triggering in mothers virtually orgasmic hormonal releases that forge crucial bonds with babies. Many feminists, chafing at over-medicalization, embraced breastfeeding along with natural childbirth. And Badinter cites a more recent environmentalist rationale: breastfeeding saves water, plastic, and other resources needed for formula.

Perhaps most relevant to mothers, the most esteemed medical institutions, such as the World Health Organization (WHO) and the American Academy of Pediatrics, are also on board. The WHO recommends exclusive breastfeeding for the first six months, followed by the introduction of solid foods while “breastfeeding continues for up to two years of age or beyond.”

But a sober look at the data reveals a more nuanced picture of breast milk than its most diehard champions allow. Some of the purported virtues are confirmed, Badinter reports, but others, such as the alleged cognitive benefits, are more debatable or have been debunked. She argues that considering improvements over time in recipes, formula now rivals breast milk. (American readers may already be familiar with these points from Hanna Rosin’s widely discussed 2009 Atlantic piece, “The Case Against Breast-Feeding.”)

Given the marginal benefits and the commitment required by nursing—often likened to a full-time job—the pressure to breastfeed can be unreasonably intense. In certain milieus, formula is regarded as hardly more appropriate for babies than vodka. Many women give up professional momentum to nurse, while others, without the luxury to cut back on work, resort to bottles and are haunted by feelings of guilt. This is to say nothing of adoptive parents, mothers whose bodies do not produce enough milk, or women who find nursing prohibitively uncomfortable or onerous.

But while Badinter is surely right that the “naturalness” of breastfeeding facilitates its consecration, the driving concern behind nursing zealotry, at least in the United States, is less for fidelity to nature than for children’s health and success. The prevailing rhetoric stresses that modern science has proved breast milk’s substantial boons for children. The maternal bonding is no doubt a plus, but given that women are encouraged to pump when they can’t offer a nipple, this aspect seems secondary. And it’s safe to assume that few mothers see breastfeeding principally as a green lifestyle choice, akin to driving a Prius or taking shorter showers.

Indeed, Badinter’s attempt to implicate environmentalism in modern motherhood is the most strained part of her argument. Consider her attack on the use of cloth diapers. Cloth diapers, Badinter writes, have become an all but compulsory choice for today’s progressive mother. Huggies and Pampers, after all, are to blame for the massacre of trees and prodigious landfill waste. The decisive strike against them: Greenpeace-sponsored tests found traces of dioxin in some disposable diapers. (She doesn’t explain what dioxin is, but she doesn’t need to; you can guess that you don’t want it next to your baby’s private parts.) Of course the cloth kind cannot be tossed and forgotten—that’s the point—but must be laundered again and again. Hence, “Yet another new task awaits the ecologically minded mother.”

Put aside the fact that, unlike breastfeeding, diaper laundry is not intrinsically a woman’s job; we’ll grant that women are disproportionately stuck with it. But Badinter exaggerates the phenomenon. At least in the United States, pressure to opt for cloth is nonexistent. According to the only statistic she invokes—presumably the most impressive she could find—20 percent of English babies “regularly or occasionally” sport cloth diapers. These numbers tell us very little, since they include occasional use and do not reveal the reasons behind the choice. She also declines to grapple with the issues at stake, insinuating, but not explicitly contending, that to care about all those dead trees is rather frivolous.

More damaging to her argument, she conflates anxiety about children’s health with a broader commitment to the environment. Judging by Badinter’s own facts and figures, a fairly small minority of parents use cloth diapers, and a good slice of them may be motivated largely by fear of exposing their babies to chemicals. Progressive mothers may well want to save the trees and the whales, but their environmentalist sympathies conflict with their parenting choices more often than they dovetail. Witness the profusion of baby gear in a typical nursery and the custom of chauffeuring children unceasingly to soccer practice and swim meets. All are part of an ethos that extends well beyond weaning and potty training, in which the child’s perceived interests dominate family life.

This single-minded focus on children’s health and flourishing leaves little room to think about the bigger picture. In a 1980 journal article, social critic Robert Crawford used the term “healthism” to refer to a new preoccupation of the middle class with personal health and wholesome lifestyles. He also drew a connection between healthism and political disengagement. A sense of impotence—“I can’t change the world, but at least I can change myself,” as Crawford put it—fed the mania for vitamins, exercise, herbal supplements. And in turn, as people poured more energy into their own health, they had less time and inclination to invest in civic or political involvement. Since 1980 this outlook does not seem to have abated, to say the least, and for parents it applies doubly to their children. In shaping contemporary parenthood, this retreat to the private sphere has been at least as important as a retreat to nature.

• • •

Badinter’s argument about the retrograde state of motherhood is meant to apply throughout the industrialized world. But, of course, there are profound differences across nations and classes, which she does discuss (the former much more than the latter). Ironically, the country in which her theory applies least is her own. For example, according to her statistics, as of 2010 just over half of French mothers breastfeed at birth, while about 15 percent do so at three or four months. At six months the numbers are so minimal that France is not even represented on the charts. By contrast, 75 percent of American mothers try nursing, and 24 percent are still nursing a year later.

In the past few years, a spate of American writers has brought tidings of French parenthood as if sending awed dispatches from utopia. In her 2005 book Perfect Madness: Motherhood in the Age of Anxiety, Judith Warner reminisced about her time in Paris, where both of her daughters were born. She availed herself of an affordable, state-subsidized nanny; she was surrounded by relaxed mothers who guiltlessly worked and took time for themselves. When she returned to the United States, she found evenings dominated by Girl Scout meetings, marriages corroded by resentment, and promising careers derailed. Pamela Druckerman’s recent bestseller, Bringing Up Bébé: One American Mother Discovers the Wisdom of French Parenting, reports that French children sleep soundly, eat sophisticated cuisine, and behave politely to adults, among other miracles. Meanwhile, French mothers look fabulous and keep the spark alive in their marriages. The common theme of both books is that French mothers refuse to let their children rule their lives—and yet they are in some ways more effective parents than their American counterparts.

Policy and culture, inevitably intertwined, both play a role in these national differences. Warner and Druckerman marvel at the French government’s support for families, whether by implementing a sliding price scale for high-quality daycare or furnishing monthly cash grants to parents. Badinter recognizes the importance of policy, but she offers an additional explanation of the French anomaly. At least since the 1600s, she writes, French women have not identified primarily as mothers. Aristocratic women handed off their children to wet nurses and, later, governesses and boarding school. By the 1700s, all but the poorest families employed wet nurses. Childcare responsibilities were considered “embarrassing”; upper-class women were expected to prioritize their husbands and social lives. Rather than waiting on children, they dispensed witticisms at salons. With roots in this history, Badinter argues, French women today feel exempt from many of the demands and the self-reproach that torment women elsewhere. (Mercifully, though, they seem to be more attentive to children than their ancestors were.)

In France, according to Badinter’s account, a laid-back approach to motherhood confers prestige. In the United States, by contrast, well-off, highly educated women today subscribe to a cult of intensive motherhood. Like shopping at Crate and Barrel and knowing where to get artisanal cheese, obsessive mothering is a class marker.

What accounts for the rise of this model in the United States and (probably to a lesser extent) in some other countries? In addition to her main argument about naturalism, Badinter suggests that mothers today are reacting against the more distant parenting style many of their own feminist mothers practiced. She also advances what could be called the Whac-A-Mole hypothesis of sexism: as women have made gains in universities and boardrooms, oppression has reemerged in a different site. There is probably some truth to both theories. But whatever their origins, once norms of motherhood solidify, they are self-perpetuating; few stigmas are more severe than that of the bad mother.

Judith Warner provides another key explanation: in the increasingly insecure, winner-take-all landscape of the United States, parents are frantic to ensure their offspring’s place in the nation’s hierarchy. They feel intense pressure to give their children every ostensible advantage—from breast milk’s nutritive value to Harvard-worthy extracurricular résumés.

France’s government offers a measure of protection against this insecurity. The state’s relevance is not limited to what we think of as family law. Certainly the United States needs much better paid-leave policies and support for childcare. But we also live in a society of subpar public schools, gaping economic inequality, and under-regulated toxins, without guaranteed health care, in which people have lost faith in the political process. All-consuming motherhood is easy to mock—and sometimes richly deserving of mockery. But there are reasons parents feel anxious about their children’s health, safety, and prospects. Parents harbor little hope of changing the conditions that affect everyone’s children. They fixate instead on their own. The result is a vicious circle.

If there’s any bright spot—and any counterpoint to France envy—it’s American men. Badinter writes, “With the baby back to being exclusively the mother’s concern, the father is once again free to attend to his own affairs with a clear conscience.” That statement is patently false in much of the United States. As Liza Mundy’s new book, The Richer Sex, demonstrates, a growing number of American women are primary breadwinners while more dads are staying home with kids. Even in relatively conventional households, fathers spend more time with their children than their own fathers did, as well as more time on housework. The sight of a man pushing a stroller or folding the laundry is no longer exotic. We’re still far from an egalitarian ideal, but on these fronts we’re doing better than France.

That men are stepping up in the home is welcome news. But it doesn’t fully address the heart of the issue. To some extent, it means that the albatross of modern American motherhood is really the albatross of modern American parenthood. (Men now also buy more cosmetic products than in the past, but that kind of equality doesn’t necessarily represent progress.) The government has woefully underserved families, making parenting a more overwhelming job than it needs to be. Parents share some blame too: among many upper-middle-class families, legitimate concern about children’s welfare slides into a game of signaling status and fitting in. While anthropologists, feminists, and environmentalists may have helped inform current norms, the figures that really matter are more familiar: Uncle Sam and the Joneses.

Traveling Light

Posted on: November 1st, 2008 by Rebecca Tuhus-Dubrow

To the extent that the two are separable, Ryszard Kapuscinski is revered as much for his legendary persona as for his work. Until his death last January at seventy-four, the Polish journalist and badass embodied the glamour of the uncompromising foreign correspondent. For decades he rushed toward the places everyone else wanted to flee, to be there for the war, the coup, even the humdrum poverty. Rather than interviewing officials, his preferred research method was idling in bars with locals. And he recorded his impressions with acuity and lyricism, in books that are often said to transcend the limits of journalism. Kapuscinski himself espoused this view in a 1987 Granta interview, describing his genre as “a completely new field of literature,” whose subject was “what surrounds the story. The climate, the atmosphere of the street, the feeling of the people, the gossip of the town, the smell; the thousand, thousand elements of reality that are part of the event you read about in 600 words in your morning paper. . . . This is more than journalism.”

More, but perhaps also less. Critics charge that, due to a combination of sloppiness and fabulism, his books are littered with inaccuracies. Two well-respected reporters, John Ryle and Michela Wrong, have documented errors ranging from the customs of tribes to the character of a given town to the reading habits of Ethiopian despot Haile Selassie. Kapuscinski enthusiasts include unshaven backpackers and esteemed novelists; his detractors tend to be people who know the regions he covered, whether ordinary residents or foreign experts.

In the end, both assessments are right: Kapuscinski transcended and violated the limits of journalism, and is probably best read as a storyteller. The Emperor, his postmortem of Haile Selassie’s dictatorship, is, if not an entirely reliable guide to Ethiopian history, an incisive and mesmerizing study of how tyranny holds together and falls apart. It evokes the choreography of everyday life in the palace, through accounts of former servants, identified only by their initials, and their extravagantly specialized duties. (One source reports that for ten years, he wiped the urine of Haile Selassie’s dog, Lulu, from the feet of visiting dignitaries.) In addition to the manifestations of the emperor’s power, this slim masterpiece also examines his techniques for sustaining it, such as rewarding loyalty while punishing talent.

Several of Kapuscinski’s other books—The Soccer War, Imperium, The Shadow of the Sun—provide a series of dispatches rather than cohesive narratives. They are inconsistent—sometimes swaggering, sometimes tedious—but they frequently crackle with arresting images, kinetic prose, and surprising insights. Substantial stretches are devoted to ostensibly straightforward information about various parts of the world. Given Kapuscinski’s credibility problem—if enough facts are wrong, all of them are suspect—the resourceful reader will skim these sections and relish the anecdotes.

In The Soccer War, the author finds himself on the front lines of a war between El Salvador and Honduras, with a bewildered Honduran conscript. As they roam among corpses in the forest, this young man has an epiphany: he will take the boots of the dead, who no longer need them, for his unshod family. “He had already calculated that he could trade one pair of army boots for three pairs of children’s shoes, and there were nine little ones back home. . . . Now the war had meaning for him, a point of reference and a goal.” Ignoring Kapuscinski’s growing alarm as they linger on the battlefield, the soldier collects the boots and hides them behind a bush, to retrieve later. Whether or not this vignette is literally factual, it succinctly captures a true-to-life scenario. A funny, absurdist, dreadful story, it is classic Kapuscinski.

KAPUSCINSKI’S FINAL book, recently published in English as Travels with Herodotus, is a departure from his previous work. A memoir of youth, it contains very little that can be characterized as journalism. Rather than integrating his stories with reportage, he weaves them in with excerpts from The Histories of Herodotus. In fact, the book is in part an extended reflection on journalism, through Herodotus, whom he considers the first foreign correspondent and his role model. For the first time in his oeuvre (at least, as far as this linguistically limited reviewer can tell), Kapuscinski gestures toward confiding in his readers. Before, he always sought to tell us about some part of the world; now, he wants to say something about how to be in the world. In short, it seems like the work of a man aware of his approaching death.

As a young reporter, Kapuscinski recalls, he was filled with a craving to travel. Although his ambitions were modest—he dreamed only of crossing the Polish border—he was dispatched to India by his newspaper. Before the trip, his editor gave him a copy of Herodotus, which became his dog-eared companion as he collected more and more stamps on his passport.

The version of himself he presents here, unlike the intrepid adventurer of his earlier books, is an overwhelmed and earnest youngster from the wrong side of the Iron Curtain. On a stopover in Rome, en route to India, he musters the gumption to go out alone one night, wearing clothes his hosts have helped him to purchase. In lieu of his “à la Warsaw Pact 1956” getup—a yellow nylon shirt and green plaid tie—he sports “a new suit, an Italian shirt white as snow, and a most fashionable polka-dotted tie . . .” But he nevertheless senses the stares of the Italians, for “there must still have been something in my appearance and gestures, in my way of sitting and moving, that gave me away—betrayed where I came from, from how different a world.”

The source of his ineffable difference—his upbringing in a communist country—had also given him a dogmatic belief in equality, which he gently mocks. In India, he declines to hire a rickshaw driver: “[T]he very idea of sprawling comfortably in a rickshaw pulled by a hungry, weak waif of a man with one foot already in the grave filled me with the utmost revulsion, outrage, horror. To be an exploiter? A bloodsucker? . . . Never!” He pushes the desperate drivers away. “They were astounded—what was I saying, what was I doing? They had been counting on me, after all. I was their only chance, their only hope—if only for a bowl of rice.” But he walks on impassively, smugly proud of his refusal.

The book alternates between these endearing and well-spun reminiscences—after India, we follow him to China, then Egypt, Algeria, Senegal, and other destinations—and long passages quoting, summarizing, and reflecting on Herodotus. Kapuscinski was riveted by the volume, he tells us, sometimes more absorbed in its pages than in his own surroundings. Herodotus, born circa 484 B.C.E., left his home in ancient Greece to explore the world, to soak up maximal knowledge of other cultures, and then to communicate it. He took a special interest in wars and their causes. Kapuscinski, via Herodotus, delves into the rash and savage battles of ancient times. We are treated to plenty of creative maiming and killing, which might make us feel civilized by comparison, and to cautionary tales of hubris—of leaders recklessly taking their nations to doomed wars—which swiftly temper any confidence in progress.

KAPUSCINSKI APPLIES the same keen, imaginative attention to reading Herodotus as he always has to his own experiences. After a series of imperial victories, Cyrus, the Persian ruler, sets off across the desert for further conquests. Kapuscinski contemplates the luxuries enjoyed by the king, who, even on his journey, would drink water only from the River Choaspes. “I am fascinated by this water,” he writes. “Water that has been boiled ahead of time. Stored in silver vessels to keep it cool. One has to cross the desert freighted with those vessels.” As ever, he homes in on the exercise of power and the exploitation of underlings. “How often do we consider the fact that the treasures and riches of the world were created from time immemorial by slaves? From the irrigation systems of Mesopotamia, the Great Wall(s) of China, the pyramids of Egypt, and the Acropolis of Athens . . .?”

Kapuscinski contemplates not only Herodotus’s book but also the ancient author himself. “What sent him into motion?” Kapuscinski wonders. “I think that it was simply curiosity about the world. The desire to be there to see it at any cost, to experience it no matter what. It is actually a seldom encountered passion,” he muses. Kapuscinski slyly treads a fine line between holding Herodotus up as a hero and implicitly naming himself the Greek’s modern-day successor.

Through such meditations, Kapuscinski appears to want to expound some lessons. He advocates childlike curiosity about the world, counseling readers to resist provincialism. Provincialism, he reminds us, can be a matter not only of space but also of time. One ought to explore the contemporary world through travel and the past through reading.

Pondering the art of reportage, he concludes that Herodotus must have listened and observed attentively, must have had a voluminous memory, unspoiled by Internet or even library access. He notes approvingly that Herodotus always made an effort to check his facts, scrupulously distinguishing between what he had witnessed personally and what he had heard from sources. Curiously, he never mentions the charges that Herodotus was, in addition to the “father of history,” the “father of lies”—accusations similar to those leveled at Kapuscinski himself. He addresses neither set of charges overtly, but at times one wonders whether he is indirectly defending both of them against their critics.

Although it never comes close to the heights of Kapuscinski’s best work, much of Travels with Herodotus is worth reading. The vignettes from his past are charming, and he treats Herodotus’s world with inquisitive intelligence. The chronologically provincial among us ought to be grateful to be dragged back to antiquity, where we wouldn’t ordinarily venture without a contemporary guide. But the wisdom he offers is bland and feel-good. Herodotus’s greatest discovery? “That there are many worlds. And that each is different. Each is important.”

LIKE THIS paean to multiculturalism, unsophisticated in the context of today’s debates about clashing values, his valentines to travel and reporting seem willfully blinkered. In Kapuscinski’s rendering, he and Herodotus were motivated strictly by curiosity and the wish to share the knowledge they accumulated. Ambition and personal glory are absent from the picture. So are flaws and failures. He concedes that, due to unavoidable limitations, neither he nor Herodotus was able to verify every last fact or to witness every event firsthand. But the fault never lies with the reporter; laziness, arrogance, and the desire to scoop rivals do not exist in his world. What’s more, being good at either reporting or literature—and Kapuscinski was, to perhaps inversely proportional degrees, good at both—also requires a sort of exploitation: people must be treated as sources or material. Kapuscinski neglects to grapple with these complexities.

Such realities afflict the calling in the best of circumstances. But Kapuscinski was from a repressive society, which made the costs steeper still. The Polish government recently revealed that Kapuscinski worked as an informant for the Communist secret police in exchange for his travel privileges, although he evidently provided them with nothing of value. How should we assess the bargain he struck? Unlike Günter Grass, Kapuscinski did not attempt to control his public relations with a preemptive confession. But in his own examination of tyranny, he never passes judgment on citizens who are forced to diminish themselves to get by. After all, even we—Americans living under Bush/Cheney—are arguably implicated in the crimes of our government. Under any appalling regime, only the most self-sacrificing purists wash themselves of complicity. Think of Thoreau, who refused to pay taxes to a government that supported slavery, asking Emerson from behind bars, “The question is, Waldo, what are you doing out there?” That said, such tactics are not necessarily the most effective.

To my mind, only hypocrites and absolutists will condemn him. Which is not to say the revelation is off-limits in considerations of his work. Indeed, it furnishes some of the messy context that he chose to omit in Travels with Herodotus. The reports he wrote make a discomfiting foil for his other writing, and especially for the philosophy he espouses here. In the reports, rather than observing the world out of zeal for experience, he was reduced to collecting information for basely utilitarian purposes. Instead of writing to enlighten ordinary readers, he wrote for an elite intent on oppression. Even if, to his credit, he deliberately withheld damaging information, the arrangement was a compromise. But the reports were the price he paid for his published work, much of which indisputably evinces curiosity and sensitivity. Kapuscinski reminds us in this book that many of the world’s great wonders relied on slave labor; marvelous products often necessitate a behind-the-scenes process that’s less pretty.

In Travels with Herodotus, then, he hasn’t told the whole story. Even if it is more intimate than his other books, even if he has meted out disclosures of loneliness and vulnerability, he has ultimately maintained a strict reserve. He was not obliged to be confessional, of course. But he would have left us with a much richer last testament had he chosen to probe more deeply—if not into his personal life, then at least into the investigation of traveling and writing, the discomforts and concessions that go beyond squalid hotels.

At the end of the book, on a trip to the Greek island where Herodotus was born, Kapuscinski returns to his hotel, where a young Turkish girl is at the reception desk. “When she saw me, she adjusted her facial expression so that the professional smile meant to invite and tempt tourists was tempered by tradition’s injunction always to maintain a serious and indifferent mien toward a strange man.” This is the last sentence Kapuscinski leaves us with, a strikingly inconclusive yet fitting ending. A perception of a person, linked with an observation about a foreign culture. And a reference to himself—as he was so often in his travels, and in the end, even to his readers—more strange than known.