Archive for the ‘Article’ Category

The Leased Life

Posted on: January 24th, 2010 by Rebecca Tuhus-Dubrow

In June 2008, when Punsri Abeywickrema was working in his backyard in San Mateo, Calif., he found himself in need of a wheelbarrow. He didn’t own one, and because of space constraints he preferred not to buy one. His neighbor did own a wheelbarrow, and Abeywickrema had borrowed it the previous weekend, but he hesitated to impose on the neighbor again.

He ended up renting a wheelbarrow from a store. But then he wondered, what if he had instead offered to pay his neighbor a small fee to borrow the wheelbarrow? Abeywickrema would have fulfilled his need without acquiring a cumbersome object or feeling like a freeloader. The neighbor, meanwhile, could have reaped a modest windfall. This thought led to an inspiration – wouldn’t it be great if a whole network of residents in his area could conduct similar transactions, with locals they didn’t even know yet?

Thus was born Rentalic.com. Like so many other websites, it connects mutual beneficiaries – in this case, people who own things they don’t use much with people who want to use things without owning them. Members can post either belongings they have to offer or goods they are hoping to find. Items recently listed include a body fat scale ($5 a week), a bread maker ($1.75 a day), and a cupcake transporter ($3 a week). The site was launched last October with limited membership, and opened to the public earlier this month.

Rentalic is an example of what is sometimes called (rather awkwardly) a “product service system.” The essential insight is that in many purchases, we don’t want the thing per se – we want what it can do for us. You don’t crave a lawn mower, you want shorter grass; the desire is not for a refrigerator but for cold, unspoiled milk. And according to an emerging line of thinking, there are great benefits in meeting the customer’s needs in creative ways that don’t necessarily entail ownership.

The concept has long been familiar in certain sectors – if you’ve rented a car, joined a gym, or subscribed to Netflix, you’ve taken part in a product service system. But now, advocates hope to expand the principle to many other contexts where owning is currently the norm. If these systems were to catch on, we would rent, borrow, or lease a variety of things, ranging from tools to textbooks to snow blowers, from individual owners or from companies with revamped business models. Other firms, while continuing to sell their products, would address customers’ underlying needs more directly – selling warmth or “comfort services,” for instance, rather than oil or gas – which would presumably lead to enhanced efficiency.

In addition to Rentalic, a flurry of similar websites has recently cropped up, facilitating lending and renting within communities. Leasing between businesses has become increasingly common, and several local governments are exploring bike-sharing initiatives. And in academic and policy circles, some analysts are studying whether businesses can shift on a large scale to offering services instead of simply trying to sell as many products as possible.

The prospect of such a change is intriguing. Not only could it mean more savings and less accumulation of stuff for the relatively wealthy; for poorer people, especially in developing countries, it could mean access to goods that would be otherwise unattainable. The other major promise is environmental. Under the current ownership ethos, manufacturers face perverse incentives: If their wares last a long time, they undermine their future marketing opportunities. But if a company retains ownership itself, the argument goes, it will be motivated to make products that are truly durable – or easy to upgrade or recycle into new ones. And if a single product serves multiple users, fewer goods would need to be manufactured and ultimately discarded. All of this would mean diminished resource extraction from the earth and less trash dumped into landfills.

“People have this mindset that being green is expensive,” says Abeywickrema, the Rentalic founder. “Electric cars and solar panels are not accessible to the majority of people. People don’t realize that reusing and sharing can really help the environment as well as helping people. If lots of people in society adapt to this kind of thing, where we share with each other, we could make a big difference in the environment very fast.”

That said, product service systems are not necessarily greener – for instance, if they involve a lot of travel for delivery, the net impact could be environmentally negative. Likewise, renting doesn’t always make financial sense; if you rent something enough times, of course, you’ll exceed the cost of buying it. To be viable and sustainable, these systems must be carefully crafted to add value for customers and to avoid environmental pitfalls.
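The break-even point behind that caveat is simple arithmetic. Here is a minimal sketch in Python, using invented prices rather than anything from Rentalic’s actual listings:

```python
# Hypothetical rent-vs-buy break-even calculation (all prices are made up).
def break_even_rentals(purchase_price, rental_fee):
    """Number of rentals at which cumulative rental fees match the purchase price."""
    return purchase_price / rental_fee

# A $70 wheelbarrow rented at $5 a weekend stops paying off, for the renter,
# after roughly 14 weekends of use.
print(break_even_rentals(purchase_price=70, rental_fee=5))  # -> 14.0
```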

But the biggest challenge may be cultural. Americans are accustomed to having many possessions – it is a sign of status and identity, and we take the convenience of constant access for granted. In the United States, notes Michael Braungart, one of the concept’s pioneers, “You don’t want to use something that someone else used before.”

Proponents of this idea tap into a longer line of alternative thinking about consumption: Environmentalists have always promoted reuse and deplored waste. But in the 1990s, Braungart, a German chemist, and William McDonough, an American architect, began to write more specifically about relevant business strategies. Though we tend to use the word “consumption” to refer to all purchases, Braungart and McDonough drew a distinction between “consumables” and “products of service.” Items in the former category – tomatoes, Chapstick – are used up by their owners; they are indeed consumed. The second category, though, consists of products that endure – cars, refrigerators, vacuum cleaners – and we buy them for the results they provide (transportation, preserved food, floors free of dust bunnies).

These products of service, they argued, should be leased by the manufacturers, who would ultimately recover them, and then either reuse or recycle them. This model would give companies more incentive to make high-quality goods and to minimize undesirable elements such as chemicals. Other ecological thinkers, such as Paul Hawken and Amory Lovins, latched onto this notion and developed it.

Around the same time, a business case for services was also emerging. In the early 1990s, management literature began to urge companies to incorporate services into their business plans. In part, because goods have become so easy to produce, services offer a way for a firm to distinguish itself from its competitors. What’s more, services offer a more stable cash flow, because they are less susceptible to swings in the economy, and they often have higher profit margins. They also allow companies to cultivate stronger ties with customers.

“The whole idea is that providing a service, I will develop a relationship with you,” says Rogelio Oliva, a business professor at Texas A&M University. “If I have an ongoing relationship with you, I have a better understanding of your needs.”

In the past few years, several factors have converged to advance the idea of product service systems. Concern about the planet has risen, as the business world has continued to gravitate toward services. More recently, in the economic downturn, renting and borrowing have gained new currency.

The UN Environment Programme has begun to promote product service systems. It advocates government support, for example by using tax policy to favor services. Green blogs such as Treehugger and Worldchanging have enthusiastically embraced the concept, championing various examples of it. And some academics and designers are starting to think about ramifications for product design. A Swedish university, the Blekinge Institute of Technology, will launch a master’s program in Sustainable Product-Service System Innovation this fall.

So far, businesses have been especially receptive to leasing from other businesses, in so-called b-to-b transactions. GE leases medical equipment, Xerox leases machines, and Rolls Royce leases turbines, all to other companies. Businesses may be less attached to ownership than individuals are; indeed, lower capital investments are preferable, because they lead to higher return on assets, a metric of profitability.
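The return-on-assets logic is easy to see with a toy calculation; the sketch below uses invented figures simply to show how the ratio behaves when equipment is leased rather than owned.

```python
# Return on assets (ROA) = net income / total assets. Numbers are hypothetical.
def return_on_assets(net_income, total_assets):
    return net_income / total_assets

# Buying equipment adds it to the balance sheet; leasing keeps assets lean,
# so even with somewhat lower income the ratio can look better.
print(return_on_assets(net_income=1_000_000, total_assets=10_000_000))  # buys equipment: 0.10
print(return_on_assets(net_income=900_000, total_assets=7_000_000))     # leases instead: ~0.13
```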

Another variation of this idea does not renounce selling products, but redefines business goals so that selling more is not necessarily better. In a 2007 article in the MIT Sloan Management Review, “Sustainability through Servicizing,” Sandra Rothenberg, a business professor at the Rochester Institute of Technology, described several companies that helped their customers reduce consumption. For example, Gage, a chemical supplier working with Chrysler Corp., took on responsibility for making sure its materials were used correctly and efficiently, which reduced the quantity of cleaning solvents sold. There are various ways to structure such arrangements. The two parties might agree on a flat fee for a given service, which gives the supplier an incentive to maximize efficiency. Or they can resolve to cut consumption as much as possible, and split the savings between them.
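The incentive shift behind these arrangements can be illustrated with a little arithmetic. The sketch below compares a supplier’s profit under the two contract structures Rothenberg describes (a flat fee for the service, or per-unit sales plus a share of the customer’s savings), using entirely hypothetical numbers rather than figures from the Gage and Chrysler arrangement.

```python
# Illustrative only: contract structures as described in the article, hypothetical numbers.
def flat_fee_profit(fee, units_used, unit_cost):
    """Supplier charges a fixed fee and bears the cost of the material supplied,
    so every unit of solvent the customer avoids using adds to supplier profit."""
    return fee - units_used * unit_cost

def gain_share_profit(baseline_units, units_used, unit_price, margin=0.2, share=0.5):
    """Supplier still sells material (earning `margin` on each unit sold) but also
    receives `share` of the customer's savings relative to a baseline usage level."""
    savings = (baseline_units - units_used) * unit_price
    return units_used * unit_price * margin + share * savings

# When the savings share outweighs the forgone per-unit margin, both contracts
# reward the supplier for helping the customer consume less:
for used in (1000, 800, 600):
    print(used,
          flat_fee_profit(fee=50_000, units_used=used, unit_cost=30),
          gain_share_profit(baseline_units=1000, units_used=used, unit_price=40))
```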

“In my research on sustainability, I realized that there’s no way we can get there without dealing with consumption,” says Rothenberg. “Most theories don’t really adequately address this issue of consumption. They make technologies greener, but they don’t say, OK, let’s help you not buy this in the future.”

But there are limits to the approach. One of the most celebrated examples, a leasing service offered by the carpet company Interface, is, on closer inspection, a cautionary tale. In the mid-1990s, the company’s CEO launched this service as part of a larger sustainability program. Recognizing his company’s large environmental footprint, he hoped to lease carpets to other companies, then reclaim the materials and turn them into new carpets. Interface framed the option as “floorcovering services,” including upkeep. But, while many customers were intrigued, they would almost always back out at the last moment, opting to buy instead. According to professor Oliva, who conducted a case study of the program for Harvard Business School, the business model simply didn’t make sense. It was cheaper for companies to buy the carpets and hire their local cleaning crews, who charged less for maintenance. And the companies were reluctant to commit to a contract that would curtail their options for the duration of the lease.

Oliva says, “The first question you have to ask when making this transition is: Is there a business model there? Can I make money?”

What about individual consumers, or “b-to-c” transactions? The Cambridge-based company Zipcar is a perfect example of a product service system that appeals to customers for economic, environmental, and lifestyle reasons. Members pay a small annual fee, guaranteeing them access to cars in their cities. They pay an additional hourly rate when using the cars. This arrangement attracts many urban, environmentally conscious residents who don’t want the hassle or carbon emissions of owning a car, but do want automobile access for an occasional day trip or shopping excursion. Other businesses have begun renting designer clothes and handbags – think of Jennifer Hudson’s character in the “Sex and the City” movie.

This idea has gained more traction in Europe than here. Braungart has worked with several companies in Europe to recast products such as windows and chairs as services. In the near future, he says, even shoes will be marketed this way. (In his version of the idea, companies would recycle their products rather than rent them out again – so from the customer’s perspective it’s not so different from owning for a defined period.) To the uninitiated, the terminology he uses can sound a little like a spoof. “You’re no longer selling the window, you’re selling the service of insulation and looking through,” he says. “You’re no longer selling a chair, you’re selling sitting insurance.” Within a couple of years, he promises, “We will have shoes on the market where you sell two years of feet transportation insurance.”

Then there’s the more DIY trend of individuals renting and lending to other individuals. Along with Rentalic, new websites devoted to this purpose include GoGoVerde, Neighborrow, Bright Neighbor, Wecommune, and NeighborGoods. Users are sharing books, DVDs, gadgets, and appliances, from mundane necessities to quirky curiosities.

Janelle Orsi, coauthor (with Emily Doskow) of the recent book “The Sharing Solution,” calls the websites “some of the best evidence that there’s a groundswell that people are interested in sharing. One of the barriers is the hassle of having to find somebody. We all love convenience. These websites are making it so easy and even giving financial incentives.”

The massive number of products made and thrown away has long posed environmental problems. But economists have warned that consumption is essential to prosperity. This new approach could offer an innovative way to resolve that conflict. Spending would still occur, but fewer physical resources would be required to meet the same needs.

Our economy has already shifted dramatically over the past few decades toward the service sector. This happened mostly thanks to forces in the global economy – specifically the availability of cheap manufacturing labor abroad. But once the change was well underway, Oliva says, business and academia began to observe the advantages of offering services. Increased “servicization” could represent the next logical step in this direction.

“If we can get people and companies more focused on usership rather than ownership, we’d be able to extend the life of a lot of products,” says Stephen W. Brown, executive director of the Center for Services Leadership at Arizona State University. “More and more consumers are going to realize that that’s what’s really important.”

The Power of Positive Deviants

Posted on: November 29th, 2009 by Rebecca Tuhus-Dubrow

In 2001, Muhammad Shafique arrived in the Haripur District in Pakistan, a region known for its traditionalism and wariness of outsiders. As part of a team from Save the Children, Shafique was seeking to improve outcomes for newborns. Immediate breast-feeding is recommended for babies, but infants in the region were typically fed ghutti, an herbal mixture, before nursing. By tradition, the umbilical cord was cut with a bamboo stick, which put babies at risk for tetanus. And fathers were seldom involved in pregnancy and delivery. These and other factors had contributed to high rates of infant illness and mortality.

In similar places, aid groups had tried to tackle this problem by training midwives or distributing information to mothers at health clinics. But Shafique and his colleagues took another approach. They talked at length with the villagers, and soon discovered that these customs, while prevalent, were not universal. A few mothers breast-fed their newborns before giving them ghutti, calling their milk “the first gift of God for my child.” One father had bought a clean razor blade, which he asked the attendant to use instead of a bamboo stick to cut the cord. Several men had taken steps to provide their pregnant wives with extra food and had saved money in case of emergency.

In general, the children of these parents were flourishing. At a series of well-attended meetings, the Save the Children staffers shared these success stories, describing how unorthodox practices within the community were producing healthy babies. Others began to adopt these strategies. Six months after Save the Children left, men from the villages reported that no newborns had died since the group’s departure.

According to Shafique, the husbands said to themselves, “If he can do it, why can’t I?…The solution to the problem lies within the community. There are people who have the same resources, but they’re doing something differently.”

This initiative is an example of “positive deviance,” an approach to behavioral and social change. Instead of imposing solutions from without, the method identifies outliers in a community who, despite having no special advantages, are doing exceptionally well. By respecting local ingenuity, proponents say, the approach galvanizes community members and is often more effective and sustainable than imported blueprints.

The concept was first applied to reduce malnutrition in the early 1990s in Vietnam. Since then, programs in developing countries have used it widely for the same purpose. But more recently, it has been invoked in a broader range of contexts, beyond public health, and in rich countries, including the United States. Organizations are now turning to positive deviance to address a dizzying variety of challenges, from female genital cutting and human trafficking in Asia and Africa, to school dropouts and diabetes in the United States. The business world has also begun to take an interest in using the tactic to maximize performance.

In New England, Maine Medical Center has launched an initiative to target Methicillin-resistant Staphylococcus aureus, known as MRSA, a dangerous hospital-acquired infection. Waterbury Hospital in Connecticut, which used the approach to enhance communication with patients regarding medication, has now begun a positive deviance inquiry into end-of-life care. And the Massachusetts Coalition for the Prevention of Medical Errors just received federal funding to fight hospital-acquired infections with the method.

Last year, the Rockefeller Foundation funded a new program at Tufts University’s Friedman School of Nutrition Science and Policy, the Positive Deviance Initiative, to advance the idea. Monique Sternin, who helped pioneer the approach in Vietnam, runs the program, which documents projects and offers training and mentoring. It is also organizing a conference to be held in Bali in January.

In a culture that distrusts self-appointed experts and frowns on imperialism, while lauding community and democracy, positive deviance is a powerful concept. But the approach has important limitations. It can only be used to change behavior – not, for example, as a substitute for government aid or vaccines. It requires a high degree of motivation and commitment. And it is by definition restricted to what is already being done; it excludes brilliant strategies that nobody has tried.

While positive deviance has yielded impressive results in international public health and in American hospitals, it may be too soon to assess how well it can work in other scenarios. Proponents say it’s easiest to implement in cohesive communities, which are relatively uncommon in the United States. And most of the cutting-edge experiments here are too new to be evaluated. Even much of the available data should be taken with a grain of salt. Positive deviance is often employed in conjunction with other improvement efforts, making its impact difficult to isolate.

But the core idea – relying on local wisdom, capitalizing on the hidden genius of ordinary people – is deeply appealing. Unlike a lab experiment or “best practices” transferred from elsewhere, the strategies have already been shown to work in context. As it takes root, positive deviance could instill a new way of looking at hard problems.

In 1991, Sternin was sitting in the home of a poor Vietnamese rice farmer when she spotted a crab. She and her husband, Jerry, were working with Save the Children to combat malnutrition in rural Vietnam. (Jerry died in 2008.) Malnutrition was widespread, and attempts to solve the problem, such as importing food, had proved unsustainable or insufficient.

The idea of examining thriving outliers had been floating around in the nutrition literature since the 1960s. A 1990 book by Tufts professor Marian Zeitlin, “Positive Deviance in Child Nutrition,” expanded on the notion. Inspired by this book, the Sternins decided to apply its insights in practice. In Vietnam, they would identify children who had somehow managed to be well nourished. Then they would try to figure out what those families were doing right.

During this process, which Monique Sternin refers to as a “treasure hunt,” the Sternins went to the families’ homes, looked closely for clues, and asked many questions. One home did not even have full walls, but it housed healthy children. Seeing a crab crawling out of a basket, Sternin said, as she recently recalled, “Oh! What about that? Do you by any chance feed your children crab?” Reluctantly, the father admitted that yes, he scavenged for shrimp and crabs while he was farming in the rice paddies.

“These are protein bombs,” says Dirk Schroeder, a professor of global health at Emory University who later conducted a study showing the project’s effectiveness. “When parents were first asked, they were really embarrassed about it. It was considered a low-class food, rather than buying Nestle baby food in a jar. In fact, it was a perfect thing to do.”

This Vietnamese father was one of the “positive deviants” identified by the Sternins. Other strategies emerged too: dividing the available food into smaller, more frequent meals; keeping chickens outdoors, which is more hygienic. Once these behaviors were discovered, the outliers shared them with their neighbors. They all ate together at the homes of the positive deviants. “As the price of admission you would have to bring shrimp,” Sternin says. The community developed its own system for weighing and monitoring the children. Based on encouraging early results, this pilot project was expanded to other villages.

When the two-year intervention ended, rates of malnutrition had declined substantially. One evaluation found that in four of the communities, severe malnutrition had dropped from 23 percent to 6 percent. The change was durable: When Schroeder and his colleagues conducted a study three years later, they found that children in participating villages were doing better than their counterparts in a similar village. Strikingly, younger children, who were born after the initiative concluded, enjoyed an even more pronounced edge than their older siblings.

According to one of the Sternins’ favorite mottos, “it’s easier to act your way into a new way of thinking than to think your way into a new way of acting.” Facilitators try to introduce new behaviors instead of trying to change minds. Once people see the value of these strategies, they revise their views. Often, the positive deviants are unaware of the benefits of their habits, and are, in fact, ashamed of them because they violate cultural norms. The approach also works best, ironically, with the most formidable problems, perhaps after other solutions have failed, because the community must be highly motivated to solve the problem, Sternin says.

After the success in Vietnam, the approach to malnutrition quickly attracted the attention of USAID, UNICEF, and many nongovernmental organizations. It has been used in Guinea, Mozambique, Haiti, Bolivia, India, and Tajikistan, among other countries. Intrigued by the results, the Sternins and other groups began to appropriate the idea for different ends. They have reported progress in preventing female circumcision in Egypt, lowering elementary school dropout rates in Argentina, and reintegrating girls who had been abducted by rebel forces in Uganda.

In the United States, the most celebrated triumph has come in the fight against MRSA. The bacteria cling to surfaces and can live without a human host for days. MRSA kills approximately 19,000 people annually and sickens many more, and the vast majority of victims contract the disease as a result of their stays in medical facilities. In a hospital, everyone – doctors, nurses, patient transporters, and maintenance workers – plays a role in contributing to (or stopping) its spread.

Previous tactics to reduce MRSA rates consisted of educational campaigns, posters, pamphlets, and lectures. A VA hospital in Pittsburgh tried a model based on Toyota’s methods of maximizing efficiency. In this system, designated problem-solvers receive intensive training and other staffers consult them. Dr. Jon Lloyd, then the medical adviser for the Pittsburgh Regional Healthcare Initiative, recalls that these efforts achieved gains, but progress was slow and the training was costly.

In 2004, Lloyd came across an article about the Sternins in Fast Company and instantly thought to borrow their brainchild. In 2006, pilot projects were launched at six American hospitals, including Billings Clinic in Montana and Albert Einstein Medical Center in Philadelphia. By 2008, all of these hospitals had demonstrated remarkable success. According to data from the Plexus Institute, a nonprofit that collaborated on the initiative, reduction rates ranged from 32 percent to 83 percent. Since then, about 30 more US hospitals have introduced positive deviance inquiries.

In places known for their rigid pecking orders, these interventions often disrupt long-established dynamics: Managers listen and ask questions, and the support staff generate many of the answers. At Albert Einstein, for example, a patient transporter named Jasper Palmer had a technique for removing his gown, balling it up into a small package, and stuffing it inside his inverted gloves for disposal. A highly effective way of thwarting germs, it has since been dubbed the Palmer method and widely adopted. The hierarchies reportedly lessened their hold in other ways, too: Nurses, for instance, began to feel more comfortable reminding doctors to wash their hands.

“They’ll walk up and say, ‘Here, let me help you, let me squirt some Purell on your hands’,” says Lloyd, who is now a senior clinical adviser at the Plexus Institute. The efficacy of positive deviance, he believes, is “related to the issue of ownership. The solutions tend to last longer because it’s just human nature that we don’t turn our backs on what we create.”

While some of the solutions are ingenious, many are actually quite obvious, but for whatever reason they were previously ignored. It’s the process of engagement and mobilization that seems to enable people to change their behavior. The same principles pertain to other areas, such as education. Since December 2008, Asbury Park High School in New Jersey has been working to ferret out positive deviants among teachers and students, with the goal of improving academic performance. The school discovered that a few teachers greet students at the door to foster a friendly classroom atmosphere and build relationships. It also found that some students had strong adult role models in their lives, and started a mentoring program to replicate that asset for others.

It’s too early to say whether the changes will boost test scores, but Christine DeMarsico, a teacher at the school, says the undertaking has already had beneficial effects on the school community.

“Through the process, you’re having dialogue,” she says. Typically, by contrast, “Teachers are pretty much islands.”

At bottom, positive deviance amounts to simple common sense. But that may be what’s most revelatory about it. Instead of throwing money at a problem or devising grand solutions, it urges us to look a little more closely at what’s already happening. As the approach starts to be used increasingly in the United States – especially in health care, but also in education, to reduce gang violence, and to promote exercise – many creative solutions will no doubt emerge. And more unwitting experts, like Jasper Palmer, will get their due. “Here I am,” Palmer says, “becoming a big star.”

Don’t Sweat the Invasion

Posted on: November 4th, 2009 by Rebecca Tuhus-Dubrow

Tamarisk, a Eurasian shrub, is your classic invasive species—designated one of America’s “least wanted” plants by the National Park Service. In recent decades, it has spread along Southwestern riverbanks, replacing native trees such as willows and cottonwoods. For nature lovers in the region, tamarisks (also known as saltcedars) rank somewhere between Land Rovers and James Inhofe. Measures to thwart them include burning, herbicides, and “tammy whacking” (physical removal sometimes done by freelance volunteers). A few years ago, the USDA let loose thousands of leaf-eating Asian beetles in order to sic them on tamarisks, which die from the defoliation.

But these efforts to oust the intruder have encountered a glitch. It turns out that a charismatic endangered bird—the southwestern willow flycatcher—is known to nest in the offending shrubs. Last March, the Center for Biological Diversity sued the government, charging that indiscriminately killing tamarisks jeopardizes the flycatcher. A recent article in the journal Restoration Ecology goes even further in the weed’s defense. Tamarisks are widely thought to hog water and drive out other vegetation, but the authors dispute that theory. In their view, the newcomer may just be better suited than the natives to an environment altered by human activities.

These controversies highlight a broader debate within “invasion biology,” a field that emerged in the 1980s. Some scientists—such as Matthew Chew, Dov Sax, and Mark Davis—are challenging what they consider old prejudices about “alien” species. They point out the inevitability of change and the positive roles that non-natives can play in ecosystems, while describing eradication projects as often wasteful and even counterproductive.

The outlook of the more traditional camp goes something like this: While most non-natives are harmless, a minority—about 10 percent—cause serious damage. Some pose threats to human health (H1N1 is an invasive species), while others, such as agricultural pests, wreak economic havoc. A third category dramatically changes landscapes. (Think kudzu, the creeping vine that has conquered the Southeastern United States.) And since these effects are unpredictable—sleeper species can seem benign for decades—all exotics are suspect. The hard-liners promote a “guilty until proven innocent” approach to biological foreigners, including a strict “white list” of those allowed to enter each country. As globalization accelerates the movement of species, vigilance is more important than ever. Climate change adds another wildcard, making the behavior of organisms all the harder to foresee.

Now a growing contingent of scientists is advocating a more neutral attitude. Certainly, they say, non-native plants and critters can be terribly destructive—the tree-killing gypsy moth comes to mind. Yet natives such as the southern pine beetle can cause similar harm. The effects of exotics on biodiversity are mixed. Their entry into a region may reduce indigenous populations, but they’re not likely to cause any extinctions (at least on continents and in oceans—lakes and islands are more vulnerable). Since the arrival of Europeans in the New World, hundreds of imports have flourished in their new environments. Common wildflowers such as Queen Anne’s lace and certain kinds of daisies are “naturalized” aliens. The storied apple tree originally hailed from Asia.

Even when species are destructive, there’s the tricky question of what to do about them. We may all agree that a particular plant or animal is loathsome, but eradication isn’t innocuous. Such plans tend to cost millions of dollars and often rely on toxins that bring collateral damage to other species. Biological solutions—like leaf-eating beetles and root-boring weevils—are usually considered more benign, but as the case of the tamarisk shows, they, too, can pose problems. Once an ecosystem has absorbed a new species, any targeted intervention is likely to have significant ripple effects.

The past few years have also seen active debates about the field’s terminology. Some criticize the term “invasive” as too value-laden and imprecise. One widely cited definition comes from a 1999 executive order, issued by President Clinton, that aimed to ward off “alien species whose introduction does or is likely to cause economic or environmental harm or harm to human health.” But what exactly constitutes “environmental harm”? It’s easy to see when a power plant is hemorrhaging pollutants into the air and water—but how do you make the call when the agent of harm to nature is … also nature? Critics such as Mark Davis say we need to distinguish “harm” from “change.” If an introduced species causes natives to become less abundant, does that constitute harm? Many people would say yes, but Davis doesn’t think so.

There’s an argument that even the dichotomy between “native” and “non-native” is ultimately meaningless. Species have always migrated; to identify one as native is to draw an arbitrary line in time. Davis favors a continuum, using labels such as “long-term resident” and “recently arrived”—the idea being that these terms are both more accurate and less loaded.

A number of scientists and other scholars have wondered whether all the fretting about immigrant species somehow reflects xenophobia. There are some provocative parallels between attitudes toward human immigrants and attitudes toward their floral and faunal counterparts. Philosopher Mark Sagoff has noted the stereotypes—such as aggressiveness and unbridled fertility—that apply to human and nonhuman newcomers alike. Champions of this argument can also resort to a familiar trump card. You guessed it: The Nazis were eco-nativists. Under the Third Reich, policies to exterminate foreign plants echoed the more notorious policies against groups of people.

In the United States today, it’s harder to see a connection. The Minutemen aren’t about to join the battle against yellow star thistle. Then again, a few years ago, an upstart faction of the Sierra Club unsuccessfully tried to take over the board on an anti-immigration platform (on the grounds that new residents in the United States typically expand their carbon footprints). Still, liberal attitudes toward immigrants seem to coincide more often with environmentalism (which, in any case, is not necessarily synonymous with eco-nativism).

A better analogue may be resistance to globalization. As weedy species insinuate themselves throughout the globe, they are often likened to McDonald’s. Conservationists want to preserve their local wildlife just as community activists want to save their mom-and-pop burger joints.

The debate inevitably leads to dorm-roomish head-scratchers. Typically, invasive species are considered those that migrated due to human agency, deliberate or accidental. So are people part of nature or separate from it? If we are part of it, isn’t anything we do “natural”? (The same argument could be made about global warming.) If we’re apart from nature, isn’t anything we do unnatural, including eradicating the species we introduced? Come to think of it, aren’t humans the most invasive species of all?

Not even the most heretical scientists say that we should just let things take their course with no attempts at management. Rather, they argue against reflexive assumptions that non-native species are bad. We should recognize their benefits as well as their drawbacks, acknowledge the trade-offs that come with trying to keep species out in a globalized world, and respect the difficulty of targeting just one element of nature.

Those caveats don’t preclude conservation campaigns. The claims of traditional ecologists are compelling, precisely because they embody values that are widely shared. Most of us would be disturbed by radical changes to our national parks or neighborhood landscapes. We might decide, as a society, that in certain areas, it’s worth the costs of trying to minimize such change. But let’s see these efforts for what they are: expressions of human preferences rather than imperatives that flow directly from the science. Tammy whacking, anyone?

Roadside Diner

Posted on: August 30th, 2009 by Rebecca Tuhus-Dubrow

On a recent afternoon, biking on a desolate stretch of Albany Street in Cambridge, David Craft pulled over. He’d glimpsed something green in the cracks of the sidewalk, near a parking sign. To most passersby, it would be invisible, or perhaps a reminder of the unwelcome tenacity of weeds. To Craft, it was an ingredient.

He picked a few sprigs and examined them: Reddish stems bore bunches of tiny, rounded, tough green leaves. As he’d suspected, it was purslane, one of the most nutritious wild edibles – not only a fine contribution to that night’s planned dinner, but a healthy snack to boot. His companion expressed misgivings about eating it unwashed, given its less than pristine provenance. Craft was unconcerned, although he conceded that “maybe some dogs pissed on it.” His companion – OK, me – not wanting to appear wimpy, duly munched. It tasted rather like spinach, concentrated into much smaller, thicker leaves.

Craft, a bearded, lanky fellow in his mid-30s, has been an enthusiastic “urban forager” for two years. He forages most every day – except when the ground is frozen – and estimates that from April through October, close to half of his diet derives from scavenged foods.

He’s one of a number of city-dwellers throughout the United States who have recently started finding their nutrients on the sidewalk – and in parks, on neighbors’ lawns, and in other urban spaces. Blogs, sharing tips and adventures, have inevitably followed. Urban Edibles, a “cooperative network of wild food foragers,” was founded in Portland, Ore., in 2006, and similar projects exist in Los Angeles, Tucson, and elsewhere. And the popularity of guided foraging walks has grown. “Wildman” Steve Brill, perhaps the country’s best-known forager, has been giving tours of Central Park for 27 years, and has noticed a spike in interest.

“In 1982, I had two or three people on my tours,” he says. “Now I have over 70. Over the last two years, the increase has been greater than any other time.”

That foraging would be in the zeitgeist makes sense. Increasingly, people are developing a passion for tasty, healthful food. Many wild plants have more vitamins and minerals than their cultivated counterparts, especially since they are more likely to be eaten fresh. At the same time, the ailing economy and concern for the planet have inspired a return to frugality and simplicity. While farming feels like the revival of a more wholesome society, playing hunter-gatherer (or at least, gatherer) feels like going even farther back to our antediluvian roots. It’s the extreme manifestation of locavore chic: What could be more locavorous than picking food around your neighborhood?

We had started at Craft’s home in Cambridge, where he invited me to help myself to some sourgrass, also known as wood sorrel (most wild plants go by multiple names, he informed me), which was growing abundantly alongside the building. A weed that visually resembles clover – three delicate, heart-shaped leaves – it released a burst of pleasingly sour flavor in my mouth, akin to sour candy without the sugariness or the artificiality.

The plan was to bike over to Jamaica Pond, but we could never get far without Craft veering over to inspect some intriguing patch or tree. On Albany Street, he found not only purslane but also lady’s thumb, another edible weed. In a parking lot near MIT, he spotted quickweed and plantain, the latter of which has medicinal uses. “If I had a cut right now, I would certainly rub that on it,” Craft assured me.

Environmentalists love transforming something ostensibly useless or worse into a boon: recycling garbage into new goods, food scraps into rich soil, manure into biogas. Eating weeds falls into this category of resourceful efficiency: Weeding your lawn can double as a grocery run, gratis. Foragers can also establish symbiotic relationships with nearby community gardens. Russ Cohen, a veteran forager in the area, says gardeners are usually delighted to let foragers take wild edibles – i.e., weeds – from their plots. The typical response, he says: “Are you kidding me? Here’s a bag – fill it up.”

From the parking lot, we made our way to the Charles River, where Craft found jewelweed, a plant with flimsy, droopy yellow flowers and narrow green pods. When you squeeze the pods, they burst open, releasing nuggets that taste like walnuts. A little farther, under the BU Bridge, in a space where ducks and geese congregate by the water’s edge, we stumbled into a bounty of blackberries and milkweed. Most of the blackberries were still pale, but we picked the accessible dark ones. Craft saw a ripe berry deep inside the thicket, surrounded by prickers, but he couldn’t resist.

“It’s such a good-looking blackberry – I’m goin’ in!”

For most practitioners, foraging is a kind of hobby, but what if, due to some apocalyptic event, it became a necessity? Some foragers approach the activity from this “survivalist” perspective. Rebecca Lerner, a Portland, Ore.-based blogger and journalist, conducted an experiment in May: She tried to subsist on only foraged food for one week. By the fourth day, she had been able to find mainly root vegetables, such as burdock and wild carrot. She was also drinking a lot of medicinal teas, made of cleavers, pine needles, and pineapple weed, as well as stinging nettles broth. She was spending many hours a day searching for food.

“I was very hungry,” she recalls.

Eventually, desperate for protein, she decided to forage some eggs. Ant eggs, that is. “They weren’t bad,” she says, describing them as “oval-shaped” and “spongy.” However, she says, securing them was “labor-intensive” because “the ants were fighting me and biting me.”

She considered eating slugs, but ultimately declined, and stopped the experiment on day five. She now says she should have done more research in advance, and should have realized that May is not an especially forager-friendly time in Oregon. She plans to try again in early fall.

Foraging does have a couple of ironic downsides. While often done for environmentalist reasons, it has the potential to do ecological harm, if foragers harvest excessively. And while most wild foods offer health benefits, foraging also poses health risks. Pollutants, of course, are everywhere. That said, the consensus seems to be that, with the exception of obviously hazardous zones like industrial sites, the threat is no greater than at your average grocery store (and is more than offset by the rewards). Perhaps a more serious danger is misidentification: In rare cases, poisonous plants and fungi can even be fatal.

Brill, who works with children, warns them about the various risks. “You could also get diarrhea – that brings the lesson home more than death,” he says.

We never made it to Jamaica Pond, but we did cross the BU Bridge into Boston, and on the Emerald Necklace we found some exquisite mushrooms – porcini, according to Craft – a couple of inches in diameter, luminous brown on top, spongy white underneath. After a few hours of foraging, we’d accumulated enough edibles for a meal.

Back at Craft’s apartment, assisted by his friend Shannon, we cooked a dinner almost exclusively of foraged food (the exceptions were oil and salt). We sautéed the mushrooms with a plant called tansy, a tiny bright-yellow flower with a smell and taste reminiscent of sage. We fried up an assortment of weeds – milkweed, evening primrose flowers, plantain, purslane, quickweed, lamb’s quarters, and lady’s thumb. For seasoning, we used peppergrass, a plant with perfectly round leaves and a flavor that has a kick to it. The wild greens tasted much like vegetables we’re familiar with, such as kale and Swiss chard, although slightly more bitter and therefore appealingly medicinal. The mushrooms, meanwhile, were simply delicious. In my opinion, they would not have been out of place in a swank Manhattan restaurant.

Craft also made iced tea from mint, goldenrod, and St. John’s wort: It was amber-colored and refreshing, though also a touch bitter (from the goldenrod, he explained). For dessert we ate a compote made of blackberries and mint gathered on our excursion, as well as juneberries, crabapples, and mulberries Craft had collected a couple of weeks earlier and frozen. The result was on the tart side, and, in Craft’s words, had “a bit of a mealy crunch.” But the fact that we had found all of the ingredients for free around town somehow improved the taste.

Sued by the Forest

Posted on: July 19th, 2009 by Rebecca Tuhus-Dubrow

Last February, the town of Shapleigh, Maine, population 2,326, passed an unusual ordinance. Like nearby towns, Shapleigh sought to protect its aquifers from the Nestle Corporation, which draws heavily on the region for its Poland Spring bottled water. Some Maine towns had acquiesced, others had protested, and one was locked in a protracted legal battle.

Shapleigh tried something new – a move at once humble in its method and audacious in its ambition. At a town meeting, residents voted, 114-66, to endow all of the town’s natural assets with legal rights: “Natural communities and ecosystems possess inalienable and fundamental rights to exist, flourish and naturally evolve within the Town of Shapleigh.” It further decreed that any town resident had “standing” to seek relief for damages caused to nature – permitting, for example, a lawsuit on behalf of a stream.

Shapleigh is one of about a dozen US municipalities to have passed measures declaring that nature itself has rights under the law. And in 2008, when Ecuador adopted a new constitution, it recognized nature’s “right to exist, persist, maintain itself and regenerate its own vital cycles, structure, functions and its evolutionary processes.” A campaign is also underway in Europe for a UN Universal Declaration of Planetary Rights, which would attempt to enshrine such principles in international law, following the model of the Universal Declaration of Human Rights.

These developments are part of a small but growing movement that aims to reorient the relationship between the earth and the law. Advocates argue that natural objects should not be treated as mere property, vulnerable to exploitation or destruction as owners see fit, but as rights-bearing entities with intrinsic value. The Community Environmental Legal Defense Fund (CELDF), a Pennsylvania-based nonprofit, works with communities such as Shapleigh to protect local ecosystems, and more towns are considering ordinances in the same vein. The Center for Earth Jurisprudence, established in 2006, works with two Florida law schools, developing a legal philosophy based on respect for the planet, and seeking avenues in current law to advance that goal.

“Someone needs to be able to represent the rivers,” says Patricia Siemen, director of the Center for Earth Jurisprudence. “Someone needs to be able to represent the forests.”

Of course, the notion will strike skeptics as preposterous. Would we need to worry about offending litigious shrubs? Would a boulder, or a swamp, appear as a witness in the proceedings? Critics dismiss the idea as grandstanding that could clog the courts with frivolous cases.

But proponents see it as part of an ongoing progression, an expansion of rights that slowly brings about an increasingly just society. After all, not so long ago, slaves and women were in some legal regimes deemed property, just as nature is today. Now we all accept universal human rights. The concept of animal rights has also become familiar, if much more contested. Advocates of this agenda see the extension of rights to ecosystems as the natural next step. And they believe it could spark a profound shift in our relations with nature, leading to more effective environmental protections.

“The language of rights has a great deal of currency. It’s the most powerful of our ethical terms,” says John Baird Callicott, a philosophy professor at the University of North Texas. “Rights shift the burden of proof from those who are defending nature to those who want to exploit it.”

In the view of proponents, the idea is less outlandish than it may seem. Other nonhuman entities have long enjoyed certain rights under our legal system: ships and corporations are two examples of entities entitled to “personhood,” meaning they can bring lawsuits to court. What’s more, proponents say, the extension of rights invariably seems absurd before it happens. When the economy depended on slave labor, emancipation was unfathomable even to many who abhorred slavery. In retrospect, though, it seems morally imperative and historically inevitable.

Yet bestowing rights on nature poses considerable practical and philosophical challenges. In the case of the declarations in towns like Shapleigh, it isn’t always clear how they will be enforced. (So far, Nestle has not attempted to set up operations in Shapleigh, but it’s hard to say whether that is a result of the ordinance.) Granting standing – the ability to sue in the name of a natural object – is a more modest, specific goal, but stipulating “inalienable rights” strikes some legal experts as both vague and infeasible. Critics also argue that because the language of rights is indeed potent, we ought to be wary of diluting that force by spreading rights too thin. And they question whether the concept of rights and interests can be applied to nature in any meaningful way.

“All the interests in nature conflict. Trees fight each other for sun and water,” says Mark Sagoff, an environmental philosopher at the University of Maryland. “Granting rights to nature would just be a distraction from the policy progress we’ve made.”

The debate ultimately centers on the basis of legal rights. Historically, they have been strongly associated with human beings. All of the formerly rightless entities who now seem so clearly deserving of rights – infants, for example, or women, or African-Americans – share one conspicuous trait: they’re people. (Corporations and ships, it could be argued, represent conglomerations of people.) When extended to animals, rights have often been based on affinities with humans: sentience, the ability to suffer. The question is how starkly we distinguish between human and nonhuman life. Is membership in the biosphere alone enough to merit rights?

The notion of nature’s rights has long been cherished in environmentalist circles; the idea cropped up in the writings of Sierra Club founder John Muir in the late 19th century and the influential ecologist Aldo Leopold in the mid-20th century. But the first sustained legal argument is usually attributed to Christopher Stone, a law professor at the University of Southern California. In 1972, Stone wrote an article entitled “Should Trees Have Standing?”, which laid out the case for expanding rights that is now commonly cited. (The essay, originally published in the Southern California Law Review, will be reissued by Oxford University Press in 2010.)

Stone lamented that although one could sue to protect nonhuman life, one had to prove “injury” to humans. Damages, when awarded, went to compensate the human plaintiff, not to restore the natural object. He argued that natural objects themselves should be eligible to be plaintiffs (represented, of course, by human trustees or guardians). Furthermore, the natural objects should benefit directly from a favorable judgment – funds should go to restoring the damage wrought. Stone drew an analogy to the legal status of “incompetents,” such as children or senile elders, who may not be able to articulate their interests: guardians can make informed judgments about those interests and represent them in court.

As it happened, a highly pertinent case was before the Supreme Court at the time. In Sierra Club v. Morton, argued in 1971, the Sierra Club tried to stop Walt Disney Enterprises from building a ski resort in a pristine California valley called Mineral King. The Court decided that the Sierra Club itself lacked standing, although it could sue on behalf of its members, who could claim they suffered recreational or aesthetic injuries (for example, from the lost opportunity to hike in the area).

Serendipitously, Justice William O. Douglas had been slated to write the preface for an issue of the Southern California Law Review, and Stone had rushed his article into that issue, hoping that the justice would read it. The strategy worked: Douglas dissented, echoing Stone’s thesis. “Contemporary public concern for protecting nature’s ecological equilibrium should lead to the conferral of standing upon environmental objects to sue for their own preservation,” he wrote. “This suit would therefore be more properly labeled as Mineral King v. Morton.”

For a time, the idea appeared to gain some currency. In 1973, the Endangered Species Act became law, including a provision for “citizen suits” on behalf of listed species. The provision, Professor Callicott has argued, grants de facto standing to the endangered wildlife (although this view is controversial). In any case, the law implicitly recognized the worth of life that has no instrumental use for people.

In 1974, Laurence Tribe, the prominent Harvard law professor, elaborated on Stone’s reasoning in an article for the Yale Law Journal. He wrote that the legal system’s focus on human injuries reinforced anthropocentric values, creating a vicious circle that could further increase callousness to other life forms: “What the environmentalist may not perceive is that, by couching his claim in terms of human self-interest – by articulating environmental goals wholly in terms of human needs and preferences – he may be helping to legitimate a system of discourse which so structures human thought and feeling as to erode, over the long run, the very sense of obligation which provided the initial impetus for his own protective efforts.”

In 2008, Francisco Benzoni, then a business professor at Duke, published an article in the Duke Environmental Law and Policy Forum, citing Tribe’s paper and reviving the point. “The current jurisprudence on standing embeds a value theory without any articulation or discussion about whether that’s the value theory we should adopt,” says Benzoni.

In the intervening years, a number of lawsuits have named nonhumans, usually animals, as plaintiffs. The rulings have been inconsistent. In one oft-cited case, Palila v. Hawaii, in 1988, the Ninth Circuit Court of Appeals explicitly endorsed the standing of a bird, writing that it “has legal status and wings its way into federal court as a plaintiff in its own right.” In 2004, however, the same court (but different judges) dismissed that statement as nonbinding “rhetorical flourishes.”

The need to frame arguments in terms of their human effects has led to some almost comically contorted claims. In Animal Welfare Institute v. Kreps, in 1977, several environmentalist groups sued to stop US firms from importing baby sealskins from South Africa, asserting that their members suffered aesthetic, recreational, and educational losses from the brutal deaths of the seals. One of the members announced a plan to visit South Africa. Remarkably, the groups won the case on appeal. But some who applaud the outcome question the method.

“Oh, for Pete’s sake, just sue in the name of the seals,” says Stone, the author of the seminal paper on rights for nature. “The seals are being bludgeoned to death and somebody’s saying, ‘I want to be seeing seals.’ That’s not what it’s about. It’s a very backwards way of getting the case into court.”

Some champions of nature’s rights see a glimmer of promise in a recent ruling. In the 2004 case Cetacean Community v. Bush, about the effect of the Navy’s use of sonar on whales and dolphins, the Ninth Circuit, which is one level below the Supreme Court, denied standing to the creatures. However, the opinion left an opening, noting that “nothing in the text of Article III [of the US Constitution] explicitly limits the ability to bring a claim in federal court to humans.” It would be up to Congress, the judge suggested, to stipulate that the nonhuman life under a law’s protection has standing to sue. Some environmentalists, such as the staff at the Center for Earth Jurisprudence, now hope Congress can be persuaded to do just that – and their ideal legislation would not be limited to animals, either.

Among scholars with environmentalist sympathies, there is vigorous debate over whether standing for natural objects is the most sensible approach to defending ecosystems. After all, it’s possible to enlarge the scope of our concern and protection without granting legal rights per se. Rights advocates contend that presenting legal cases in terms of human impacts is too anthropocentric, but critics invert that logic. They say we are projecting onto nature our assumptions about its interests. Ultimately, in their view, even the most radical environmentalist embodies human values, and we should just say so.

Richard Stewart, a law professor at New York University, believes that inanimate objects such as trees and rivers do not have interests or values. Rather, he says, the argument really concerns “human ideas about what’s good for nature.”

The distinction can be subtle. It doesn’t mean we must diminish the worth we assign to nature; it just means acknowledging that we as a society are assigning the value. We could, for example, liberalize standing for humans – make it easy for people to sue to protect nature, without granting official standing to the natural objects. If we could sue to preserve a valley because developing it offends our moral sensibilities, this would indicate that nature matters beyond its strictly instrumental uses. But, according to this perspective, it matters to us humans, not in some transcendent way that is independent of our judgments.

Indeed, some critics ask, how do we know what nature prefers? Perhaps Mineral King wanted to host a ski resort, Mark Sagoff has suggested; perhaps a beach wants to welcome tourists, or a river wants to make electricity. As Sagoff puts it, “Old Man River might want to do something for a change, other than just rolling along.”

Supreme Reforms

Posted on: May 24th, 2009 by Rebecca Tuhus-Dubrow

When Justice David Souter announced his retirement earlier this month, some commentators cast the decision as another sign of his admirably eccentric character. Sixty-nine is hardly an early age for retirement, but in the context of the Supreme Court it’s downright tender. In recent decades, justices have served increasingly long terms, and turnover has slowed. From 1994 to 2005, there were no changes at all in the court’s composition – the second-longest such interval in its history.

Long tenures arguably have benefits – stability can be an asset, and the justices are afforded the opportunity to master their jobs. But court-watchers have started to worry that marathon terms also create serious problems. The trend raises the stakes of every appointment, and justices may serve past their prime. More fundamentally, each justice is one of the most powerful individuals in the country, and the sheer concentration of power for such sustained periods strikes many observers as undemocratic.

Critics in legal circles point to epic terms as one of several problems that sap the court’s dynamism and responsiveness and that necessitate significant reforms to this hallowed institution. They propose ways to increase the frequency of appointments, to induce justices whose faculties are waning to retire, and to compel the court to hear cases it is currently free to ignore. For the most part, these critics do not point to specific misdeeds. But they do argue that the court would benefit from regular oversight from the other branches of government, and from occasional infusions of democratic reform.

“The overall point of this is that it’s been a very long time since the executive branch and Congress have considered the role of the Supreme Court,” says Alan Morrison, a dean at George Washington University Law School and previously the director of the Public Citizen litigation group. “There ought to be debate and discussion.”

In February, a group of leading scholars and former judges sent a letter to Vice President Joe Biden, Attorney General Eric Holder, and Senator Patrick Leahy, among others, advocating a “Judiciary Act of 2009” that would implement a list of important changes. The ideas were explored at an academic conference at Duke in 2005, and in a subsequent book, “Reforming the Court.” One proposed reform would require the chief justice to sound an alert if another justice’s abilities are declining. Another would reduce the court’s control over its docket by appointing a group of appellate judges to assign it cases. A third would impose term limits on the position of chief justice, which carries even more power than the other seats.

Others in the legal world, however, see no need for reforms. For all its imperfections, they say, the court is functioning pretty smoothly. For better or worse, its decisions have been relatively in line with public opinion. And even supporters concede that reforms will be difficult to enact, especially at a time when Congress has so much else on its plate.

But as President Obama selects his nominee to succeed Justice Souter, it’s an opportune time to think about exactly what the Supreme Court and its justices should do – and to remember that these rules, for the most part, are not set in stone, or in the Constitution.

Reluctance to leave the Supreme Court is understandable: it’s hard to imagine a more enviable job. It offers tremendous intellectual gratification and more influence than most elected offices, without the need to hustle for reelection. The court is relatively free from interference from the other branches of government, and the justices get their pick of the country’s most brilliant young law school graduates as assistants.

But the seats were not always so sought-after. The court once had to hear any cases that were brought before it. Until 1891, justices were also required to “ride circuit,” to travel to the lower courts to hear oral arguments, which was thought to keep them in touch with the country. It was an onerous duty, especially in the days before cars and planes. In the nation’s early days, it was not uncommon for justices to quit after several years for a better gig.

Over time, though, the job became increasingly cushy. William Howard Taft, the president and subsequently chief justice, made it something of a mission to transform the court. As of 1925, it largely gained control over its own docket. (Under Chief Justice Rehnquist, the court gained almost total discretion.) The justices were eventually granted the right to four clerks, who summarize petitions, research opinions, and often write first drafts.

Changes in society have also affected the court – notably, longer life spans. When the Constitution was written, life expectancy was far shorter than it is today. As longevity has risen, along with the job’s power and prestige, terms have correspondingly lengthened. Since 1971, according to calculations by Duke law professor Paul Carrington, the average tenure has jumped from 16 years to 25.5 years, and the average age on leaving office has risen from 70 to 79.

The most obvious problem arises when justices serve into their senescence. In 2000, Emory University law professor David Garrow (now at the University of Cambridge) published a study called “Mental Decrepitude on the U.S. Supreme Court.” Thurgood Marshall, Joseph McKenna, and Hugo L. Black are just a few examples of justices who provoked concern about their dwindling capacities as they aged. In his last years, Oliver Wendell Holmes sometimes fell asleep on the bench; after a stroke in 1974, William O. Douglas addressed people by the wrong names and cast votes inconsistent with his previous views.

Another issue is the irregularity of appointments: Nixon made four, while Carter made none. Some argue that while the court should not be directly accountable to the public, voters deserve to influence appointments in every election. The system also means that justices often base the timing of their retirements on who is in office, distorting their decisions. (In some cases, this at least partly explains why justices have stayed despite disability.) It also distorts decision-making for presidents: the trend is to appoint ever younger justices, whose comparative lack of experience may not be ideal.

“We are better off with justices who have been practicing for 25 or 30 years,” says Morrison. He also wonders whether decades on the court make the justices “less in touch with the people as a whole and less in touch with the political forces that nominated them.”

As a result of these concerns, the suggestion of term limits – 18 years is a commonly cited figure – has been tossed around. So has a mandatory retirement age – for instance, 75. Both are common in the rest of the world, but both would require a Constitutional amendment, a prohibitive obstacle. (Interestingly, the Constitution doesn’t explicitly stipulate life tenure; it says justices will serve during “good behavior,” which has been understood to mean serving for life.)

A more recent, innovative plan, outlined in the February letter, would aim to achieve the same goals while circumventing the need for an amendment. Presidents would make appointments to the court every two years. If the number of justices exceeded nine, the nine junior justices would hear cases. The senior justices could substitute in the event of a recusal or disability – a meaningful bonus, since no one is now eligible to step up in those instances. The senior justices could also remain involved in other court functions, such as choosing which cases to take. Leaving life tenure technically intact, this approach would inject fresh blood into the court and introduce regularity to appointments.

“It’s a form of democratic responsiveness,” says Jack Balkin, a professor at Yale Law School who supports the plan. “The Supreme Court is not supposed to be directly responsible, but there are supposed to be democratic checks.”

Another idea is a mechanism for urging and, if necessary, compelling justices to resign once their abilities are in decline. The chief justice would have the duty to advise another justice when the time has come to retire, and to report to the Judicial Conference of the United States. When a chief justice is no longer able to perform, other justices would be obligated to report that to the same body, which would have the authority to refer the matter to the House’s Judiciary Committee. Congress could then impeach if necessary. Despite the obvious potential for awkwardness, many believe this policy is overdue.

“There ought to be a way of telling over the hill justices that they are over the hill,” says Carrington.

The court’s docket presents other concerns. The number of cases the court takes has fallen to about 70 a year, from about 300 when it first gained a measure of discretion. The justices often opt to take the more interesting and gratifying cases, leaving mundane conflicts between the lower courts unresolved. Adjudicating these disputes – providing supervision to the lower courts – is supposed to be an important part of the court’s role.

“These guys aren’t doing an honest day’s job,” says Carrington. “The court’s not providing anywhere near enough leadership.”

Carrington and others propose giving assignments to the court – establishing a body of appellate judges to choose some of the cases it hears. (An alternative is to simply require the court to take a certain number of cases per year.) Such a reform would have the incidental effect of making the job harder, which some view as a benefit, because it would encourage older justices to move on.

Part of what sets the judiciary apart, of course, is its position above the fray. It is, in theory, unresponsive to polls, immune to political pressures. A branch of government less directly representative of the public was conceived as a crucial part of the checks and balances that preserve democracy as a whole. This is a powerful rejoinder to arguments about the court’s seemingly undemocratic aspects.

But another, perhaps contradictory reply is that the court does, on the whole, respect the views of the country. The idea that the justices are out of touch is not supported by its record, according to Barry Friedman, a law professor at NYU and author of the forthcoming book “The Will of the People: The Supreme Court and Constitutional Meaning.” He argues that, at least for the past 60 years, the court has hewed rather closely to public opinion. Historically, when it strayed too far from the mainstream, it was punished in various, ad hoc ways. Congress has exercised its power to withhold raises and restrict budgets. When the Supreme Court blocked New Deal legislation, Roosevelt tried to pack the court with his own appointees; Congress has sponsored bills to strip the court of jurisdiction in some cases. While these more dramatic efforts have usually amounted to mere threats, Friedman believes they have done the trick: over time, justices got the message.

“I tend to be of the view that if it ain’t broke, don’t fix it,” says Friedman. “I think the court is politically accountable. It’s not clear to me that if you switch to this system, we’d do differently.”

In the same vein, ensuring that appointments are evenly distributed among presidents wouldn’t necessarily affect the court’s ideological balance. Justices often surprise or evolve over time. Souter is the classic example of a justice who turned out to hold different views than the president who appointed him (George H.W. Bush), bitterly disappointing conservatives.

There are other criticisms of the proposals, too. Age is only one factor in mental prowess: throughout his 80s, Justice Stevens, for example, has certainly exhibited greater acuity than many people half his age. What’s more, some of the ostensible problems have upsides. The lighter workload comes with benefits – the court was previously overwhelmed with cases – and there are obvious advantages to making such an important position enjoyable.

But critics say the court should no longer be seen as a sacrosanct, untouchable institution. There will always be disagreements about how, precisely, to construct ideal governmental structures. But a common American feeling is that power held too long, even if benignly exercised, carries an undemocratic whiff. Although Souter voluntarily relinquished his power, he seems unlikely to start a trend. Prudent reforms, Carrington says, “would make the justices a little more aware of the fact that they are mortal human beings.”

Rethinking rent

Posted on: March 22nd, 2009 by Rebecca Tuhus-Dubrow

IN THE SOUL-SEARCHING sparked by the financial meltdown, Americans have started to look askance at some of the habits and policies that had come to define our country. Excessive consumption and living on credit are no longer seen as acceptable, let alone possible. “Deregulation” is suddenly a dirty word.

Yet despite the housing crisis, one value, more deeply entrenched, remains sacrosanct: homeownership. Irresponsible mortgages have been universally condemned, but it is still widely assumed that we all aspire to own homes – and that we all should aspire to own homes. Homeowners are thought to be more engaged in their communities and to take better care of their houses and neighborhoods. On a nearly subconscious level, buying a home is a central part of the American dream. A picket fence may now be dispensable, but a house of one’s own is seen as the proper place to raise an American family – a prerequisite for stability, security, and adult life. And for decades – but increasingly under the Clinton and Bush administrations – federal policies have encouraged citizens to achieve this goal.

But a growing chorus of economists and housing experts say that this mind-set, too, needs fundamental reform. Owning a home is not right for everyone, they say: In some ways it’s overrated, and it can even have harmful effects for individuals and society. It is now glaringly clear that buying a home is a financial risk, not the surefire investment it is often perceived to be. Widespread homeownership may also have a negative impact on the economy, because, among other reasons, displaced workers can’t easily relocate to new jobs. And some of the alleged rewards of homeownership, such as greater self-esteem, health, and civic engagement, have been called into question by research. The government, critics argue, should focus on ensuring high-quality, affordable housing rather than promoting homeownership for its own sake.

“There’s no reason we should all be homeowners,” says Joseph Gyourko, a professor at the Wharton School of Business and coauthor of “Rethinking Federal Housing Policy.” “Homeownership has a lot of benefits, but it has costs, too.”

According to this view, renting offers many advantages, and should be considered a viable long-term option for people of all ages and socioeconomic levels. Renters enjoy flexibility and freedom from the responsibilities of maintenance. Given the often overlooked costs and risks of homeownership, renting is in many cases a wise financial choice. And the experience of a place like Switzerland – a well-functioning country with only about a 35 percent homeownership rate – suggests that rental housing per se does not unmoor society.

Some analysts propose abolishing or limiting the mortgage interest tax deduction, which provides substantial tax breaks for homeowners. Others favor greater security for renters – such as laws making eviction more difficult – or tax deductions for renters, which a few states, such as Massachusetts, already offer. The MacArthur Foundation has launched a major initiative to preserve affordable rental housing in 12 states, including Massachusetts. In the recent stimulus legislation, advocates of renting successfully fought additional incentives for homeownership, and they continue to push for a “balanced” housing policy.

Certainly, homeownership still has staunch defenders. Many stand by its psychological and social benefits, and consider a home to be a sound investment. “It’s the single most significant source of wealth and most secure source of savings” for Americans, says Jim Carr, chief operating officer for the National Community Reinvestment Coalition. In his view, the real problem has been the dysfunctional market, which we should not conflate with homeownership itself. “In a well-regulated market, homeownership provides a nest egg and an important generational wealth transfer,” he says.

Nearly everyone agrees that homeownership is sometimes the right choice: for people with the means, who intend to stay put for a long time and want to customize their houses, it probably makes sense. And the mortgage interest deduction would be difficult to eliminate for political reasons. Yet more and more experts are saying that if we could unsentimentally see the pluses and minuses of each option – and if we had a level playing field in terms of government support – the sensible decision for many Americans would be to refrain from buying a home, perhaps for their whole lives.

Owning a house, we tend to think, is a quasi-magical boon that provides a broad array of goods for individuals and families. There is a logic to the reasoning: If you are financially invested in a neighborhood and expect to stay there for years, you’ll be more inclined to tend a garden, bake muffins for your neighbors, and follow developments that affect the community. You have more control over your environment, which seems likely to yield psychological benefits. Studies over the years have suggested that homeowners are healthier and happier, and even that their children perform better in school.

But other research has challenged these conclusions. Some studies find no significant relationships between these desirable outcomes and homeownership. And according to several reviews of the literature, many of the studies suffer from methodological flaws: Most importantly, they fail to control adequately for other variables, such as income, age, marital status, and home value. In other words, homeowners tend to do better on a range of measures, but that doesn’t mean that the ownership status is the cause; homeowners tend to be older and wealthier, which could account for the differences. And there is little research on the potential negative consequences for low-income families facing the burden of mortgage debt and the possibility (and reality) of foreclosure.

A recent study, which aimed to avoid the problems of previous research, suggests that homeownership confers no real benefits. The study examined self-respect, perceived notions of control, time spent with friends and family, volunteer activities, and enjoyment of the neighborhood, among other things. On all of these measures, after controlling for income, health status, and home value, the study found no significant advantage for homeowners. In fact, homeowners were on average 12 pounds heavier, and they spent less time with friends. They also reported more “pain” – the term used in the study’s survey – deriving from their homes than renters. Grace Wong Bucchianeri, an assistant professor at the Wharton School, conducted the study in 2005, at the top of the market. (All of her subjects occupied single-family homes, so the only difference was ownership status. But her study had limitations too: all of the subjects were women, and it was geographically confined to Ohio.)

“It challenges our notion of engaged, active, healthy homeowners,” says Bucchianeri. “We’re not looking at a lot of benefit here.”

Some of the more concrete financial drawbacks of homeownership should be obvious to any homeowner. There are high transaction costs: 7 to 10 percent of the cost of the house goes to the process of buying and selling. Between “sprucing it up” and legal fees, “it’s not cheap,” says Eric Belsky, executive director of the Joint Center for Housing Studies at Harvard. Major repairs, sometimes unexpected, may be needed, and simple upkeep usually costs 2 to 4 percent of the house price per year. These responsibilities can be a hassle as well as an expense. Renting, by contrast, is “inherently efficient,” as Matthew Perrenod of the Housing Partnership Network argued at a housing conference at Harvard in mid-March, because the maintenance can be professionalized. Owning also, of course, comes with the risk of losing money if you have to move during a down market, or of being tethered to a house you want to leave.

Renting is often derided, especially by real estate agents, as “throwing money away.” But upon close examination, the best financial decision is far from clear, and it’s impossible to generalize. Buying entails paying property taxes, as well as the cost of repairs, mortgage payments, and mortgage interest.

Some financial analysts say that in terms of returns, individuals would often be better off renting and using the money saved to invest in stocks. According to the National Multi-Housing Council’s 2008 annual report, a $100 investment in a home in 1985 would have grown to $210 by 2008, but if the same amount had been invested in stocks, it would have grown to $710. (Of course, the recent drop in stock values likely narrowed the gap.)
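For readers inclined to translate those cumulative figures into annual rates, a back-of-the-envelope calculation makes the gap concrete. The sketch below is only an illustration: the $210 and $710 end values come from the report cited above, while the 23-year horizon (1985 to 2008) and the compounding formula are assumptions of the sketch, not claims from the report.

```python
# Back-of-the-envelope check of the growth figures cited above.
# Assumption: a 23-year horizon (1985 to 2008); the $210 and $710
# end values are the report's figures for a $100 initial investment.

def annual_rate(start, end, years):
    """Compound annual growth rate implied by a start and an end value."""
    return (end / start) ** (1 / years) - 1

years = 2008 - 1985  # 23 years
print(f"Home:   {annual_rate(100, 210, years):.1%} per year")  # roughly 3.3%
print(f"Stocks: {annual_rate(100, 710, years):.1%} per year")  # roughly 8.9%
```

In other words, the report’s numbers imply annual growth of roughly 3 percent for the home versus roughly 9 percent for stocks, before accounting for rent saved, mortgage interest, taxes, or risk.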

Some of the advantages of homeownership may be double-edged. Stability has virtues, but the flip side is inflexibility. Especially for low-income homeowners, owning a home can become a trap, preventing them from escaping distressed neighborhoods. For people at all income levels, owning a home may keep them from moving to where jobs are.

In the mid-1990s, Andrew Oswald, a British economist at the University of Warwick, began to notice a correlation between national rates of homeownership and unemployment. Among industrialized nations, Spain had the highest rates of both, while Switzerland had the lowest rates of both. Other variables, such as the generosity of the welfare state, didn’t seem to matter nearly as much. He believes a high homeownership rate undercuts the efficiency of the economy: not only does it contribute to joblessness, but workers may take jobs for which they are not ideally suited, based on location rather than skills.

“There’s been a presumption that it’s really good for a country to have a high rate of homeownership,” says Oswald. “But that homeownership equates with inflexibility.”

The evidence is mixed on whether homeowners are more civically engaged than renters. But to the extent that they are, their influence in some cases has undesirable societal repercussions. Since houses are the major asset for so many families, homeowners naturally want to protect their property values. This often leads to zoning laws that make it difficult to construct commercial or additional residential buildings. Such laws erect barriers to entrepreneurs and reduce overall housing affordability.

Moreover, homeowners are likely to have longer commutes than renters, as New York Times columnist Paul Krugman has pointed out. That’s because they buy houses on inexpensive land, farther from city centers; this contributes to sprawl and congestion. As awareness has grown about climate change, more and more analysts are citing environmental factors as a reason to prefer renting and the relative density that typically goes along with it.

Given the mixed evidence for the benefits of homeownership for both individuals and society, does it warrant major government promotion?

The mortgage interest deduction has aroused widespread criticism. According to many detractors, this tax break does not even foster homeownership; it merely encourages the affluent to buy bigger houses. “When they go to deduct interest, it has a more powerful effect for them,” says Belsky. “They’re buying more house.” For this, the government forgoes large amounts of tax revenue. President Obama’s budget proposes reducing the deduction for the top tax bracket, although this provision may be difficult to pass.

Some believe any subsidy for homeownership is misguided, given the uncertain evidence for its benefits, and the indications of downsides. Low-income people may be seen as most deserving of support in buying homes, but in some ways they also stand to lose the most, as has become clear in the recent crisis. People without stable jobs or family situations may find it difficult to meet the responsibilities of homeownership, and efforts to help them buy homes could backfire by ending in foreclosures.

A number of proposals have been advanced by critics of the status quo. Dean Baker, codirector of the Center for Economic and Policy Research, advocates greater protection for renters, so that they are ensured certain standards of quality and “some security,” he says, “so that your landlord can’t just say ‘I want you out.’ . . . Renters shouldn’t be second-class citizens.” (A few cities, such as New York, already offer some strong protections.) The National Multi-Housing Council advocates incentives for rental housing. In the housing stimulus bill passed last July, it successfully lobbied for increased funding for the Low-Income Housing Tax Credit program, which supports the production of affordable rental housing.

Some economists, such as Gyourko, don’t believe in subsidies for either alternative.

“I think we should have a level playing field,” he says. “There’s no reason to subsidize homeownership significantly.”

The emotional tug of owning a home can’t be discounted. But in recent years, as jobs have become less stable, environmental concerns have risen, and the costs of owning a house have become apparent, the case for renting has become more compelling. According to Eric Belsky, “People are saying, ‘Hey, it’s OK to rent.’ ” Instead of starting with a presumption in favor of homeownership, he asks, “Why don’t we help people make informed choices?”

Learning from slums

Posted on: March 1st, 2009 by Rebecca Tuhus-Dubrow

Not everybody liked “Slumdog Millionaire” as much as the Oscar committee did. Aside from slum dwellers offended by the title, some critics lambasted its portrait of life in Dharavi, the biggest slum in Mumbai, as exploitative. A Times of London columnist dubbed it “poverty porn” for inviting viewers to gawk at the squalor and violence of its setting.

But according to a less widely noticed perspective, the problem is not just voyeurism; it’s the limited conception of slums, in that movie and in the public mind. No one denies that slums – also known as shantytowns, squatter cities, and informal settlements – have serious problems. They are, as a rule, overcrowded and unhealthy, and they are emblems of profound inequality. But among architects, planners, and other thinkers, there is a growing realization that they also possess unique strengths, and may even hold lessons in successful urban development.

The appreciation can come from unlikely quarters: In a recent speech, Prince Charles of England, who founded an organization called the Foundation for the Built Environment, praised Dharavi (which he visited in 2003) for its “underlying, intuitive ‘grammar of design’ ” and “the timeless quality and resilience of vernacular settlements.” He predicted that “in a few years’ time such communities will be perceived as best equipped to face the challenges that confront us because they have built-in resilience and genuinely durable ways of living.”

He echoes development specialists and slum dwellers themselves in arguing that slums have assets along with their obvious shortcomings. Their humming economic activity and proximity to city centers represent big advantages over the subsistence farming that many slum dwellers have fled. Numerous observers have noted the enterprising spirit of these places, evident not only in their countless tiny businesses, but also in the constant upgrading and expansion of homes. Longstanding slum communities tend to be much more tightknit than many prosperous parts of the developed world, where neighbors hardly know one another. Indeed, slums embody many of the principles frequently invoked by urban planners: They are walkable, high-density, and mixed-use, meaning that housing and commerce mingle. Consider too that the buildings are often made of materials that would otherwise be piling up in landfills, and slums are by some measures exceptionally ecologically friendly. Some countries have begun trying to mitigate the problems with slums rather than eliminate the slums themselves. Cable cars are being installed as transit in a few Latin American shantytowns, and some municipal governments have struck arrangements with squatters to connect them with electricity and sanitation services.

And there are thinkers who take the idea a step further, arguing that slums should prompt the rest of us to reconsider our own cities. While the idea of emulating slums may seem absurd, a number of planners and environmentalists say that we would do well to incorporate their promising elements. One architect, Teddy Cruz, has taken the shantytowns of Tijuana as inspiration for his own designs; he is currently working on a development in Hudson, N.Y., that draws on their organically formed density.

“We should not dismiss them because they look ugly, they look messy,” says Cruz, a professor at UC San Diego. “They have sophisticated, participatory practices, a light way of occupying the land. Because people are trying to survive, creativity flourishes.”

To be sure, there is something unseemly in privileged people rhapsodizing about such places. Prince Charles, for all his praise, does not appear poised to move to a shack in Dharavi. Identifying the positive aspects of poverty risks glorifying it or rationalizing it. Moreover, some of the qualities extolled by analysts are direct results of deprivation. Low resource consumption may be good for the earth, but it is not the residents’ choice. Most proponents of this thinking agree that it’s crucial to address the conflict between improving standards of living and preserving the benefits of shantytowns.

But given the reality that poverty exists and seems unlikely to disappear soon, squatter cities can also be seen as a remarkably successful response to adversity – more successful, in fact, than the alternatives governments have tried to devise over the years. They also represent the future. An estimated 1 billion people now live in them, a number that is projected to double by 2030. The global urban population recently exceeded the rural for the first time, and the majority of that growth has occurred in slums. According to Stewart Brand, founder of the Long Now Foundation and author of the forthcoming book “Whole Earth Discipline,” which covers these issues, “It’s a clear-eyed, direct view we’re calling for – neither romanticizing squatter cities nor regarding them as a pestilence. These things are more solution than problem.”

The word “slum” itself is controversial and slippery. In the United States, it is often used to refer simply to marginalized neighborhoods, but in developing countries, it usually means a settlement built in or near a city by the residents themselves, without official authorization or regulation. Housing is typically substandard, and the infrastructure and services range from nonexistent to improvised.

There is nearly as much diversity among informal settlements (a term sometimes used in preference to the more loaded “slum”) as in their formal counterparts. They include a wide range of economic levels and precariousness. In Kenya, about a million people live in Kibera, outside the city center of Nairobi. Its huts are built of mud and corrugated metal, trash is everywhere underfoot, and “flying toilets” – plastic bags used for defecation and then tossed – substitute for a sanitation system. In Istanbul, by contrast, where the city government has been more sympathetic, some squatter areas have water piped into every home.

Without some degree of government support, slums tend to be fetid and disease-ridden, and until a few decades ago, the most popular approach to solving their problems was to demolish them. In the 1960s and 1970s, Brazil, for example, razed many of its slums, called favelas, and relocated residents to government housing. But since then, a new idea has emerged in development circles: that such settlements are more than eyesores; they are the product of years of residents’ labor, and legitimate communities that should be improved rather than erased.

“One of the misconceptions is that they’re endless seas of mud huts,” says Robert Neuwirth, author of “Shadow Cities: a Billion Squatters, a New Urban World,” who spent two years living in squatter communities. “There’s a tremendous amount of economic activity – stores, bars, hairdressers, everything.”

An early reappraisal came in the book “Freedom to Build: Dweller Control of the Housing Process” (1972), edited by John F. C. Turner and Robert Fichter. Some of the contributors had closely studied squatter communities in the developing world, and the book argued that when people had autonomy over their housing and their environments, the residents and the settlements thrived. The development community began to recognize the drawbacks of evicting people and relocating them, which can be “incredibly traumatic,” says Diana Mitlin, senior research associate at the International Institute for Environment and Development in the UK. In 1975, the World Bank officially changed its position to endorse upgrading instead of new site development for squatters.

More recently, shantytowns have been reassessed in light of the growing awareness of the benefits of urbanization. Cities provide myriad economic opportunities that are lacking in the countryside, which is why millions of people stream in every month. They also offer freedom – especially, notes Brand, for women, who find greater access to jobs and education, as well as healthcare. Birthrates tend to fall when families move from villages to cities, not only thanks to family planning services, but also because more children, an asset on the farm, are a burden in the city.

What’s more, cities are increasingly seen as good for the planet. Aside from slowing population growth, they’re also more efficient in their use of resources, and allow abandoned land in the country to regenerate.

Most of these benefits, of course, would accrue even if migrants were moving to apartments in fashionable districts. But in practice, urbanization means the movement of poor people into slums. And while this reality certainly poses challenges, in the past few years, some analysts have begun to see slums as not simply the only realistic option, but as having certain advantages over formal settlements, especially the government-built high-rise projects where the poor are often housed.

Shantytowns are “pedestrian-friendly. There are small alleyways, the streets are narrow. Children can play in the streets,” says Christian Werthmann, a professor of landscape architecture at Harvard. Some frustrating parts of slum life – the close quarters and the need to cooperate with neighbors in endeavors like obtaining services – have an upside: they can contribute to a strong sense of community. And although many shantytowns are dangerous, some actually have very low crime rates. Writing recently in the New York Times, two researchers affiliated with the Indian nonprofit Partners for Urban Knowledge Action and Research defended the highly developed slum of Dharavi as “perhaps safer than most American cities,” protected by the watchful eyes of close-knit neighbors.

There is an ethos of self-reliance in communities independently built and continually rebuilt by their residents. Over the course of years or decades, residents may upgrade from cardboard to corrugated metal to brick, and add new floors on top. They are invested in their creations, and typically prefer them to the feasible alternatives. “When people are relocated to places where government thinks they can be housed in a better way, they often move back,” says Hank Dittmar, chief executive of Prince Charles’s Foundation for the Built Environment. Living in a legal neighborhood would usually mean more money for less space, without the prospect of improving or expanding. And it might entail constraints that don’t apply in the slums – for instance, zoning laws about where it’s acceptable to operate businesses.

Another major concern of contemporary urban planners is ecological sustainability, and shantytowns get high marks for that, too. Teddy Cruz, who has spent a great deal of time in Tijuana, says, “These slums have been made with the waste of San Diego. . . . Aluminum windows, garage doors. Debris is building these slums.”

Still, most shantytowns remain difficult and unhealthy places for people to live and grow up. They are also reviled by their wealthier neighbors, and as cities expand, sometimes they find themselves in the crosshairs of developers eager to build on their prime real estate. Some countries continue to clear slums: In 2005, Zimbabwe perpetrated brutal demolitions, called Operation Drive Out Trash, which left hundreds of thousands of settlers homeless. Dharavi is located in the heart of Mumbai, and plans have been underway to develop high-rises and high-end commercial ventures in that area. Following protests, the plans will now be reviewed by an advisory group that includes some residents.

In a number of countries, government and aid organizations have been working with squatters to retrofit slums. Brazilian favela dwellers, who are voters, have obtained concessions such as hookups to water mains and electricity. Squatters in many cities have established their own activist organizations, which work together under an umbrella group called Shack/Slum Dwellers International. Jockin Arputham, the group’s president (and head of India’s national slum-dweller organization) recalled in a published interview that years ago he led a large group of children in collecting garbage in their community and depositing it in front of the municipal council’s offices. “[W]e showed them the garbage problem in our settlement and began a negotiation,” he told the journal Environment & Urbanization. “We said that we would organize the garbage collection if the municipality would provide the truck to collect it regularly.” The gambit worked.

There is debate about whether the informality itself is a plus or a minus. Hernando de Soto, a Peruvian economist, has argued that slum dwellers should be given title deeds for their plots, in order to liberate the “dead capital” they are sitting on – to enable them to get loans from banks. But many analysts are skeptical of this proposal. One problem is that individual property rights could disrupt the stable system of communal control that has evolved in many slums. Another possibility is that residents might quickly sell their new deeds for cash, and thus lose the rights to their longtime homes.

There are also downsides to retrofitting slums. According to Ciro Biderman, a fellow at the Lincoln Institute of Land Policy, upgrading is much more expensive than building a new settlement with infrastructure in place from the outset, and amounts to a subsidy he considers unfair to poor people who do not live in slums. Another concern is that shantytowns are sometimes built on environmentally fragile terrain, such as steep hillsides or wetland areas – in those cases, helping residents stay in place can be both dangerous for the inhabitants and ecologically damaging.

Meanwhile, some observers in the developed world have been asking, what if the laudable aspects of these informal communities could be disentangled from the unfortunate parts? To build housing for low-income people, Cruz has drawn inspiration from Tijuana shantytowns for developments in Southern California, and is currently working on the one in Hudson. It will include communal porches and terraces, and spaces meant to encourage small start-up businesses – for example, providing room to store sewing machines. The intention is to integrate a poorer immigrant population into the area by creating openings for a community to evolve. He calls his vision “club sandwich urbanism – layering. It occurs through time. Our planning institutions never think about time.”

Cruz and Neuwirth say we can also learn from the spirit of collaboration in informal settlements, and their ingenuity in the use of space. Their richness suggests to some that the dominant American mode of living, for all its suburban comforts, has come at a price. Municipalities might want to reconsider zoning laws to allow residences to double as businesses, says Cruz: he imagines small enterprises being run out of garages. In Werthmann’s view, we might also emulate the low-rise, high-density model, which is conducive to neighborliness and requires no elevators.

On a more basic level, these places can teach us about where, for better or worse, urban life appears to be headed. “Squatters are the world’s dominant builders,” says Brand. “If you want to understand what’s going on in cities, look at squatters.”

Group Think

Posted on: November 23rd, 2008 by Rebecca Tuhus-Dubrow

FOR SCHOLARS — ESPECIALLY scholars who like to wear pajamas — the Internet has been a godsend. It allows instant communication with colleagues around the globe, and makes tracking down published research a matter of seconds.

But perhaps the greatest boon is the sheer quantity of readily accessible knowledge. Millions of journal articles are available online, enabling scholars to find material they never would have encountered at their university libraries. From classic psychology studies to the most esoteric literary theory, it’s all just a few clicks away.

A recent study, however, suggests that despite this cornucopia, the boom in online research may actually have a “narrowing” effect on scholarship. James Evans, a sociologist at the University of Chicago, analyzed a database of 34 million articles in the sciences, social sciences, and humanities, and determined that as more journal issues came online, new papers referenced a relatively smaller pool of articles, which tended to be more recent, at the expense of older and more obscure work. Overall, Evans says, published research has expanded, due to a proliferation of journals, authors, and conferences. But the paper, which appeared in July in the journal Science, concludes that the Internet’s influence is to tighten consensus, posing the risk that good ideas may be ignored and lost — the opposite of the Internet’s promise.

“Winners are inadvertently picked,” says Evans. “It drives out diversity.”

This study adds weight to concerns, shared by other Internet analysts, that the rise of online research has costs as well as benefits. Internet search tools are not neutral: they tend to privilege the new and the popular. And for all the frustrations of older research methods, their very inefficiency may have yielded rewards. Leafing through print journals or browsing the stacks can expose researchers to a context that is missing in the highly targeted searches of PubMed or PsycINFO. The old-fashioned style of browsing, some say, can provide academics with more background knowledge, and lead to serendipitous insights when they stumble upon articles or books they weren’t necessarily looking for.

Yet there is vigorous debate over the Internet’s effects, and the Evans research has proved controversial. A University of Quebec researcher, Vincent Lariviere, has coauthored a forthcoming paper that challenges some of its conclusions. (Evans plans to publish a rebuttal.) Another researcher, Carol Tenopir at the University of Tennessee at Knoxville, says that she has not studied citations, but that her surveys of reading patterns show the reverse of a narrowing effect.

“Electronic journals, I can say with confidence, have broadened reading,” says Tenopir.

This debate has important implications for the academic world, but it also has wider significance. More and more, the Internet dominates everyday life. Our daily experience — what we watch, listen to, and read; the people we date and the friendships we maintain — is increasingly shaped by the vast information landscape of the Internet, and how it is filtered for personal use.

Different interpretations notwithstanding, many experts agree that we have only begun to understand the repercussions of our Internet consumption. This dim awareness, despite the interactive ethos of Web 2.0, leaves us more passive than we may feel, in the grip of seismic change.

“We have an opportunity to maximize the good effects, and minimize the bad effects,” says Katrina Kuh, a law professor at Hofstra University. “We’re missing that opportunity.”

Inevitably, the discussion of these questions turns to the theory of the “long tail,” articulated by Chris Anderson, editor in chief of Wired magazine. Anderson’s argument focuses on consumer choices, positing that the Internet reduces the “blockbuster effect” — whereby consumers settle on a few big hits — and disperses attention over a wider range. Web users gain access to obscure books, movies, and other products that might have escaped their notice without the Internet, and that conventional “brick-and-mortar” stores wouldn’t find it worthwhile to stock. As a result, the theory goes, consumers can satisfy their idiosyncratic preferences rather than following the herd. This is a key part of how we think of the democratic potential of the Internet: We each find our own niche, hierarchies are abolished, and diversity thrives.

Yet in a recent article in the Harvard Business Review, Harvard Business School professor Anita Elberse challenged that premise. Examining recent music and DVD sales, she found greater concentration, not less. For example, from 2000 to 2005, the number of titles in the top 10 percent of weekly home video sales fell significantly, by more than 50 percent. Her paper concludes: “The importance of individual best sellers is not diminishing over time. It is growing.”

This view tallies with Evans’s observations about scholarship, and those of several other analysts about the Internet more generally. The explosion of online materials has two somewhat contradictory effects. The scope of available information expands, remarkably so; but as a consequence, the information needs to be filtered somehow.

To make sense of this overwhelming sea of data, search tools must present results in some kind of order. Scholars, like other Internet users, rely on tools that rank results primarily in two ways: in reverse chronological order, and by popularity. (Google’s algorithms, for example, take into account the number of times a website is linked from other websites.)

Social Science Research Network (SSRN), the widely used Internet resource, offers lists of “top papers,” “top authors,” and “top institutions.” A paper titled “‘I’ve Got Nothing to Hide,’ and Other Misunderstandings of Privacy,” by George Washington University law professor Daniel J. Solove, has hovered at the top of the rankings for months, with a total of 65,846 (and counting) downloads. If you click on a given paper, you can even see a graph depicting the paper’s “raw score” (total new downloads) over time. On many other academic websites, it is standard to present the newest articles first.

These search tools clearly have the potential to open up research. Sean Franzel, a professor of German studies at the University of Missouri at Columbia, studies the effect of different media on scholarship. In his own experience, online searches often bring up results from minor journals he never would have thought to consult. In this way, Internet use “takes you further afield than you otherwise might have gone.” And many Internet users protest that online serendipity is certainly possible, indeed common. Just as researchers may come across unexpected articles in a table of contents, they might see intriguing but not directly relevant articles in lists of top papers and sidebars of related papers.

Even the narrowing effect Evans diagnoses can have advantages. There are benefits to sharing common knowledge and reference points. In this sense, online winnowing could restore the “water cooler” culture of which the Internet and other technologies have supposedly robbed us. In scholarship, convergence facilitates communication and progress. As Evans says, it’s “not so different from the effect of shared language.”

But several observers perceive losses as well. According to Alex Bentley, an anthropologist at Durham University in England, this tendency “makes academic research a popularity contest. My hypothesis is that the way that we latch onto ideas is going to become more fashion-based.”

Naturally, papers that are ranked high in “top papers” lists are more likely to get downloaded again, in part because they are the easiest to find, but also because their position enhances their legitimacy. “When people become more aware of each other’s choices, they factor those choices into their own activities,” says Evans. One threat is that these decisions can accumulate to amplify an initial choice that might have been arbitrary.
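One way to see how such feedback can amplify an essentially arbitrary early advantage is a simple cumulative-advantage simulation. The sketch below is a hypothetical illustration, not drawn from Evans’s data: it assumes twenty papers that start out identical and assigns each new download with probability proportional to a paper’s current download count.

```python
# A minimal cumulative-advantage ("rich get richer") simulation.
# Hypothetical illustration: papers start out identical, but each new
# download goes to a paper with probability proportional to its current
# count, so small random early differences snowball into large gaps.
import random

random.seed(42)
papers = [1] * 20            # 20 papers, each seeded with one download
for _ in range(10_000):      # simulate 10,000 subsequent downloads
    chosen = random.choices(range(len(papers)), weights=papers)[0]
    papers[chosen] += 1

print(sorted(papers, reverse=True))
# A handful of papers typically end up with most of the downloads,
# even though none was intrinsically "better" at the start.
```

The point of the exercise is not the particular numbers but the shape of the outcome: rankings driven by popularity tend to concentrate attention even when the underlying quality is uniform.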

Some scholars lament other lost aspects of print resources. Indexes and tables of contents provide a context, giving a broader snapshot of the field at a given time. And after consulting them, the researcher must make a considered decision to take the next step. By contrast, in online searches, the researcher tends to follow hyperlink to hyperlink, in a journey that resembles “a plunge down a rabbit hole,” in the words of Robert Berring, a law professor at the University of California at Berkeley who has studied the impact of electronic media. “If you get to an index, a table of contents, you see the environment that surrounds it. In the culture of paper, a lot of these signals are important.”

If the narrowing effect is real, what is to be done about it? One possibility is that online tools will emerge to counteract it. Evans himself is developing a system that would use sophisticated statistical analysis to find papers containing statements that agree or disagree with other statements, “rather than treating papers as bags of words,” he says.

Some experts are skeptical that innovative search tools will enter widespread use soon. Instead, they say, it’s up to individuals to be more conscious of the limitations of the current tools.

This caution has far-reaching importance. Harvard law professor Cass Sunstein has written about the ways that, contrary to mythology, the Internet can have detrimental influences on democracy, as people retreat to their virtual bubbles.

Many Internet users customize their consumption of news sources and other information in a way that fosters polarization. This, it could be argued, has elements both of the narrowing effect and the long tail. Americans seek out sources that reflect their personal beliefs, consistent with Anderson’s vision. But, akin to the narrowing Evans observes, large groups — liberals and conservatives — converge on different reference points, resulting in mutually unrecognizable versions of reality. The common lesson of all of these phenomena is to be cognizant that the tools we use affect us in ways we may not fully appreciate. We should always be searching, the findings suggest, for new ways to search.

Waste? Not

Posted on: July 13th, 2008 by Rebecca Tuhus-Dubrow

IN A WORLD of rapidly diminishing resources, there’s one we tend to overlook. It’s easy to produce and extremely abundant. But instead of viewing it as an embarrassment of riches, we’re more likely to see it as just an embarrassment.

This neglected treasure is human waste. Urine is rich in nitrogen, potassium, and phosphorus, the three main ingredients in artificial fertilizer. Feces contains these nutrients, too, in smaller doses, and the methane it produces can be harnessed as biogas, a green energy source. Yet in most cultures, understandably, the first impulse is not to use waste wisely, but to get rid of it as quickly as possible. In many rural, undeveloped areas, people simply “go” in the bush, or by the closest river. In advanced industrialized societies, we flush it away.

Both methods — and several others between the extremes — pose problems that grow more conspicuous every day. As the developing world has grown more crowded and urban, the lack of adequate sanitation has become a public health crisis. In America and other developed countries, the system works much more smoothly, but uses enormous quantities of clean water — about 4,000 gallons per person each year — and requires massive amounts of energy and money to treat the resulting sewage.

But now a growing global movement aims to make sanitation more sustainable by changing how both rich and poor countries think about human waste — recasting it as a valuable resource that is most costly when thrown away. Following a philosophy known as ecological sanitation, or “ecosan,” and fueled by a convergence of factors — the rising prices of energy and artificial fertilizer, increasing worries about food security, and concern for the environment — the push to reform sanitation has gained currency around the world, driving innovations from toilet design to farming practices. And some sanitation reformers say they are even making headway into the most vexing question: How to get people to see promise in a substance they are taught from birth to find revolting.

“There’s been a lot of resistance and disbelief that anything like this can work,” says Mayling Simpson-Hebert, a technical adviser with Catholic Relief Services in East Africa. “That seems to be changing.”

Simpson-Hebert has helped to introduce a toilet called the arborloo — in which a fruit tree seedling is planted in a waste-filled pit — to thousands of Ethiopian farmers in the past few years. Numerous other projects are underway in the rest of the developing world. And the idea has started to gain traction in the developed world as well. In Europe, recent years have seen the advent of “urine diversion” toilets, which separate the two kinds of waste in order to treat it more efficiently, among other benefits. Locally, a private school in Weston has installed flushless compost toilets manufactured by Clivus Multrum, a company based in Lawrence. And several European pilot projects have begun to experiment with vacuum-biogas toilets, which require very little water and turn waste into energy.

Not everyone shares the enthusiasm for these sanitation technologies. Skeptics point to the cost, health concerns, and challenge of changing deeply ingrained habits and beliefs. Depending on the particular kind of system, the changes could entail a different experience of the toilet, or a different attitude toward the waste, or both.

For some proponents of sanitation reform in developed countries, that’s part of the point: changing everyday behavior is going to be key to solving our ecological crises. According to Arno Rosemarin, research and communications manager at the Stockholm Environment Institute, our current “flush and forget” system makes it too easy to ignore the repercussions of waste disposal. If we are going to make meaningful changes in our environmental impact, the reasoning goes, perhaps we should start by thinking differently about the emissions that we ourselves produce.

. . .

The idea of recycling our feces and urine may seem surprising, and perhaps disgusting, but the concept is hardly new. China and Japan have long traditions of re-using human waste as fertilizer. Even in England, as recently as the 19th century, “nightmen” would take human waste from backyards to sell to farmers.

But that was before the British “sanitation revolution.” Exactly 150 years ago this summer, the river Thames in London overflowed with human waste in what was known as the Great Stink, forcing Parliament, located on the banks of the Thames, to take action. Sewers were subsequently installed, eventually resulting in major public health advances.

The flush toilet and its infrastructure have since become standard throughout the developed world. Excreta flow out of sight to a sewer system, and then to a waste treatment plant. In more remote areas, the sewage goes to nearby septic tanks that must be periodically emptied. The system’s benefits are obvious, but it also has downsides that are growing increasingly apparent.

Annually, each of us produces about 13 gallons of feces and 130 gallons of urine, which are instantly diluted in the roughly 4,000 gallons of clean water we use to flush them away. This large quantity of contaminated liquid further mixes with “greywater,” the water from the laundry, shower, and sink, tripling or quadrupling the amount of water that must be treated as sewage in energy-intensive plants. In effect, the system takes a relatively small amount of pathogenic material — primarily the feces — and taints enormous amounts of water with it. Especially in regions struggling with freshwater scarcity, many observers have come to see this system as highly inefficient. “It’s a totally insane idea,” says Rosemarin.
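
To make the scale of that dilution concrete, here is a rough back-of-envelope calculation, written as a short Python sketch that uses only the figures cited in this article; the tripling-or-quadrupling greywater multiplier is the estimate above, not a measured value.

    # Back-of-envelope sketch of annual, per-person sewage volumes, using
    # the figures cited in this article (all values in gallons per year).
    FECES = 13           # feces produced per person
    URINE = 130          # urine produced per person
    FLUSH_WATER = 4_000  # clean water used to flush it away

    excreta = FECES + URINE                # ~143 gallons of actual waste
    toilet_stream = excreta + FLUSH_WATER  # ~4,143 gallons leaving the toilet

    # Mixing in greywater (laundry, shower, sink) triples or quadruples the
    # volume that must be treated as sewage, per the estimate above.
    low, high = 3 * toilet_stream, 4 * toilet_stream

    print(f"Dilution at the toilet alone: about {toilet_stream / excreta:.0f}x")
    print(f"Sewage to treat: roughly {low:,} to {high:,} gallons per person")
    print(f"Urine's share of that total: about {URINE / low:.0%} or less")

The point is the ratio rather than the precise totals: a bit over a hundred gallons of actual excreta ends up contaminating more than ten thousand gallons of water, and the urine alone accounts for roughly the 1 percent share discussed below.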

In this model, it’s not only water that’s wasted, critics say — it’s also the valuable nutrients in the feces and urine, notably phosphorus. Global fertilizer prices have tripled in the last year, partly due to a shortage of phosphorus, which some see as a looming crisis. Against this background, some argue that it would be folly not to capitalize on the plentiful phosphorus in human waste. In the same vein, the methane it generates has the potential to provide cheap, renewable energy.

Rose George, author of a forthcoming book about sanitation, “The Big Necessity,” says of the conventional system, “It was a solution 150 years ago and it was a very good one, but it should evolve.”

Over the past couple of decades, some measures have already begun to exploit the value of waste and improve the system’s efficiency. It has become common for treatment plants to capture some of the methane generated by their sludge and burn it as biogas to help power their own operations. Low-flush toilets and waterless urinals are small steps to conserve water. And the practice of using treated sludge — renamed “biosolids” — as fertilizer has become customary in parts of the developed world. In the United States, according to the Environmental Protection Agency, about 50 percent of all biosolids are being recycled to land. The Massachusetts Water Resources Authority turns all of its sludge into fertilizer, some of which it sells commercially through a contractor and some of which it gives to communities.

But ecosan advocates assail this practice as unsustainable and unsafe. Under the current system, household waste mixes with industrial waste, including toxic materials. Although the EPA has issued treatment regulations, and the MWRA defends the safety of its fertilizer, there are concerns about the impact of sludge-derived products on soil and human health.

The most radical visionaries of this movement would apply the same principles to sanitation that we have begun to apply to other garbage in our homes. Just as we separate plastic, cardboard, and newspaper, says Rosemarin, we should separate urine, feces, and greywater.

As a first step down this road, some companies are producing new types of toilets. One idea, pioneered in Sweden, is known as urine diversion. The basic concept is that the toilet has two receptacles for the different kinds of waste. “Don’t mix what God separates,” says Steven Sugden, a research fellow at the London School of Hygiene and Tropical Medicine who has worked on sanitation projects in Africa.

The benefits of taking urine out of the waste stream are clear: Urine makes up less than 1 percent of all waste water in developed countries, but contains a huge proportion of the nitrogen and phosphorus. Those nutrients are essential to agriculture but harmful in water bodies, and removing them is the most energy-intensive part of treating waste water. And since urine is almost sterile, it can be used as fertilizer with little to no treatment.

In Europe and Australia, there have been numerous experiments with different kinds of urine-diversion toilets. The Swedish company Roediger sells one called the NoMix, with a back compartment that functions like a standard flush toilet and a front compartment for urine — essentially a conventional toilet with a built-in urinal. First developed in the 1990s, these and other urine-diversion toilets have gradually begun to be used in Sweden, and a few municipalities have taken responsibility for collecting the urine.

Vacuum toilets, much like those on airplanes, are also catching on. These typically require less than a quart of water per flush. A promising innovation is the vacuum-biogas toilet, in which waste is sucked into a vacuum sewer system, and then transferred to a local biogas plant. Recent pilot projects have tried this technology in settlements in Holland and Germany. Hamburg is in the planning phase for a project that would give vacuum-biogas toilets to 2,000 houses, according to Ralf Otterpohl, director of that city’s Institute of Wastewater Management and Water Protection. He says the water utility is considering converting the city to this system over the next 50 years.
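
For a sense of what “less than a quart per flush” could mean over a year, here is a hypothetical comparison in the same back-of-envelope spirit; the five flushes a day and the 1.6-gallon conventional flush are assumptions chosen for illustration, not figures from the article or from the Hamburg project.

    # Hypothetical annual flush-water comparison for one person. The flush
    # count and the conventional-toilet volume are illustrative assumptions,
    # not data from the article.
    FLUSHES_PER_DAY = 5      # assumed
    CONVENTIONAL_GAL = 1.6   # assumed modern low-flow toilet, gallons per flush
    VACUUM_GAL = 0.25        # "less than a quart" per flush, as cited above
    DAYS = 365

    conventional = FLUSHES_PER_DAY * CONVENTIONAL_GAL * DAYS  # ~2,900 gallons
    vacuum = FLUSHES_PER_DAY * VACUUM_GAL * DAYS              # ~460 gallons

    saved = conventional - vacuum
    print(f"Conventional: about {conventional:,.0f} gallons a year")
    print(f"Vacuum:       about {vacuum:,.0f} gallons a year")
    print(f"Saved:        about {saved:,.0f} gallons ({saved / conventional:.0%} less)")

Under those assumptions, the savings run to a couple of thousand gallons per person per year; the real numbers would depend on household habits and on the system actually installed.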

Perhaps the simplest mechanism is a compost toilet, such as those made by Clivus Multrum. The toilets look normal from the outside, but inside, the waste drops into a dark hole. A ventilation system pulls air down to prevent odor. In the space below, the liquid and solid waste separate. The liquid can be used immediately as fertilizer, while the solid waste is stored for at least one year, with monthly raking and the addition of pine shavings, and is then ready to be harvested as compost.

“It works like your garden compost pile,” says Don Mills, the company’s sales director. “It’s low-tech, it’s no-tech. We’re just employing a process that is one of the essential processes in nature.”

Mills says the company sells mainly to parks, green buildings, and nature centers. A private school in Weston, the Cambridge School, recently installed the toilets in its new green building. Last year, similar Clivus toilets were installed at the Bronx Zoo, avoiding the need to build a large septic system or expensive sewerage, and saving over a million gallons of water a year.

There are, however, obstacles to widespread implementation of unorthodox toilets. Space limitations make compost toilets infeasible in most urban areas. Vacuum toilets require a different plumbing system. And there may be psychological barriers to changing habits in the bathroom.

For urine diversion to work, urine has to land in the front compartment, which means men would have to sit down to urinate. Although its proponents offer assurances that it’s easy for women to use, some critics question that assertion. “For a guy, that’s not too technically challenging,” says Eddy Perez, a sanitation specialist at the World Bank. “But you’ve got women of different sizes. It’s just pretty complicated from a human behavior, human physiology perspective.” Aiming could also pose problems for children.

Transforming the sanitation system in the developed world “can be done,” says author Rose George, “but it would basically require revolution.”

. . .

Partly because there is little existing infrastructure to displace, the biggest changes have so far come in the developing world. The problems there are quite different. Without proper sanitation facilities, diseases caused by ingested fecal matter are rampant; diarrhea, for example, kills more children than AIDS. But the advantages of the ecosan approach are similar, because a well-designed system allows people to harvest the benefits of waste. And given the poverty and food insecurity, these benefits are often more acutely felt.

A popular kind of toilet is called the fossa alterna, in which two 3-meter-deep pit latrines are dug side by side. Once one is filled, after about a year, it is sealed off, and the other one is used. Eventually, once the pathogens have died off, the waste in the first pit can be dug out and used or sold as fertilizer for crops. Some sanitation experts worry about health risks: If the waste is touched too soon, the toilet could exacerbate the problems it’s intended to help solve. But a growing number of rural residents in Zimbabwe, Malawi, and other countries have started to prize the product they reap.

A more sophisticated system, used most often in urban areas, allows groups of families, as well as schools, to produce their own biogas. Vegetable scraps, grass, and human excreta are collected in a pit; as they decompose, they produce methane, which is captured in a tube and channeled to a kitchen stove or shower. The UN is involved in such projects in India and Senegal, among other places.

One of the most successful efforts has unfolded in Ethiopia. Starting in 2005, Catholic Relief Services introduced a toilet called the arborloo to extremely poor Ethiopian farmers. “All of the other toilet options we had introduced over the years had failed,” says Mayling Simpson-Hebert.

The arborloo is a shallow pit latrine that costs only $5. When it’s filled, the farmer plants a fruit tree seedling. The farmers are given two seedlings, one to plant in the arborloo, and another as a control. The comparison enables them to observe that the one in the arborloo grows much faster and produces more fruit. The farmers can eat the fruit and sell it on the market. Today more than 26,000 farmers are using these toilets, according to Simpson-Hebert, with strong support from the Ethiopian government.

This simple device has brought about the kind of change in thinking that reformers hope will eventually take root in the developed world.

“Some of our farmers say, ‘We used to think poop was dirty, but now it’s our gold,’ ” says Simpson-Hebert. “They won’t let their children defecate in the open. They say, ‘Go put your gold in the toilet.’ ”