
Alzheimer’s Alert

When's a good time to diagnose an incurable disease?

A version of this article was originally published on April 29, 2011, in Slate.

Last week, new guidelines for diagnosing Alzheimer’s defined a “preclinical” stage of the dreaded disease. Evidently, the telltale pathology—in particular, the plaques that encroach on the brain—can be detected years, if not decades, before the patient ever forgets a familiar name or neglects to feed a pet.

The announcement renewed a debate that has flared in recent months: Since there’s no cure, critics believe that an early diagnosis of Alzheimer’s would serve only sadistic doctors, masochistic patients, and greedy business interests. They worry that Big Pharma will sell snake oil to a huge, desperate market, and that health insurance companies and employers could use the information against patients. Others, however, point to the benefits of advance notice. You might take that long-deferred trip to Antarctica, for example, or try to squeeze in extra visits to the elliptical machine. (There is some evidence, albeit inconclusive, that exercise helps stave off the mind’s deterioration.)

Complicating matters, the “biomarkers” that show up in an early diagnosis do not necessarily lead to symptoms. For unclear reasons, some brains seem to function well despite the incursion, while others succumb more readily. In many cases, patients die from other causes before the plaques wreak havoc. Given the gaps in knowledge, the guidelines stress that the tests are for research purposes only. (The idea is that studying the earliest manifestations of the disease will illuminate its genesis and ultimately yield therapies that keep symptoms at bay.) Yet some are concerned that before long, biomarkers will be used to test ordinary patients outside of research settings. What are the implications of diagnosing an incurable disease in seemingly healthy people?

Doctors, patients, and bioethicists have been grappling with this question for years. For most of human history, unpleasant and obvious symptoms indicated disease. You knew you were sick thanks to your projectile vomiting, or the searing pain in your head; or perhaps you were tipped off by the suppurating sores. To some extent we still rely on these signs of illness, but increasingly, people are notified of their disease—or their propensity for it—by lab results. While a diagnosis of a mysterious ailment can be something of a relief, a diagnosis of pathology when you feel perfectly healthy is more like a condemnation.

Take the example of the BRCA1 and BRCA2 genes, discovered in the mid-1990s. Certain mutations of these genes dramatically increase the risk of breast and ovarian cancer. Masha Gessen, who tested positive, explored her experience with humor and insight for Slate, and in her book Blood Matters. In this case, patients can at least take some action, though the options are hardly appetizing: Gessen chose a preventive double mastectomy.

A better analogue to Alzheimer’s is Huntington’s disease, a degenerative neurological disorder that leads to uncontrollable movements and dementia (and often suicide). If one of your parents is unfortunate enough to have this incurable disease, your chances of getting it are 50-50. Symptoms usually don’t appear until early middle age, but genetic testing for presymptomatic diagnosis has been available for years.

After the genetic marker for Huntington’s was discovered in 1983, there were serious ethical concerns about testing, most of which now sound familiar. The tests were not infallible, so inaccurate diagnosis was a threat. Another worry was that people who tested positive would be pressured not to reproduce. Then there was the psychological impact of the diagnosis. The first imperative of medicine is “Do no harm,” and delivering such distressing news seemed like it might violate that precept.

Before the test became widely available, health care providers collaborated with patients and family members to formulate a set of standards for its use. Several patient-advocacy associations developed guidelines establishing a patient’s right to refuse the test and attempting to protect confidentiality. At-risk people met with a genetic counselor, a psychologist, and a medical geneticist for advice, and this preparation could last up to two years. Patients were encouraged to imagine their responses to various outcomes and, eventually, assimilate the results. (People who undergo testing for the BRCA genes also frequently meet with genetic counselors.)

Over the years, researchers have examined the repercussions of the tests and identified both benefits and harms. Unsurprisingly, gene-positive results lead to depression, anxiety, and isolation; worries about employment prospects; and regret over the knowledge of a difficult future. But there are also upsides: the end of agonizing uncertainty, emotional connection to gene-positive relatives, and the ability to focus on the important things in life. Those who learn they do not have the mutation, of course, feel tremendous relief.

And yet, despite the parallels, Alzheimer’s is different from Huntington’s in several fundamental ways. Huntington’s is a rare disease, affecting only about 30,000 Americans. Alzheimer’s dementia currently afflicts 5.4 million Americans, a figure that is projected to reach 13.5 million by 2050. Given those numbers, the costs of preclinical testing and counseling for everyone who is even at risk of Alzheimer’s would be astronomical. What’s more, for all its devastation, the disease usually allows its victims a long, normal life before the ugly descent into illness. The stakes of early diagnosis are simply not as high as they are for Huntington’s.

As of now, given the uncertain relationship between the biomarkers and dementia, an early diagnosis would seem to offer little of use. As the new guidelines point out, the greatest risk factor for Alzheimer’s is advanced old age—and you don’t need a brain scan or a spinal tap to tell you you’re a very senior citizen. At that point, even if you dodged the fate of Alzheimer’s, you could fall prey to another kind of dementia. In the end, we’re all biomarked for death and decline.


