I had a hand in shaping the policies of two hospitals for the triage of critical care resources during the present pandemic. In one case, my contribution was quite modest: The policy was largely complete, and I participated in its review and assessment. In the other case, my contribution was more substantial: I wrote the policy from scratch, though drawing amply from publicly available models and from discussions in the medical ethics literature.
From my work, I can report the good news that, while there are differences in policies across hospital systems in the United States, they agree in rejecting the exclusion of whole categories of people from care: for example, all people with intellectual disabilities, or all people older than, say, 75. They also agree in rejecting as a relevant factor “the social worth of the sick person”—unlike, in this regard, a set of recommendations from Spain—though, as a matter of practice, we have already seen that famous or wealthy people, like basketball players and actors, had access to testing that was otherwise limited or denied.
The bad news, or at least the news that needs consideration, is that the policies of hospitals across the country, as well as the proposals in the medical ethics literature for what those policies should be, differ on the critical question of whether “expected life years” should be taken into account. In other words, they differ on whether it is appropriate, in allocating ventilators and critical care beds, to factor in how much longer a person is likely to live beyond recovery from his or her acute illness. All hospitals want to save the most lives. Should they also be seeking to save the most life years?
Two influential articles in The New England Journal of Medicine and the Journal of the American Medical Association answer emphatically that yes, they should. The argument is not at all that younger people are more valuable, intrinsically or even instrumentally, than older people. Instead, both articles claim that, stricken by a severe, life-threatening illness, younger people are “worse off” than older people likewise stricken because younger people stand to lose more. They risk being deprived of more years of life, and so presumably of more of the goods that life has to offer, goods that older people may already have enjoyed (going to school, choosing a profession, embarking on a career, raising children, becoming grandparents, achieving life goals and more).
As the JAMA article notes, the argument has an intuitive appeal: “The moral intuition of many people would support prioritizing a patient who stands to otherwise lose 40 years of life, compared with one with a chronic illness that will in all likelihood result in death within a few years.” Critics counter, however, that not only may our intuitions be biased and ill-informed, but they may also be beside the point. In the words of an article in the Annals of Internal Medicine, “In a pandemic, the critical question is the ability to survive the acute event, not long-term survival,” and the latter, moreover, is notoriously difficult to estimate.
From this perspective, patient need, prognosis and the likely effectiveness of treatment are the only relevant factors, with the upshot that seeking to save the most life years unjustifiably discriminates against the elderly. It also invites discrimination against the disabled of whatever age, since some would say that they, too, stand to lose fewer goods in dying than the non-disabled do. Arguments for taking expected life years into account sometimes disguise this implied discrimination by gesturing to the value of living through “life’s stages.” But life’s stages are valuable not in themselves, but because of the goods that are normally available at different points across the life cycle.
One of the co-authors of the NEJM article is the indefatigable Ezekiel Emanuel from the University of Pennsylvania. In 2014, he published a much-discussed article in The Atlantic on “Why I Hope to Die at 75”—in brief, he wrote, because the quality of life normally begins to decline significantly after that point, inasmuch as modern “health care hasn’t slowed the aging process so much as it has slowed the dying process.” Emanuel insists at the article’s end that he is seeking simply to delineate his own view of a good life and to stimulate other people to do the same for themselves, and that he is not making a case for rationing health care for the elderly. That might be so, but the logic of his argument in The Atlantic is the same as the logic of his argument in the NEJM: The elderly lose less in dying because old age affords fewer goods.
The philosopher Meghan Sullivan points out in her book Time Biases that Emanuel assesses the value of living beyond 75 on the basis of what he values as a younger man (he was born in 1957), which is questionable insofar as our preferences and perceptions of what is valuable often change as we age. Another criticism is that Emanuel’s argument conflates the objective value of a life with the subjective value of living it. In other words, it might well be the case that, for Emanuel, living beyond 75 and experiencing increased mental and physical decrepitude is of limited value.
But that fact would not warrant a judgment by third parties that such a life beyond 75 is of limited value, whatever the person living it happens to think, and can thus be discounted relative to other, younger lives with the likelihood of greater goods on the horizon. For it certainly is not obvious that the objective value of a life is equal to the goods it seems to offer in the present and promise in the future. Many people, philosophers included, would reject that line of thought, especially when they reflect on how it would lead us to (dis)value the lives of people with profound disabilities.
If I were 75 or older, if my children were raised and independent and if my spouse did not then depend on me for her daily living, I am confident that, in case of need in a public health emergency, I would direct that critical care resources be diverted from me to younger people. In fact, I would put that stipulation in my living will and communicate it to the person vested with my health care power of attorney. I am less confident that I would make that decision if my children were not independent or if my spouse did depend on me. One way or the other, I would not want that decision to be made for me by hospital policy in the name of public health. And I am sure I would oppose such a decision for me at my current age, 47, with young children at home.
These positions of mine seem to me ethically justifiable, though the justifications would need to delve into the ethics of partiality and special obligations. The more I think about it, however, the less justifiable it seems to me for hospitals to seek to save the most life years. Instead, as other critics have also noted, such policies seem likely to create distrust of the medical profession precisely when public trust can least afford to be jeopardized.