“It just so happens that your friend here is only MOSTLY dead. There’s a big difference between mostly dead and all dead. Now, mostly dead is slightly alive.” ~ Miracle Max
As you bundle up, you can’t suppress the wry grin.
First, there’s the plummeting temperatures which never seem to make it back up to seasonal lows. Then, the snow keeps coming down, and down, and down. And that old pair of long underwear in the dresser somewhere? You’re digging it out, and maybe buying a second pair. And then a third.
At this point, you can’t help grinning—maybe even chuckling—because the conditions outside are in such stark contrast to all those dire warnings we used to get about global warming. “Bring it on,” we say to ourselves now—but not out loud. Our shivering and senses might suggest that global warming is a bunch of malarkey, but we keep such thoughts to ourselves. I mean, we don’t want anybody thinking we’re nut jobs or intellectual cretins! No, sir!
But this isn’t a rant about climate change in any case. People way smarter than me think it’s legit, and I’ll grant them the benefit of the doubt. Still, it is difficult to take them too seriously in the winter—at least ’round these parts. Al Gore and the government, researchers and the media keep telling us one thing; our numb fingers and toes tell us another. That’s not very scientific, I know, but it is experience—it is real—and it does affect our willingness to buy what the experts are pushing.
Put another way, my cold-consciousness coming up against the global warming ascendancy represents a significant aesthetic reservation—aesthetic in the old sense of the word, denoting sensory experience as opposed to abstract knowledge. What my senses tell me seems to contradict reason—or at least the reason of those in the know. And that’s why those in the know have to keep telling us global warming is real, because our senses are telling us something so different.
This is the very same dilemma I face when considering so-called “brain death”—a phenomenon that has been in the news a bunch lately. “Brain death” is shorthand for death determined by neurological criteria, and unlike the traditional definition that relied on cardiopulmonary evidence—no heartbeat, for instance, and no breathing—declaring death neurologically is downright tricky. It relies on human judgment to make a decision regarding an ailing individual’s brain function: Is it there? How much? Is it all gone? And then, there’s this kicker: Even if it is determined to be all gone (the essence of brain death criteria), the person’s heart will keep beating on its own, as long as the lungs are mechanically assisted to breathe.
We’re left, then, with a “dead” brain in a body being perfused with oxygen-rich blood by a living heart. The individual is non-responsive, but his limbs are supple, and his skin, pink and warm—it even heals when it’s wounded. Is such an individual really dead? It’s hard to believe, and that’s why the experts have to keep telling us such people are “dead”: Just like climate change, our senses tell us the exact opposite.
This surreal realm of defining death neurologically regularly leads to the kinds of absurd situations that we’ve been hearing about in recent weeks. For example, there’s the “brain dead” pregnant woman in Texas whose family wanted her taken off “life” support, but whom the hospital “kept alive” until recently for the sake of her unborn baby. And it was the reverse problem for the “brain dead” teen in California whom her family wanted to take home and care for, but whom the hospital insisted be taken off “life” support until a court intervened.
Note that I keep bracketing words and phrases with quotation marks. That’s because the semantics of brain death don’t lend themselves very well to deliberation without constant clarifications and re-definition of terms. “Life support” is a good example. Think about it: If a person is dead—really dead, “gone to meet her maker” dead—then what purpose does “life support” serve? It’s grisly, almost Frankensteinian, to consider what kind of “life” is being supported.
The quotation marks are also necessary because, even after almost a half-century of neurologically defined death, nobody is quite sure when it applies. Consequently, and not surprisingly, “mistakes” (there’s those quote marks again!) are made all the time—including, most recently, that teen girl I mentioned above. Her name is Jahi McMath, a thirteen-year-old from Oakland who suffered complications following a tonsillectomy. After physicians declared her brain dead, the hospital demanded that her plug be pulled. However, once the family finally got her home, and began providing care denied her in the hospital, she started to make definite progress—and was apparently very much alive.
We can’t blame the hospital, for Jahi was “technically” dead according to the experts—and hospitals are meant for living people, not dead ones. Instead, one can legitimately ask: Did the docs who declared her dead make a mistake? Or was she miraculously revived? Who knows?
In any case, situations like Jahi’s—and there are plenty of them—throw the whole brain death idea into doubt, and personal aesthetic experience only further undermines trust in the abstract assertions of diagnosing authorities. As a former oncology and hospice nurse, I’ve been around a lot of dead bodies, and I can assure you that there’s no question they’re dead. Cold and stiff, corpses can make you uncomfortable, but you don’t wonder if they’re “slightly alive.”
“Brain dead” bodies, though, aren’t like that, and consequently defining death has become quite controversial. The Magisterium understandably defers to the medical community when it comes to defining death, but that itself is the crux of the problem: The medical community is divided on the issue, as are Catholic bioethicists and moral theologians.
Nobody doubts the old death criteria—lack of heartbeat, lack of spontaneous breathing, and other obvious, objective signs—but there’s a worrisome lack of unanimity among Catholic authorities when it comes to brain death. There are physicians and ethicists adamantly opposed to the criteria, and plenty who are just as adamant in affirming them. Yet if there’s such a pronounced lack of unanimity on such a critically important subject, shouldn’t we err on the side of caution? Shouldn’t we err on the side of life when considering a brain death declaration, especially when our senses tell us that life continues? Arguments about integrative function and proportionality aside, it seems more aesthetically fitting, more seemly—more humane, even—to allow a brain-injured person to die utterly and completely (i.e., no heartbeat, no mechanical breathing) before proceeding with grieving and whatever else comes next.
So, back to the aesthetics of climate change for a moment. Our senses tell us that global warming is a farce—we feel like it just can’t be true—and we bring that sensory experience to bear when we examine the evidence, pro and con. I imagine even diehard environmentalists can see the humor in talk of global warming when the mercury stubbornly hovers around zero—it’s funny because they’re out in it themselves! But the preponderance of scientific evidence does seem to come down on the side of climate change—at least, that’s what we are told.
That being said, what I do about it—whether I heed the warnings and make adjustments to my driving and other habits to reduce my carbon footprint—is another thing altogether, regardless of what I might conclude regarding the merits of the climate change case. Global warming, as a theory, has significance for researchers, political leaders, and policy makers, but not for average schlubs like me. It’s just fodder for jokes when I’m revving up the snowblower. Even assuming global warming—and our contribution to it as fossil-fuel consuming humans—my actions today won’t have much effect on my world tomorrow. In the aggregate and over time, yes. But as an individual, right now? No. It’s cold and I have to get to work, so I’ll drive, global warming or no global warming—until some bureaucrat decides I can’t anymore.
But I’m reluctant to take brain death so lightly. The aesthetics are too disturbing—and the “mistakes” too numerous—for average folks to leave it to the experts and authorities. Let me put it this way: When my teens get their driver’s licenses, there’s always that awkward moment when they’re asked for the first time about organ donation preference. They’re old enough to decide for themselves, but if they ask for my input, I tell them to skip it.
Yes, we should be generous, I tell them; yes, John Paul II instructed us to be unselfish with regard to organ and tissue donation. But most vital organs can only be transplanted from “brain dead” donors who are being kept “alive” on “life support”—qualifications which seem to fly in the face of Pope Benedict’s insistence that “individual vital organs cannot be extracted except ex cadavere.” The Latin is left untranslated in the original, as if to underscore that there should be no question whatsoever that a human body is really dead before its vital organs are removed. He also spells out the implications for brain death declarations:
In an area such as this, in fact, there cannot be the slightest suspicion of arbitration and where certainty has not been attained the principle of precaution must prevail. (Emphasis added.)
With so much up in the air about this stuff—so much confusion regarding how death is defined and how those definitions are applied—I’ll avoid volunteering as a whole organ donor as long as they’ll let me.
In other words, short of a definitive declaration from the Magisterium, it’ll be a cold day in hell before I encourage my children to be vital organ donors. And that’s climate change we’re unlikely to see for a long, long time.
Editor’s note: This essay first appeared January 26, 2014 on the author’s blog “One Thousand Words a Week” and is reprinted with permission.