Much of the public debate about the coronavirus pandemic has centered on the issue of data — How much data do we have? How reliable are the data? How compatible are competing sets of data? Answering these questions is important not only for grasping the nature and scope of the pandemic but also for assessing our government’s response to it. Put simply, data are vital not only to understand the virus scientifically but also to make informed decisions about how to act.

Yet there lurks a distinct, and much more debatable, belief about the role that data can and should play in science-based policy. Some have suggested that the data we have are too uncertain to justify our government’s response, or that our response should be based on risk assessment rather than uncertainty. Such arguments rightly call attention to the challenges and limitations experts face as they endeavor to understand SARS-CoV-2 and to formulate policy recommendations. Beyond this, though, such criticisms also seem to assume that data, if sufficient in volume and quality, can provide us with certainty, thus warranting action — be it imposing widespread social distancing measures, banning international travel, or lifting mitigation strategies.

But is certainty the right stick by which to measure our scientific knowledge — or our actions?

The importance we impute to certainty in scientific knowledge is a legacy of René Descartes, the founder of modern philosophy. According to him, scientific knowledge rests on indubitable first principles — clear and distinct ideas, he called them. To apprehend such principles, it is necessary first to purge our beliefs of anything and everything that is not certain. (This is the motivation behind Descartes’ famous method of skepticism, which ends only in the alleged certainty of the cogito: “I think, therefore I am.”) Once we have obtained such principles, we may deduce conclusions from them with logical rigor. The resulting knowledge is thus said to be certain because it rests on unshakeable foundations.

According to Descartes, this scientific method may be contrasted with the realm of practice. Here certainty is rarely if ever possible, and so it is often warranted, even necessary, to act without it. Imagine, says Descartes, that you’re lost in the woods without a map or compass (and you can’t see the stars). You don’t know which direction will lead you out. Since you have no way of knowing for sure which is the correct direction, the best course of action is to pick one, even if only arbitrarily, and begin walking. The important thing is to hew to the path once selected. Even though you may not have chosen the correct or the shortest route, better to stick with it than second-guess yourself or veer from the path — and risk wandering around in circles — or stand still, paralyzed by doubt, and remain in the forest forever. “In the same way,” Descartes concludes, “since in action it frequently happens that no delay is permissible, it is very certain that, when it is not in our power to determine what is true, we ought to act according to what is most probable.”

Few adhere to all the particulars of Descartes’ philosophy today. But his insistence that certainty is a prerequisite for knowledge persists in popular portrayals of science and in assumptions about the role that it plays in our public life. The dangers of this Cartesian legacy were emphasized in the last century by the Austrian philosopher Otto Neurath.

According to Neurath, Descartes’ error was to draw too sharp a contrast between theoretical knowledge and practical action. In science no less than in practice, we are often like Descartes’ lost wanderers, forced to begin from premises that are at best probable and to appeal to data that admit various interpretations. Or, to use Neurath’s famous analogy, “we are like sailors who have to rebuild their ship on the open sea, without ever being able to dismantle it in dry-dock and reconstruct it from its best components.” Scientific theories — even those of the “exact” sciences such as physics — are vindicated by their overall success, not by the indubitability of their first principles or base-level observations.

How, then, are we to draw conclusions? Neurath’s answer was what he called “auxiliary motives” — motives other than, and in addition to, pure logic or pure observation. Considerations of simplicity (think Ockham’s razor), elegance, efficacy, and even, Neurath believed, those of ethics or politics may be permissible, even necessary, to help us make sense of and build on empirical data. It may be psychologically comforting to believe there is a simple algorithm for generating scientific certainty — and thus for sharply separating thought from action and data from judgment. But, Neurath maintained, such a belief is a mark of “pseudo-rationality.” Better to be open and honest about the many and varied motivations that go into scientific inquiry and the corrigibility of the conclusions that follow.

Whether or not you accept Neurath’s overall picture of scientific knowledge, it is clear enough that when it comes to epidemiology and related sciences — and, a fortiori, policy choices based on them — we are much closer to Descartes’ lost wanderers than to his ideal of deduction from certain first principles. In this messy intermediate realm between pure theory and practice, it is often the case that “uncertainty is the only certainty there is,” as mathematician John Allen Paulos recently put it. If so, it would be unwise to seek certainty before deciding if and how to act. Reality will not wait around for us to acquire it.

This is not an excuse for rash action, nor, necessarily, an argument for acting sooner rather than later. The point, rather, is that however we choose to act — whether or how to impose social distancing measures, say, or if and when to lift them — we are doing so under conditions of uncertainty. But that is not the same thing as saying we are acting arbitrarily or without knowledge. Data are essential to increase our knowledge and to inform our decisions. This is why it is paramount that our data be as bountiful and reliable as possible, and that we be clear, open, and honest about our margins of error. But no amount of data, no matter how good, will totally vanquish uncertainty.

Data are necessary, in other words, but never sufficient. Judgments — including ethical and political judgments — must inevitably enter into our decisions, even, perhaps especially, when they are informed by scientific data. When it comes to the coronavirus pandemic, for example, data are obviously crucial for determining whether and how to weigh public health risks against economic consequences, as well as for estimating what those risks are. But how could data ever settle these issues? Of necessity, such decisions must appeal to normative values that exceed what is given empirically. Consider that the very concept of a public health threat rests on moral and political determinations — value judgments — about health, safety, the public interest, and human life.

Data can never take the place of judgment. And while judging well requires data, to be sure, it depends above all on prudence. When the dust of this pandemic settles, it will not be our policymakers’ data that will be scrutinized so much as their judgments about what to do in light of the data.
