With every crisis, whether a pandemic, a natural disaster, or a manmade catastrophe, politicians and pundits invariably get around to calling for a Manhattan Project. This powerful imagery of putting “big science” to work on big challenges comes, of course, from the success of that World War II effort and, later, of its Cold War derivative, the Apollo moonshot program. The legacy of those organized research enterprises is the very idea that government can marshal scientific and financial resources to solve major societal problems.

Today’s battle against the novel coronavirus has been explicitly analogized to the effort required to win World War II, both in terms of the sacrifices required of the citizenry and what it will take, scientifically and technologically, to defeat the enemy. And while there are many obvious differences between World War II and our current fight against SARS-CoV-2, there are nevertheless striking parallels, especially when it comes to the government’s role in leveraging science and technology to address a large-scale threat. Compare, for example, the technical solutions imagined for widespread and effective surveillance and detection of viral threats to the use of radar during World War II to surveil the skies, oceans, and battlefields to detect and track the enemy. Both require the development and deployment of portable, distributed, and inexpensive tools to the front lines. Similarly, the imperative to invent a vaccine as rapidly as possible is not dissimilar from the effort at Los Alamos: Both require rapidly transmuting existing scientific knowledge into highly specific and practical technologies.

Yet policymakers and pundits too often draw the wrong lesson from World War II: namely, that government can simply order up scientific knowledge and direct it to solve practical problems. What the history of World War II suggests, rather, is that the government was able to exploit scientific knowledge during the war thanks to a “backlog” of discoveries accumulated beforehand, as Vannevar Bush, President Roosevelt’s science adviser, put it. In a similar way, short-term solutions to the technical problems we now face, from widely distributed virus-testing machines to a vaccine, will likely flow from existing scientific paradigms such as molecular biology and genomics. A future “paradigm shift” in virology — on a par, say, with the development of DNA sequencing or the polymerase chain reaction, or the discovery of the gene itself — will emerge from scientific knowledge we don’t yet have, from progress in basic research, to use a term that Bush helped popularize.

History’s long and complex pattern of “basic” scientific research leading, not necessarily in a straight line, to technological breakthroughs is dramatically illustrated by World War II’s three most iconic inventions: radar, the bomb, and the computer.

There is no question that government research, both here and in the U.K., led to the invention of radar (short for “radio detection and ranging”). The U.S. government poured $1.5 billion — three-quarters of what it spent on the Manhattan Project — into the Radiation Laboratory at M.I.T. The “Rad Lab” directly employed nearly four thousand people and produced an array of radar technologies that were deployed to the battlefield to help with navigation, bombing, and aircraft and submarine detection, among other things. But radar would have been inconceivable without prior basic research. Beginning in the 19th century, physicists such as Michael Faraday and James Clerk Maxwell developed the theory of electromagnetic waves. Then, in the 1880s, Heinrich Hertz provided experimental confirmation of the controversial theory. Not long after, Guglielmo Marconi demonstrated that Hertzian waves — or radio waves, as we now call them — could be transmitted across great distances. Without these prior discoveries, British and American researchers could not have figured out how to use radio waves to locate and track distant objects — a technique that some historians say won the war, while the bomb ended it.

Similarly, the Manhattan Project, of course, led to the invention of the atom bomb. But no one could have hit on the idea without the earlier theoretical developments of nuclear physics, which grew out of very basic research in the burgeoning field of quantum physics. At the turn of the century, scientists were still debating the existence of atoms, before Albert Einstein’s 1905 papers on Brownian motion and “light quanta” turned the tide toward atomism. Some 30 years later, Otto Hahn and Fritz Strassmann, building on experiments by Enrico Fermi, discovered that the nucleus of uranium could be split, releasing an enormous quantity of energy, a result that the often overlooked Lise Meitner, together with Otto Frisch, explained as nuclear fission.

Like the bomb and radar, the first electronic computers, England’s Colossus and, shortly afterwards, America’s ENIAC, came from government projects spurred by the war. But, once again, the underlying theoretical discoveries were much older and were not driven by practical goals. A key insight came in the 1840s from English mathematician George Boole, who formulated what we now call a two-valued algebra. Although the potential practical applications of logic were hardly unknown at the time, Boole was not seeking to invent anything but rather to describe what he called the “mathematics of the human intellect.” His reflections on the interrelation between mathematics and logic began when he was an adolescent, daydreaming while working as an usher at a boarding school in England. He returned to these questions years later, spurred by a public controversy between two logicians over an arcane technical problem. Nearly a century later, the American engineer Claude Shannon — sometimes called the “father of the information age” — realized that Boolean algebra could be applied to something as practical as electrical relays and switches, laying the groundwork for modern computing.

Bush had all of that in mind, and more, when in 1945 he made the case for the practical import of “basic science” in what is perhaps the most influential science policy document of modern times, “Science — The Endless Frontier.” In it, Bush argued that the government should continue its generous funding of scientific research during peacetime, but with this important caveat: “We must remove the rigid controls which we have had to impose, and recover freedom of inquiry and that healthy competitive scientific spirit so necessary for expansion of the frontiers of scientific knowledge.” He wanted to be sure the government supported basic science, not just applied research and development — precisely to enable future technological revolutions.

Bush knew firsthand the often unpredictable ways in which basic research could prove enormously useful. Not only was he an engineer and an entrepreneur; he was also the founding director of the wartime agency that oversaw all federal research, including the Manhattan Project and M.I.T.’s radar research. And as Shannon’s graduate adviser at M.I.T., and himself the inventor of a mechanical computing device, Bush was well aware of the interplay between mathematical logic and technology. The history of modern science and technology is, in fact, replete with such examples, from Gregor Mendel’s discovery of genetic inheritance and Einstein’s (Nobel Prize-winning) photoelectric effect to Watson and Crick’s double helix — without which techniques such as medical imaging, CRISPR, or reverse transcription polymerase chain reaction (currently used to test for SARS-CoV-2) would be inconceivable. What Bush recognized was that, for the most part, such scientific discoveries do not come from directed or applied research of the sort the government engaged in during wartime but rather from undirected basic research.

Bush’s proposal was controversial in its day. In fact, his report was a counter to a rival proposal then gaining steam on Capitol Hill. Fearing industry control of science, Senator Harley Kilgore (D., W.Va.) had called for a National Science Foundation that would not only fund research but also direct it toward specific, socially desirable ends. Though a Republican and a businessman (he helped found the Raytheon Corporation), Bush actually shared Kilgore’s concern about private industry co-opting science. He pointed out that private industry is “generally inhibited by preconceived goals” and the “constant pressure of commercial necessity.” Nor did he deny a role for government. He believed basic science is a public good and hence requires and deserves public support.

But “support” is not the same thing as “control.” In “Endless Frontier,” he called for a single federal agency, a National Research Foundation, that would fund all basic research but leave the direction of research entirely to the “centers of basic science” — i.e., mainly colleges, universities, and other non-governmental research institutions. He believed that government-directed research had its place, in particular for military purposes. But the vast majority of scientific research should, he proposed, be conducted by non-governmental institutions supported by federal dollars.

Bush won the battle, if not the war. After several legislative proposals, revisions, and a presidential veto, Truman signed a bill in 1950 creating the National Science Foundation. This was a victory for Bush’s vision of government-funded basic science (though the new agency adopted Kilgore’s nomenclature). But rather than a consolidation of federal funding for unfettered basic research outside government, which Bush had lobbied for, the post-war years saw the growth and creation of a host of other government research organizations, including the national laboratories, the National Institutes of Health, the Atomic Energy Commission, NASA, and many within the Defense Department.

Today, while the government continues to pour billions of dollars into research, federal science policy bears little resemblance to the self-governing “republic of science” that Bush envisioned. The vast majority of R&D funding now comes from the private sector, a complete inversion of the public–private spending ratio in the early decades following the war. And the vast majority of that funding is directed toward commercial uses. Even the government, which remains the biggest patron of basic science, now spends far more on applied research and development than on basic research.

If Vannevar Bush was right about America’s technological successes during World War II, recent R&D trends do not bode well for American science and innovation. In the aftermath of the coronavirus crisis, policymakers may well be more receptive to the idea that government should stimulate scientific research. Yet history indicates that, counterintuitively, if we want more practical “moonshots,” we should make more long-term investments in undirected research in basic science, precisely as Bush suggested 75 years ago. That may be a tougher sell in the post-coronavirus Congress, but it may be necessary to advance science and thus to help us build better weapons to fight future viral enemies.
