In this hyper-polarized political year, a consensus has quietly emerged inside the Beltway. Fears that the American “innovation engine” is stalling — potentially leaving China to gain competitive advantage — have translated into bipartisan efforts to reignite the engine with big federal spending.

Unfortunately, these policy proposals almost invariably focus on technologies of the moment to the exclusion of long-term scientific research. In so doing, they overlook the crucial and characteristically unpredictable ways in which science contributes to technological innovation. By ignoring science’s role, such policies fail to address one of the biggest weaknesses of our current federal R&D regime: a bias toward applied research and development — technology — over basic science.

Last spring, a bipartisan and bicameral group of lawmakers unveiled the Endless Frontier Act, which would invest $100 billion in emerging technology. The bill, sponsored by Sens. Todd Young (R-Ind.) and Chuck Schumer (D-N.Y.) as well as Reps. Ro Khanna (D-Calif.) and Mike Gallagher (R-Wis.), would create a new directorate within the National Science Foundation, which primarily supports basic science, to fund research in ten “focus areas” of emerging technology — from artificial intelligence and quantum computing to “advanced communications technology.” Funds would be distributed geographically across the country to universities, creating “tech hubs” to help the United States maintain its technological edge, and giving a boost to regions beyond the gilded coasts.

The Trump administration has also joined the fray. The White House recently announced a new $1 billion initiative, led by the National Science Foundation and the Department of Energy, to “ensure American leadership in the industries of the future.” The plan establishes twelve “research and development (R&D) institutes” to serve as “national R&D hubs” for “critical industries of the future.” These include the usual suspects: artificial intelligence, quantum information science and 5G communications.

These policies draw support from a growing body of research highlighting the federal government’s role in America’s technological preeminence during the 20th century. For instance, in their book “Jump-Starting America,” Jonathan Gruber and Simon Johnson point to massive public investments in science and technology during World War II. They argue that the post-war economic boom was driven in part by technological advances in precisely those domains funded by the government during the war. They conclude that similarly large public investments today would “jump-start” American innovation.

It is true that the federal government funded science and technology at unprecedented levels beginning with World War II. It is also true that these large-scale research enterprises produced, sometimes in astonishingly short order, some of the century’s most awesome technologies, including radar, the computer and the atom bomb. And these technologies contributed to the post-war economic boom. In this sense, federal R&D helped spur economic growth in the mid-20th century. Yet, attempts to model current R&D policies on those of World War II overlook the often long, winding and unpredictable pathways by which scientific discovery leads to technological invention.

Consider that the major inventions associated with World War II depended not only on federal dollars but also on countless theoretical discoveries made long before the war. There could have been no atom bomb without the discovery of the atom’s structure decades prior; no radio detection and ranging (radar) without the discovery of radio waves in the second half of the 19th century; and no digital computing without George Boole’s formalization of binary logic in 1847 (to name only a few). What’s more, many of these discoveries were made with little, if any, awareness of possible practical applications, or at least of the particular applications they would eventually enable.

In other words, it was not simply government investment in these technology areas that spurred innovation. The government’s organized research efforts during the war would not have been successful without deep reservoirs of scientific knowledge accumulated over decades. (This is not to ignore the many important technological breakthroughs prior to the war, nor that technology can also spur scientific discovery.) Ironically, this was one of the central insights of Vannevar Bush, the man who oversaw the federal government’s wartime research and wrote the influential report, “Science — The Endless Frontier,” from which the Schumer-Young bill takes its name.

The same pattern holds today. For instance, our ability to rapidly sequence the genome of a novel coronavirus, or to develop diagnostic tests using reverse transcription polymerase chain reaction (RT-PCR), flows from decades of prior scientific research. Similarly, as Kelvin Droegemeier, director of the White House Office of Science and Technology Policy, observed in a recent op-ed, “rapid development of vaccines and promising therapeutics, such as remdesivir, are possible because thousands of scientific trailblazers in the fields of polio, malaria, and HIV advanced our basic knowledge about the immune system and the molecular biology of viruses.”

Of course, not all scientific discoveries lead to technological breakthroughs. And it is usually difficult or impossible to predict which ones will. Those that do — like the atomic model developed by Ernest Rutherford and Niels Bohr in 1913 or the 1956 discovery of DNA polymerase (without which we couldn’t use RT-PCR to test for SARS-CoV-2 today) — can take decades to bear technological fruit. That’s why, historically, the government rather than the private sector has taken the lead in funding basic science.

Unfortunately, U.S. R&D has become increasingly biased against basic science over the last few decades. This is partly because the private sector — which funds vastly more applied research and development than basic science — has overtaken the public sector as the largest funder of R&D. But even within the public sector, applied research and development comprises a larger and larger share of research dollars. The result is that, for all our talk about science today, we are massively underinvesting in scientific research.

Policy proposals calling for more R&D spending are right to highlight the importance of federal funding of science. Historically, however, our country’s ability to develop and deploy innovative technologies, as in World War II and the ensuing decades, resulted not just from federal funding but also from progress in our basic understanding of nature. It is here that government should focus its efforts, rather than simply spending more money on technology. To ensure America’s scientific and technological preeminence — and to better prepare for the next public health (or other) emergency — lawmakers should prioritize basic science.