Has America been taking its technological supremacy for granted? Should policymakers reverse the trend of declining public support for research and development? If so, should basic science or applied R&D be given greater priority going forward? Tony Mills recently joined Political Economy to discuss.

Tony is the director of the R Street Institute’s science policy program. He was previously the editor of RealClearPolicy. He and Mark Mills recently published an excellent article in The New Atlantis titled “The Science Before the War.”

Below is an abbreviated transcript of our conversation. You can read our full discussion here. You can also subscribe to my podcast on iTunes or Stitcher, or download the podcast on Ricochet.

Pethokoukis: Your essay describes America’s R&D efforts during World War II. And during this pandemic, there’s a lot of interest in initiating a new “Manhattan Project” to develop therapeutics and a vaccine as quickly as possible. Is this World War II analogy a helpful one for current times?

Mills: In one sense, yes. World War II was the first time that the federal government became a large-scale investor in — and a producer of — scientific research. World War II research efforts were the origin of “big science,” setting an important precedent that the Apollo program and other large-scale government efforts in scientific research would follow.

But there is a more problematic way that this trope gets used, and that’s the idea that the government can “order up” scientific knowledge and direct it to solve practical problems. In a certain sense, the government clearly can do that, and has. But we need to look at the earlier history and recognize the scientific discoveries that made those more practically oriented research projects possible.

One reason those efforts were successful is that they weren’t just starting from nothing. They were drawing upon what was already a fairly large reservoir of scientific knowledge, right?

That’s right. At mid-century, the view promoted by Vannevar Bush — FDR’s science advisor — was that basic science was needed as a kind of reservoir for technological invention. It is really striking that the three iconic WWII-era inventions — the atom bomb, the computer, and radar — each depended on highly theoretical discoveries going back to the 1800s or further.

For instance, George Boole’s theoretical work in the 1840s — trying to describe the “mathematics of the human intellect” — could be applied to electrical relay circuits, as Claude Shannon figured out almost 100 years later. That insight laid the foundation for modern digital computing. People had tried to develop computing machines for many years. But the key insight wasn’t driven by any attempt to develop something practical.

There’s a similar story with radar. You can’t develop the technique for tracking objects using radio waves unless you know these radio waves exist. The atom bomb is similar: You can’t split the atom unless you know that there are atoms.

I think we’ve kind of lost our sense of how foundational basic scientific discoveries can be. We tend to think of basic science as just long-term research. But it really opens up a broader path.

Bringing this back to the coronavirus: modern virology uses gene sequencing to develop vaccines. But genes had to be discovered first for this to be possible.

What was Vannevar Bush’s vision for what government-funded science would look like after the war?

Bush believed that the government was able to do these incredible technological feats in wartime because of a strong backlog of scientific discovery. He proposed that the government stimulate scientific research while allowing scientists to set their own research agendas rather than pursuing practical objectives.

The alternative proposal at the time maintained that the government should steer this money toward practical objectives. The end result of this debate was the creation of the National Science Foundation. But by the time the NSF was created, a host of other federal agencies were spending far more on applied research and development than on basic science. So Bush didn’t quite get what he was hoping for, and we’ve moved even further from his vision since then.

So what sort of scientific research does the government do, and who is doing it? And how would you describe the nature of that research?

Federal agencies fund basic research, applied research, and product development. The general trend is toward more applied research and development. The private sector now performs the majority of US R&D, and the government’s own R&D portfolio has a growing emphasis on application.

One of the lessons of World War II is that this might be cause for some worry. The technological development of the postwar period was, I think, at least partly the result of a great deal of basic research.

And the federal government is no longer the biggest single source of funding — even for basic research, it provides only about 40 percent of overall US spending.

How much should we be spending on R&D?

I don’t think there’s any magic number. It’s more about the downward trends over time, as well as other causes for concern. Scientific organizations say that they don’t have enough money, for instance. And there are other indicators that our progress in science is slowing down. A lot of folks are worried that there’s a stagnation happening in physics, and the situation in the life sciences is not all that different. And many of the key breakthroughs in these areas took place at a time when there was substantially more federal support.

If you look at all these trends together, you see that we’re not getting as much out of scientific research as we used to, and we’re not putting as much in. This may partly explain some of the productivity problems we have generally when it comes to innovation.
