World War II shapes how we think about science today more than any other historical event of the last century, except perhaps the Moon landing. Hiroshima and Nagasaki in particular stand as images — both awe-inspiring and horrific — of the raw power of scientific discovery. They have come to illustrate the profound stakes of the partnership between science and technology, and between science and government. With our growing scientific knowledge of nature, what technologies should we focus on developing? And what is the role of government in deciding this question — in regulating scientific research and directing it toward particular ends? The Manhattan Project and its direct descendant, the Apollo Program, are the clearest examples of large-scale organized research, and of the government’s effectiveness in funding it and even steering it toward technological application — a collection of practices that historians have come to call “Big Science.”

Today, policymakers commonly call for “Manhattan projects” or “moonshots” to conquer major societal and technological challenges, from cancer to climate change. It has become part of the legacy of Big Science that the public image of scientists, as historian Clarence G. Lasby put it, “has generally been that of ‘miracle workers,’” a “prestigious image [that] has been translated into heightened political power and representation at the highest levels in government.” A number of iconic technologies were invented during World War II — including not just the atomic bomb but radar and the computer — in part owing to research sponsored and initiated by the government. Thus, many now see these wartime inventions as offering this lesson: To accomplish great technological feats, we need government not only to fund research but also to direct research programs toward practical goals. Furthermore, so the argument goes, we must not waste funds on undirected, curiosity-driven science; its results are too often unpredictable, unusable, and even unreliable.

As we shall see, this is flawed historical reasoning. The bomb, radar, and the computer — to focus on only three of many examples — were made possible by a web of theoretical and technical developments that not only predated the war by years or decades but also originated, for the most part, outside the scope of goal-directed research, whether government-sponsored or otherwise. The scientific insights that enabled the technological breakthroughs associated with World War II emerged not from practical goals but from curiosity-driven inquiry, in which serendipity sometimes played a decisive role….

