Few archetypes have as much hold on the global imagination as that of the ingenious Yankee. This paradigm of the American inventor has been with us from the nation’s earliest days. Not only did America’s founders insist on a patent system to “promote the Progress of Science and useful Arts,” but several of them — including Benjamin Franklin and Thomas Jefferson — were themselves inventors. The United States also birthed many of the innovations associated with the first Industrial Revolution of the early 19th century, including the cotton gin, the commercially viable steamboat, and the widespread use of interchangeable parts, which became the basis of true manufacturing.

That tradition of American innovation has continued with the airplane, affordable mass-produced automobiles, communications satellites, handheld mobile phones, and the lion’s share of the internet’s key technologies. In the second half of the 20th century, Americans dominated the Nobel Prizes in the hard sciences and played important roles in key scientific breakthroughs, including building the standard model of particle physics, understanding the structure of DNA, and mapping the human genome. Entrepreneurs from every part of the country have built an economy in which the most dynamic and profitable sectors depend upon technological advances — jet planes, artificial intelligence, software, smartphones, and near-miraculous medicines — that Americans then export to the rest of the world.

The story of American success in science, technology, and invention, however, is not simply one of continuous progress, and it cannot be solely attributed to mystical cultural values, luck, or unbridled free markets. America’s technological and scientific success stems from an “innovation ecosystem” that balances a widespread license to invent with an assertive but limited role for government. In the 20th century, this ecosystem came to be defined by what Adam Thierer of the Mercatus Center calls “permissionless innovation,” as well as a strong and multifaceted university system and ample public support for basic science.

But there are worrying signs that this ecosystem is deteriorating and that American progress in science and technology is slowing. To increase the rate of innovation in America, we must return to the fundamentals that made the innovation ecosystem work.

PERMISSIONLESS INNOVATION

Though it was home to many inventors, early America was not a hotbed of scientific discovery. Many of the best-known men of science in the country’s early history — Jefferson, Franklin, and Benjamin Banneker, known as America’s “first black man of science” — were partly or wholly self-taught geniuses who came up with practical inventions, refined technologies, and popularized principles. But they made only a handful of discoveries. In fact, nearly every important advance in scientific knowledge during the 18th century and most of the 19th century — electromagnetism, evolution, radiation, cell biology, germ theory — originated in Europe.

But the ingenious Yankee is not a myth. Americans did innovate, and both geography and cultural mixing played major roles in early America’s inventions. The frontier — both an idea and a reality in American life, at least until the U.S. Census Bureau famously declared it closed in 1890 — may have necessitated and inspired inventions that simply had not been needed as urgently before or elsewhere. Eli Whitney’s cotton gin, the first invention mentioned in most American history textbooks, enabled the production of short-staple cotton, which could not grow in most of Europe’s climate. And as the late historian Thomas Hughes explains in American Genesis: A Century of Invention and Technological Enthusiasm, 1870-1970, the end of slave labor in the South, combined with westward expansion, spurred the need for industrial innovation.

This paved the way for a generation of mostly self-educated American inventors motivated by profit and a desire to innovate. Unlike many countries in Europe, America also had the advantages of a high rate of immigration and robust cultural exchange. Throughout most of the 18th and 19th centuries, people from Europe, Asia, and (through the horrors of the slave trade) Africa arrived in the United States, bringing with them local cultural knowledge that did not spread elsewhere.

The most important factor contributing to America’s innovative success, however, is somewhat underappreciated by scholars of technology: permissionless innovation. The fact that new inventions often disrupted existing orders, and sometimes even resulted in bona fide negative externalities, rarely concerned policymakers in the 19th century. That is to say, the U.S. government didn’t regulate innovation pre-emptively. Samuel Slater, an early-19th-century textile magnate who memorized the designs of English mills, and Hannah Wilkinson, his wife and also an inventor, did not have to acquire permission from the U.S. government to build their own mills in America or deal with a guild that would stop them from replicating British mill designs. Likewise, Slater was able to dam rivers and, in some cases, create whole new villages because nothing in the law stopped him from doing so. Later in the century, Thomas Edison, America’s most famous inventor, benefited from a legal environment similarly conducive to experimentation and innovation: He could patent innovations but did not have to seek permission to develop them.

Permissionless innovation did not mean anarchy. In fact, it was possible in part because it existed within a legal framework that rewarded innovation. America’s patent system required disclosure of inventions in return for limited monopoly rights. This had the effect of expanding knowledge rather than allowing it to remain the property of certain guilds. In the process, it contributed to what economic historian B. Zorina Khan calls the “democratization of invention”: the “offering [of] secure property rights to true inventors, regardless of age, color, marital status, gender, or economic standing.”

Nor did permissionless innovation mean externalities went entirely unheeded. Eventually, Slater's mills were regulated in several ways, as were the fruits of Edison's electrical inventions, along with the railroad and telecommunications industries. But the initial innovations that sparked these new sectors of the economy took place without "permission." Regulation, for the most part, remained a step behind technology; laws to regulate new technologies were implemented only after externalities became apparent. America's permissive regulatory regime helped it become one of the most innovative countries in the world for its first century or so, even though it was decidedly inferior (in all but a few cases) to Europe when it came to producing cutting-edge science.

THE UNIVERSITY SYSTEM

This would change, however, with the founding of research-focused American universities in the second half of the 19th century. There had been colleges in America since colonial days, of course, but it was not until that period that modern American universities emerged in full force.

American universities originated from four main sources. Some, such as Harvard, Princeton, and Yale, began as prominent private colleges founded to educate clergy. These schools — along with newer private colleges founded in the 19th century, such as Stanford, Duke, the University of Chicago, and many Catholic colleges — eventually expanded their missions to include science and engineering. The sciences were often separated institutionally from engineering and were included in the liberal arts, along with theology and classics. Other universities, such as the Massachusetts Institute of Technology and Johns Hopkins University, were founded on the model of European polytechnic universities and “Humboldtian” research universities (named after Wilhelm von Humboldt, founder of the Humboldt University of Berlin). These research universities combined the study of the arts and sciences with the specific goal — in the words of Daniel Coit Gilman, Hopkins’s first president — of “the encouragement of research.”

Around the same time, major public universities (and one private university, Cornell) were founded under the Land-Grant College Act, or Morrill Act, of 1862 — one of Abraham Lincoln's key campaign promises and a central part of his plan to unify the country around a system of skilled, free labor. Obligated to have departments of agriculture and mechanical engineering as well as programs for military training, most of these schools were more pragmatic in their offerings than the universities that arose out of private colleges. Finally, there were teachers' colleges or "normal schools," such as the institution now called Illinois State University, which typically admitted both men and women and over time became major universities in their own right.

Although many of these institutions were elitist and exclusionary — educating only a small percentage of American adults, almost all of them white and disproportionately male — the overall system of American higher education was strikingly diverse and multifaceted in several important respects. These institutions supported an enormous range of knowledge, from forestry, irrigation, and hotel management to military engineering, mathematics, and medieval literature. There was also great diversity among the institutions. There were massive land-grant universities and small liberal-arts colleges, religious schools with a conservative bent, and institutions dedicated to secular learning and progressivism. Some universities, such as Johns Hopkins and Caltech, emphasized graduate research, while others such as Williams College focused almost exclusively on undergraduate education. The result was a plethora of places and contexts in which Americans could pursue knowledge.

This diverse system of higher education arrived on the American scene at just the right time to make a major difference in the fields of science and technology. The end of the 19th century and beginning of the 20th witnessed the second Scientific Revolution, including what historian Edwin Layton has called the "scientific revolution in technology." Breakthroughs in black-body radiation, spectroscopy, interferometry, thermodynamics, statistical mechanics, speed-of-light measurements, and physical chemistry — to which American scientists Albert Michelson, Josiah Willard Gibbs, Henry Rowland, and others made important contributions — laid the groundwork for quantum and relativistic physics, as well as modern chemistry. These new branches of science depended on increasingly complex forms of mathematics and probed domains ever more distant from immediate sensory experience. This spurred the need for a class of professional scientists, which the universities trained.

These scientific advances were made possible in part by new technologies that expanded scientists’ powers of observation. Meanwhile, technology itself was becoming more professionalized and scientific. As Layton puts it,

The technological community, which in 1800 had been a craft affair but little changed since the middle ages, was reconstructed as a mirror-image twin of the scientific community…. For the oral traditions passed from master to apprentice, the new technologist substituted a college education, a professional organization, and a technical literature patterned on those of science.

It was not long before industry began recruiting professional scientists and engineers — who had been educated at universities like Caltech, MIT, and Johns Hopkins — to staff newly created commercial laboratories, such as Edison’s General Electric Labs and those at Westinghouse and DuPont. Following commercial laboratories in Germany, these dedicated internal departments of industrial research exploited the new sciences of electricity and physical chemistry, and applied the new techniques of modern engineering to solve practical problems and develop products.

Science and technology, though separate communities of inquiry with distinct professional and educational institutions, had become interpenetrating and mutually reinforcing enterprises, supported and sustained by America’s universities. Yet while the utility of science and engineering was becoming more apparent to industry, they both remained largely private affairs throughout the 19th century.

GOVERNMENT SUPPORT

World Wars I and II marked a major shift in the government’s relationship to science and technology. After the sinking of the Lusitania, President Woodrow Wilson formed the Naval Consulting Board (led by Edison) and the National Research Council (led by American astronomer George Ellery Hale) to marshal the country’s scientific and technological resources in service of military preparedness and defense. This was not the first time the federal government had called on scientific and technological expertise (the National Academy of Sciences was created by President Lincoln during the Civil War), but science and technology played a larger and more devastating role in the First World War than ever before.

Hale saw an opportunity in the government’s newfound interest in science. By war’s end, he had secured an executive order from Wilson that guaranteed the National Research Council would outlive the war; this established a permanent place for science in the public square to secure America’s technological advantage. Hale’s arguments for the necessity of government support for science prefigured those made by another pre-eminent American man of science: Vannevar Bush.

Like Hale, Bush had the ear of a president and had led a wartime council, the purpose of which was to put science in service of national defense. He would go on to make what is perhaps the best-known argument for the practical benefits — even necessity — of scientific research. Bush’s report, Science: The Endless Frontier, was commissioned by President Franklin Roosevelt and delivered to President Harry Truman.

Bush, too, sought to leverage government interest in using scientific expertise for the war effort into sustained support for science during peacetime. But he went beyond Hale, unabashedly advocating an assertive role for the public sector. Bush saw a particularly prominent role for government in promoting “basic research” — a term he helped popularize — which “results in general knowledge and an understanding of nature and its laws.” This kind of research, he wrote,

creates the fund from which the practical applications of knowledge must be drawn. New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science.

This description of the relationship between science and technology is often taken to be the source of the so-called “linear model” of innovation, in which basic scientific research precedes applied research and development and eventually production of a new good or process.

Bush hardly advocated a government takeover of science, however. Government should directly support such basic research, he argued, beginning with the creation of a new "independent agency devoted to the support of scientific research and advanced scientific education alone." But he asserted that most applied research and development should remain private, with exceptions for "military problems, agriculture, housing, public health, [and] certain medical research," as well as for capital investments in major facilities the private sector could not afford.

Furthermore, even government’s role in basic science should be limited. He argued vehemently against government control of researchers and excessive bureaucratic oversight. In his words, the agency “should promote research through contracts or grants to organizations outside the Federal Government. It should not operate any laboratories of its own.” Moreover, the “control of policy, personnel, and the method” ought to be left to the scientific community and America’s many and diverse institutions of research and higher education.

While never fully implemented in all its details, Bush’s report left an indelible mark on science policy after the war. In addition to the 1950 creation of the National Science Foundation, partly modeled on Bush’s proposed National Research Foundation, presidents and legislators of both parties began making long-term commitments to research that allocated billions of dollars, just as Bush suggested. The era of “Big Science” that began with and continued after the Second World War — national laboratories and expensive, high-profile scientific and technological projects, such as the race to the moon, the internet, and the Human Genome Project — was in crucial respects a legacy of Bush. This is somewhat ironic given the relatively limited role he envisioned for government. Still, the postwar alliance between government and science was effective at establishing American scientific leadership during a time when the American economy was booming relative to the rest of the world. American scientists received Nobel Prizes in the hard sciences and became global leaders in the production of frequently cited patents, useful inventions, and breakthrough pharmaceuticals. If in the late 19th century America lagged behind its European counterparts in scientific achievement, it had by the postwar period more than made up that deficit.

STALLED PROGRESS

America remains a global leader in science and technology, but there are distressing signs that progress is slowing. Consider physics, perhaps the science most responsible for the modern technological era: Although the number of doctorates in physics has increased by leaps and bounds, many other metrics in the field are trending downward. A survey of physicists by Patrick Collison and Michael Nielsen, published in the Atlantic, shows that those in the profession believe the most significant discoveries happened between the 1910s and 1960s. As physicist Lee Smolin has argued in his book The Trouble with Physics, there have been no fundamental changes to the standard model of elementary particle physics established in the early 1970s. There are other signs that progress has stalled: Only three Nobel Prizes in physics have been awarded for work done after 1990.

Researchers in the biological sciences writ large have made some important advances in recent years — gene editing and the sequencing of the human genome are huge accomplishments — but there are worrying trends here as well. As in physics, many of the truly foundational discoveries of the life sciences, such as evolution through natural selection and the identification of the structure of DNA, occurred during the 19th or 20th centuries. Research spending on health topics has expanded by every possible measure, but the average age at which health scientists receive their first grant has risen from the mid-30s to 43; this increase has occurred despite a finding from the National Academy of Sciences that a significant number of groundbreaking discoveries are made by researchers in their 20s and 30s. Progress in health indicators also shows signs of reversing on a global level. A special issue of the Lancet published in late 2018 found that the global non-fatal disease burden is increasing while the rise in life expectancies is slowing. In the United States, a variety of causes (most prominently the opioid epidemic) have even decreased life expectancy somewhat.

What about technology? Despite important advances in consumer electronics, data analytics, and communications technology, worker productivity growth — which has historically been driven in large part by technological advancement — is sluggish. According to the Department of Labor, the economic expansion that followed the 2007-2009 recession has tied a record for the slowest productivity growth in the post-World War II era.

In sum, despite popular notions about sweeping scientific and technological changes and the high esteem in which the general public continues to hold scientists, ours is not an age of dramatic progress. Though some scientific and technological advances have been achieved in the last several decades, they do not amount to a revolution.

America may still lead the world in science and technology, but progress is slower than it was before, and other countries may eventually figure out how to surpass American scientific advancement. Fixing this state of affairs will require a return to the fundamentals that made America a world leader in science and technology in the first place: permissionless innovation, a multifaceted university system, and public support that emphasizes basic research.

NO PERMISSION NEEDED

Permissionless innovation is essential to technological progress. If those with vested interests must authorize every innovation, the system will have a natural bias against anything that might pose a danger to those interests. While interested parties are sometimes driven by sympathetic or even admirable motives, such as protecting public safety or the environment, they can also be motivated by protectionism or rent-seeking.

The so-called "precautionary principle" — popular on the American environmental left and standard in the regulatory framework of the European Union — holds that governments should ban any potentially dangerous activities until all reasonable doubt can be removed as to their safety. Evidence that we are moving away from permissionless innovation and toward the precautionary principle abounds. Consider ride-sharing companies, which offer a much better version of a badly needed service than traditional taxis provided. From their earliest days, they have had to fight tooth and nail for the legal right to operate and have been banned outright in certain places. Efforts to make serious use of massive data troves (big data), cutting-edge biotech innovations, and artificial intelligence face similar barriers, or will in the near future.

Some technologies are regulated even before they become commercially viable. For instance, states such as California have threatened to impose restrictions on autonomous vehicles, which would make it far more difficult for these technologies to develop. Other bans may seem harmless or even laudable, such as the bans on “internet hunting,” which involves using remote-controlled guns to shoot captured animals. The idea is offensive to most people, and Texas promptly banned the practice when one such operation was launched in the state. Despite there being next to no expressed interest in such a service, 38 states have already banned it. Whatever the motivation, the general trend toward impeding the emergence of new technologies, rather than regulating them when negative externalities have been shown to exist, threatens America’s innovative advantage.

There is also reason to believe that our patent system, which has contributed greatly to America’s innovative success, is in need of reform. The Patent and Trademark Office has become too liberal in its granting process, resulting in many bad or useless patents. For instance, Theranos, the much-hyped Silicon Valley startup whose founder is now accused of fraud, secured hundreds of patents despite not creating any useful technology and running nearly all of its tests on existing machines. Meanwhile, misguided court decisions and high litigation costs mean that the patent system, intended to reward innovation, is increasingly being used by industry players to keep out new entrants or to erect other barriers to invention.

New technologies should be "innocent until proven guilty," and the burden of proof should be on those who claim the new technologies pose risks. To put this into action, we should expand the use of "sunrise" reviews that examine the potential impacts of a new regulation before it goes into force. Such reviews are currently applied to professional licensing in 14 states and to significant federal regulations through the Office of Information and Regulatory Affairs; some states maintain similar review bodies of their own. Ideally, such review should take place independently of whatever government body is responsible for the relevant regulation. At the federal level, for example, Congress should have its own capacity to review new regulations proposed by administrative agencies; a Congressional Regulation Office, as advocated by Philip Wallach and Kevin Kosar in these pages, should be created for that purpose. Those states that do not already have a legislative-review process for regulations might consider doing the same.

In some cases, legislators may want to go one step further and, to at least some degree, hold promising new technologies harmless from liability. An example worthy of emulation here is Section 230 of the 1996 Communications Decency Act. Inserted into the U.S. Code when the consumer internet was young, CDA Section 230 establishes the principle that operators of internet forums and search engines may develop their own processes for moderating content to keep out material that is obscene, illegal, or simply disagreeable; however, the internet companies themselves are not deemed "publishers" and thus are not responsible for material that others create. The balance provided by this law was crucial for the success of nearly all the large global internet companies we know today. And it is no coincidence that all of them are American: Few other countries passed laws offering this kind of legal protection.

For emerging technologies that are not yet in widespread use and that pose significant ethical challenges or potentially grave externalities, independent study and investigation may be in order. George W. Bush’s President’s Council on Bioethics, although lacking direct regulatory power, was convened to offer deep thinking and advice to the White House on emerging biomedical science and technology. Similar councils, such as Congress’s now-defunct Office of Technology Assessment, may be needed to advise policymakers on other emerging sciences and technologies. Once a technology is in widespread use, permanent entities with the mission to investigate rather than prohibit — such as the National Transportation Safety Board — can serve a similar function on an ongoing basis without needlessly stifling innovation.

DIVERSITY AMONG INSTITUTIONS 

Nearly all modern institutions of higher education are, internally, far more diverse than they were in the past. Rather than being bastions of white male privilege, they are now among the most inclusive institutions in society, at least when it comes to ethnicity, race, gender, and sexuality. More than half of college graduates are women; Asian-Americans attend and graduate college at higher rates than whites; Hispanics are about as likely to attend college as whites; and African-American rates of college attendance are higher than they were for whites in the not-so-distant past. While some disciplines, including the hard sciences, remain heavily male and disproportionately white and Asian, millions of talented people who would have been denied opportunities in past generations have been able to attend institutions of higher education and make important innovations in the sciences.

But while institutions themselves have diversified, the diversity that once existed among institutions has eroded. Institutions like George Mason University and High Point University, which were once small colleges, have become major universities with impressive science labs, graduate schools, and ambitions to grow even larger and more prestigious. Universities that began with narrow missions, such as the University of California, Davis (originally an agricultural branch campus of the University of California, Berkeley), have broadened their curricula to include nearly every field of human inquiry. For those who run these institutions, such growth is a triumph. Certainly, some of these changes have been for the better and have created more opportunities for students.

Not all institutions have been able to keep up, however; several small and lesser-known colleges have closed in the last half-century. Many more remain financially distressed and hang on by offering vocational training and programs for non-degree-seeking adults. Successful universities have come to be characterized by what sociologists call institutional isomorphism: The structures or processes at these institutions increasingly resemble one another. This one-size-fits-all approach to higher education is bad for scientific progress and innovation.

Indeed, as science spending has risen and the size of the average federal grant has grown, more and more science work has become inaccessible to all but the largest and wealthiest institutions. This in turn leads more institutions to scale up in order to compete for research dollars, thus further homogenizing the sector. This trend was identified as far back as 1961, when Alvin Weinberg, director of the Oak Ridge National Laboratory, warned fellow scientists that “Big Science” could “ruin” science. As he put it,

[O]ne sees evidence of scientists’ spending money instead of thought. This is one of the most insidious effects of large-scale support of science. In the past the two commodities, thought and money, have both been hard to come by. Now that money is relatively plentiful but thought is still scarce, there is a natural rush to spend dollars rather than thought.

There is no question that universities play a vital role in advancing research in the scientific and engineering fields, as do large-scale institutions such as national labs. But research universities must take care not to allow chasing dollars and publicity, or managing large grants, to stifle creativity or divert resources from their core mission: conducting research and educating students. The competition for research dollars and prestige among universities not only increases the number of non-essential “scientific administrators” at these schools, but also crowds out contributions from the smaller institutions that comprise the world of “Little Science,” which still has an important role to play in research.

Policymakers should recognize that though big is good, it is not everything, and trends that encourage every school to become a major research university must be resisted. Smaller and less-well-known institutions can contribute (and have contributed) greatly to the sciences, not only by providing a more diverse marketplace for ideas, but also by educating the next generation of scientists. Indeed, a major report from Oberlin College found that small, selective liberal-arts colleges produce more scientists, on average, than their larger peers. People and places far outside the “Big Science” world also make important advances. Florence Bascom — one of the most important American geologists of the first part of the 20th century, by any measure, and the first woman to earn a doctorate from Johns Hopkins University — spent most of her teaching career at the elite but tiny Bryn Mawr College. And she was tremendously effective there: Students who had taken her courses became an absolute majority of the female members of the Geological Society of America. The achievements of “Little Science” continue to this day: New Orleans’s Xavier University, a historically black school of about 3,000 students with several excellent professional programs, recently developed promising new methods for treating breast cancer.

Rebuilding and sustaining such diversity will be difficult and goes well beyond science policy per se. It is impossible to regulate directly how private schools govern their affairs (though the structure of federal research funding may offer them guidance), but a first step may be to encourage state legislatures to use government support for public institutions as a tool to guide these colleges and universities toward internal distinctiveness. Legislators and governors might also want to consider reviving the almost-defunct "upper-division colleges," two-year institutions created primarily to teach those who have completed community college. Likewise, smaller campuses with great ambitions could be encouraged to emulate colleges such as Miami of Ohio or St. Mary's in Maryland — excellent public liberal-arts colleges focused mostly on undergraduates — rather than trying to transform themselves into one-size-fits-all research universities. And instead of encouraging shiny new programs and initiatives that distract from these institutions' core missions, state legislatures should do what they can to encourage college administrators to improve existing programs and build on areas of excellence.

The goal should be, as Weinberg put it, to “make Big Science flourish without, at the same time, allowing it to trample Little Science — that is, we must nurture small-scale excellence as carefully as we lavish gifts on large-scale spectaculars.” Moreover, it is essential that America’s system of higher education maintain pre-eminence in a wide range of fields. This is especially important in both science- and technology-related fields, which feature separate but mutually interactive research communities. As historian and philosopher of science Thomas Kuhn once observed, few nations have “managed simultaneously to support first-rate traditions in both science and technology.” For those that have, such as Germany in the 19th century and America in the 20th, the “institutional separation” of science and technology “is a likely cause of that unique success.”

A RETURN TO BASIC SCIENCE 

Bush's model of basic-research support has come under fire since The Endless Frontier's publication more than 70 years ago. Indeed, the debate began two decades later, with the publication of studies such as the Department of Defense's Project Hindsight (which criticized basic science) and the National Science Foundation's Technology in Retrospect and Critical Events in Science (which defended it). The upshot of these debates was that the relationship between science and technology is perhaps more complex than many had realized. This has led many in the science-policy community to reject Bush's ideas, especially in recent decades.

Some in the science and technology fields, such as former Harvard University engineering dean Venkatesh Narayanamurti, argue that basic and applied research feed on each other in a self-reinforcing way and that separation undermines real progress. Others, such as Daniel Sarewitz of Arizona State University, have rejected the very idea of curiosity-driven research, asserting that it produces self-referential scholarship and undermines the aims of science. In place of the Bush model, these thinkers argue that scientific institutions — and the federal programs that support them — should instead focus on “missions” to solve specific practical problems facing society, throwing every available tool at them. Some in political life go even further. Senator Elizabeth Warren has proposed taking partial public control of some of the biggest companies in the private sector in order to, among other things, stimulate and direct innovation toward socially desirable outcomes. But these arguments miss the mark.

The truth is that declining relative support for basic research, coupled with large increases in support for applied research and development, has hurt science. Since the late 1980s, and despite large overall increases in federal support for research and development by all measures, a falling share of federal money has been allocated to basic research. And though real-dollar government spending on basic research has increased during the same period in absolute terms, spending on applied research and development has increased at a much greater rate. Many of the best students are attracted to the fields that are best funded in relative terms. Moreover, research spending on health is crowding out spending in other areas — especially the more theoretical disciplines such as physics and chemistry — while the recipients of government research dollars are, as a group, becoming older and increasingly tend to represent larger, wealthier institutions. And industry has not picked up the slack. While the vast majority of research and development spending in the United States comes from industry, less than 5% goes to basic research.

There is a simple, three-part solution to this problem. First, a larger share of non-defense federal funding of research and development should go to basic research. This should be done because curiosity-driven scientific inquiry is both intrinsically and instrumentally valuable: It not only promotes the advance of scientific knowledge, but it can also be useful and beneficial, at least in the long run. It is currently being underserved by the marketplace, however, in part because of the long time scales it requires.

Basic science is a public good. In economic terms, it is non-rivalrous and non-excludable in the same way clean air and national defense are. There is no substitute for public support, nor any viable way for the private sector to provide such goods on its own. This is not to discount the importance of engineering or applied research and development as sources of innovation. But it is to recognize that many of the most transformational technologies of the 20th century — radio, radar, nuclear energy, computing, the transistor, GPS — depended in some way on theoretical insights from physics, chemistry, logic, and mathematics, in many cases arrived at without any regard for future applications. For example, 95 years elapsed between Albert Einstein's first paper on relativity in 1905 and its widespread application in the form of full consumer access to GPS in 2000. There is every reason to expect that at least some transformational technological innovations will follow a similar pattern in the future.

Second, new federal funds that do go to basic research should prioritize hard sciences. In recent decades, health-related fields have consumed a growing majority of the scarce federal dollars devoted to basic research. The budget of the National Institutes of Health has quadrupled in the past 25 years, spurring growth in the life sciences and many other important advances. The life sciences deserve continued (and perhaps even greater) support, but a larger share of federal funds should go to other areas of study, such as physics, astronomy, chemistry, and closely related fields. We are long overdue for genuinely revolutionary insights in our most fundamental theoretical disciplines. This is unlikely to happen without more resources.

Third, federal support for basic research should be decentralized and diversified. The current model is almost the inverse of Bush’s model: He envisioned a single independent agency, staffed by scientists, that would disperse funds widely across a range of nongovernmental institutions, and he assumed these institutions would retain autonomy and control over their research agendas and personnel. What Bush recognized is that science is best conducted in a vibrant marketplace of ideas with minimal bureaucratic oversight.

What is needed today, then, are more and smaller grants dispersed among a wider array of recipients pursuing basic research; a smaller percentage of funding should be allocated to a handful of very large institutions. This can be accomplished in part by devolving more basic research to nongovernment institutions, especially smaller universities and colleges. Funding should be directed toward a wider range of educational and research institutions, as well as to younger principal investigators (a few programs already exist to support this) and high-risk, high-reward projects. Additionally, money that currently supports postdoctoral fellowships might be used to set up small labs in places where they do not exist: liberal-arts institutions, community colleges, perhaps even some science-oriented high schools or new free-standing research labs.

Public research funding is not a zero-sum contest between science and technology. On the contrary, progress in basic science increasingly depends on progress in and interaction with technology and applied research, as critics of the “linear model” rightly point out. Technology and science, basic and applied research, academic and industrial research — these are all distinct but interrelated modes of inquiry. The diverse research communities and institutions that have sustained them for the last century should enjoy more cross-pollination, not less. We need to ensure that all these types of research have the support they need to flourish, rather than the current practice of enabling practically oriented research to eclipse basic research in both government and industry. There is no strong case for cutting applied research in any significant way; it accomplishes much good, and a large share of the government’s applied research deals with intrinsically governmental functions such as national defense and environmental protection. But a larger share of total research spending should go toward basic science.

SUSTAINING INNOVATION

A stronger, better basic-research establishment, bolstered by a multifaceted university system and a culture of permissionless innovation, is of great public utility. While it is true that other countries may benefit from American scientific capital, the United States is uniquely well-positioned to do so as well, for many of the reasons that our country became innovative and inventive in the first place. We still have a more diverse university system and a more permissive culture of innovation than our geopolitical rivals. Moreover, science and technology have become more interpenetrating (which is not to say identical) enterprises in the last century and a half. Reaping the “technological harvest from science,” as political scientist Donald Stokes puts it, thus requires pre-eminence in both science and technology, as well as a social and legal environment conducive to innovation. History shows that countries able to achieve these conditions are the exceptions rather than the rule. Now more than ever, we need public policies that cultivate and sustain every aspect of America’s powerful, yet fragile, ecosystem of discovery and innovation.
