The recent introduction of ChatGPT marks a new era in the history of computation, with a rapid expansion in complex new artificial intelligence (AI) models pushing the boundaries of what computers can do. For some, this opens the door to exciting new advances that hold the promise of a better future. For others, alarm bells are going off, with fears over everything from lost jobs to malevolent AI as an existential threat to humanity. Further concerns focus on the infrastructure of an AI-based economy, such as the environmental impact of new energy-intensive computing methods. While it is prudent to assess the emerging model of AI computing, analyses must examine the relevant tradeoffs in order to determine both the costs and benefits associated with the AI computational revolution.

This is the first in a series of “Real Solutions” posts that will explore some of the relevant questions around AI as the new standard of computing.

The computer era began with the large mainframe computers of the 1950s, which remained the mainstay of businesses until the 1980s. The desktop then ushered in a new generation of personal computers that were smaller, cheaper and easier to use. These new computers quickly gained popularity, dominating both business and the new market for personal use. In the early 2000s, the internet and cloud computing provided a new means for businesses and individuals to access additional computing power through online storage, online applications and more. Today’s AI revolution is founded on accelerated computing—the latest advance in computer technology that is driving the next generation of computational power, with rapid advances in artificial intelligence, machine learning and real-time data analytics.

Accelerated computing relies on specialized hardware, known as accelerators, to speed up the execution of certain tasks. Accelerators such as graphics processing units (GPUs) work in tandem with traditional central processing units (CPUs). Originally designed for graphics, GPUs excel at performing many calculations simultaneously, which has made them critical in machine learning and other applications. Many GPUs can be run in parallel to substantially enhance computational power. Housed in massive data centers alongside servers, routers and data storage, GPUs open the door to computational feats well beyond the scope of the traditional computer.
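For a concrete sense of what offloading work to an accelerator looks like, the brief sketch below is illustrative only; it assumes the PyTorch library, which is not referenced in this post, and simply shows the same calculation running first on a CPU and then on a GPU that spreads the work across thousands of parallel cores.

```python
# Illustrative sketch, assuming the PyTorch framework (an assumption for this
# example): the same matrix multiplication run on a CPU, then offloaded to a GPU.
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# Run on the CPU.
cpu_result = a @ b

# Offload to a GPU if one is available; the computation is identical, but the
# accelerator executes it across thousands of parallel cores, typically far faster.
if torch.cuda.is_available():
    gpu_result = (a.cuda() @ b.cuda()).cpu()
```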

The whirlwind introduction of ChatGPT offered the public a striking demonstration of this new world of AI and the power of accelerated computing. Yet that is just a hint of AI’s capabilities, as the increase in computational power has implications for virtually every sector of the economy. For example, new models developed for drug discovery can cut the cost and time required to identify new drugs and also create more personalized treatments. Likewise, AI-driven changes in the financial sector are transforming banking, stock trading and insurance. And AI can enhance the accuracy of weather modeling to improve public safety as well as generate more accurate predictions of crop yields. The power of AI can also contribute to major policy questions—for example, by developing more precise climate models.

Yet this advance in computing power is not without cost. It is energy-intensive, with large data centers housing the necessary network infrastructure and computing power. These data centers can be massive structures filled with the GPUs and other hardware needed to train and run AI models, and they place new demands on the electricity grid that will grow as AI is adopted more broadly. The 350 East Cermak data center in Chicago spans over one million square feet—the size of 23 football fields—and the Citadel in Nevada is poised to be the largest data center in the world, covering an expansive 7.2 million square feet. Powering these facilities is not a trivial problem. Estimates suggest that data centers consume up to 3 percent of the nation’s power, a level some have criticized for its environmental impact as well as the challenges it poses to grid management. The rapid adoption of AI may place even greater demands on the grid, making these concerns an important policy question.

Such issues have contributed to calls for pausing or slowing the advance of AI. But given the global push toward AI and accelerated computing, this option is infeasible. Other nations are pursuing their own AI models, and while the United States continues to rank first, China ranks second and is making significant strides with its own AI sector. At the same time, open-source AI modeling is making considerable advances, creating additional demand for data centers and computational power. If the United States is to remain a global leader—and the market for computational power is to be competitive—the demand for data centers will continue to grow, and policymakers must ensure that power demands can be met in the most efficient manner possible.

Additionally, it is not obvious that more dispersed computational activity is inherently more environmentally friendly than data centers running large AI models. Achieving the same level and quality of output would require substantially more CPU-based computation over a longer period of time. Faster drug discoveries and better climate models that can vastly improve social welfare would be delayed or rendered infeasible, generating demonstrable social harms. Indeed, a substantial number of lives could be enhanced or saved through new AI-driven computations, and slowing or restricting AI use would have a significant impact on society.

AI boosts productivity, thereby increasing economic growth. Therefore, the proper calculation when examining its merits is the cost of achieving the same level of growth and output in its absence: What would it cost (and how long would it take) to provide the same benefits in a more distributed CPU-based world? This is the relevant tradeoff that must be evaluated, and the costs of pausing AI must be acknowledged.

Next Steps

Addressing the demands on the grid as AI becomes more integrated into economic activity is a challenge that policymakers must resolve. Here, too, AI can be a positive influence, helping to manage the power grid and create a smarter, more efficient system. Even without AI and the increased demand from data centers, grid management has been a challenge, and researchers suggest focusing on how AI can be used to improve grid management and reduce energy consumption.

There are several important next steps:

First, identify how AI can enhance grid management by creating better opportunities for AI-powered energy storage systems, better management of renewable loads on the grid and cleaner electricity generation at power plants. Second, utilize AI to reduce energy consumption in critical sectors such as manufacturing and transportation by eliminating energy waste in factories and other buildings and by improving traffic management. Finally, optimize AI models themselves to make deep learning less energy-intensive. One study suggests that better-optimized models can reduce the energy required to train an AI model by 75 percent.

Accelerated computing and the rise of AI hold great promise for the future, with significant societal benefits in terms of economic growth and social welfare. Achieving these gains requires a prudent policy framework that can address issues surrounding energy requirements for the data centers providing computational power for AI models. Rather than a pause, policymakers need to help realize the potential for gains from AI and find ways to address growing energy demand in this important economic sector.
