How To Productively Worry About AI Energy Use
OpenAI launched ChatGPT on Nov. 30, 2022. Within five days, the artificial intelligence (AI) chatbot had one million users. Two months later, it had 100 million. The official ChatGPT website reached one billion visits by February 2023; five months later, visits numbered over 9 billion. This explosive growth—along with AI’s high energy consumption—has many people worried. Our goal is not to fuel climate anxiety, but to encourage productive worry about AI energy use.
ChatGPT made a splash with its natural language processing features, but AI can do much more than compose poetry and summarize articles for the “TL;DR” reader. AI analyzes medical imaging results faster and more accurately than humans. It helps farmers predict optimal planting times, fight weeds and reduce pesticide use. Shippers use AI to improve efficiency and predict maintenance needs. From education to disaster response to space exploration, AI is capable of solving big problems.
Natural Incentives for Efficiency
The cost of the electric bill already encourages AI companies to seek efficiencies. An early application of Google’s DeepMind was reducing the power used to cool data centers. Chip developers are constantly improving AI hardware to work faster while using less energy. More recently, DeepMind applied AI to design more efficient computer chips. AI software startup Hugging Face developed a version of a Google language model that is 40 percent smaller and 60 percent faster, achieving nearly the same performance while reducing energy consumption.
But energy efficiency is a double-edged sword. As AI becomes cheaper, it will be used more often. Newer AI applications may use energy more effectively, but they will also add to energy demand. While the overall effect could be higher or lower energy use, efficiency gains will likely lead to higher total AI energy use. It is the familiar rebound effect that economist William Jevons noticed 150 years ago: More efficient steam engines did not conserve coal; rather, they drove coal demand higher. There is a natural incentive for efficiency in deploying AI, but we should not expect it to cut energy use.
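To see why efficiency gains can still raise total consumption, consider a rough sketch of the rebound arithmetic. Every figure below is hypothetical, chosen only to illustrate the dynamic, not to estimate actual AI energy use.

```python
# Illustrative rebound-effect arithmetic. All numbers are hypothetical.
energy_per_query_wh = 3.0      # assumed energy per AI query, in watt-hours
queries_per_day = 1_000_000    # assumed baseline usage

baseline_total_kwh = energy_per_query_wh * queries_per_day / 1000

# Assumed efficiency gain and usage response (the rebound)
efficiency_gain = 0.5   # energy per query falls by half
usage_multiplier = 3.0  # cheaper queries triple usage

rebound_total_kwh = (energy_per_query_wh * (1 - efficiency_gain)
                     * queries_per_day * usage_multiplier / 1000)

print(f"Baseline: {baseline_total_kwh:,.0f} kWh/day")
print(f"After efficiency gain plus rebound: {rebound_total_kwh:,.0f} kWh/day")
# Prints 3,000 kWh/day versus 4,500 kWh/day: each query uses half the
# energy, yet total consumption rises 50 percent.
```

Under these made-up assumptions, halving the energy per query while tripling usage still raises total consumption by half.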
Evaluating Energy Use
A primary tool for productive worry is to frame the question well. If you wonder whether AI uses excessive energy, be sure to ask “Compared to what?” A single chatbot query may require 15 times as much energy as a standard internet search, but the user often gets a quicker answer. It may take a lot of energy to train AI to spot weeds in a soybean field, but the resulting system cuts the amount of energy and pesticides used to grow those soybeans.
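The “Compared to what?” framing can be made concrete with a back-of-envelope sketch. The 15-to-1 ratio comes from the comparison above; the per-search energy figure and the number of ordinary searches a single chatbot answer might replace are assumptions for illustration only.

```python
# Back-of-envelope "compared to what?" comparison. The 15x ratio comes from
# the text above; the per-search energy and the number of searches a single
# chatbot answer replaces are assumptions for illustration only.
search_energy_wh = 0.3                   # assumed energy per standard web search
chat_energy_wh = 15 * search_energy_wh   # chatbot query at 15x a search

searches_replaced = 5                    # assumed searches avoided per chat answer
energy_avoided_wh = searches_replaced * search_energy_wh

net_wh = chat_energy_wh - energy_avoided_wh
print(f"Chatbot query: {chat_energy_wh:.1f} Wh")
print(f"Searches avoided: {energy_avoided_wh:.1f} Wh")
print(f"Net additional energy: {net_wh:.1f} Wh per question")
# Under these assumptions the chatbot answer still costs more energy, but the
# relevant number is the 3.0 Wh net figure, not the 4.5 Wh gross figure.
```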
When considering AI’s energy use, you should also consider all the energy that would have been used in its absence. AI can improve traffic flows while cutting vehicle fuel use and emissions. It can do the same for air traffic control, with similar results. Consider, too, other resources saved by faster and better solutions—for example, when DeepMind cut energy use for cooling data centers, it also cut water use.
Advancing Sustainable Energy
While AI energy use is growing rapidly, many AI companies are taking steps to ensure their operations are powered by renewable or other low-emitting energy supplies. Additionally, AI has been put to work advancing sustainable energy technology and production. There is potential for the use of AI in nuclear fusion research (though even with scientific advances, commercial nuclear fusion may be decades away). More practically, AI has been used to forecast wind farm output, thereby enabling better use of wind energy.
The National Renewable Energy Laboratory uses AI to improve geothermal energy exploration and development. Fervo Energy’s recent success in an enhanced geothermal project is one such application. AI is also helping geologists more effectively locate minerals needed for clean energy technologies.
Finally, AI companies are advancing sustainable energy more directly by buying wind, solar and other zero- or low-emitting energy sources to power their systems. Google will buy power produced by Fervo’s geothermal project as part of its company-wide commitment to 24/7 carbon-free energy. OpenAI runs its work through Microsoft’s Azure Cloud Services, which has committed to using 100 percent renewable energy by 2025.
Location, Location, Location
Much concern about AI energy use focuses on global issues, but AI’s rapid growth has a local side to it, too. Data centers often cluster together to increase processing efficiency; however, some of these clusters—such as one in Northern Virginia—threaten to overwhelm local and regional electric infrastructure. When data centers seek clean energy supplies in areas served by regulated monopolies, they often find the monopoly’s clean energy programs to be smaller and more costly than what competitive clean suppliers offer. One practical approach is to allow data centers and other large power customers to bypass the monopoly and develop or contract for their own power supplies.
Adding It All Up
AI developers see natural incentives to seek out efficiencies; however, such efficiencies are likely to bring significant rebound effects. AI might well displace more resource-intensive ways of solving problems. It is helping find more carbon-free energy sources, but it is also making oil and gas drilling easier. Though overall electricity demand is higher, AI may be helping to conserve water, reduce pesticide use and improve other environmental outcomes. Even if AI developments work out for the best, our already stressed electric power sector may not be able to handle the added demand.
The worry over AI’s energy demand is not about AI per se—it is more about the resources used to generate the power and the pollution that often results. AI is not inherently more polluting than other electric uses; in fact, as noted above, it may be cleaner than average. Concerns about AI energy use should be directed at the environmental impacts of electric power generation in general.
Market-oriented electric power reforms should continue. Market forces and state policies have helped the United States outpace federal climate emission goals, but continued progress requires reforms to generator interconnection procedures, transmission planning and permitting processes. Nearly 90 percent of planned new power plant capacity is zero emission, so reducing barriers to power plant construction will further cut power sector emissions.
Regions that lack regional transmission organizations (RTOs) should develop RTOs or similarly robust, grid-integrated wholesale power markets. Customer choice is the soundest approach for retail electricity, but states with retail monopolies could implement direct access policies that allow large commercial and industrial customers to secure their own clean resources. State policies should also support large energy consumers wishing to build their own power supplies.
AI promises substantial innovation across many fields, including resource conservation and clean energy development. While AI is driving rapid growth in electric use, restricting it would be shortsighted. Instead, policymakers should continue to address existing pollution challenges in the electric power sector while removing regulatory barriers for AI data center developers and others seeking access to clean energy.