Commentary by Matt Pearl and Kuhu Badgi
Published October 29, 2025
During the British Industrial Revolution of the late eighteenth and early nineteenth centuries, the United Kingdom experienced an unprecedented period of innovation, technological advancement, and economic growth. The most noteworthy invention of the era was the steam engine, which turned coal and boiling water into motion, pushing pistons, turning wheels, and, for the first time, supplying reliable power on demand independent of natural forces such as wind or water currents. On its own, however, the invention was insufficient to transform the British economy. Early engines were limited in their utility by their great expense and by the imperfect fit between their cylinders and pistons. They also had external effects that needed to be mitigated: coal-fired boilers consumed enormous amounts of fuel, polluted waterways, and contributed to urban smog.
Improvements in manufacturing processes, such as the invention of the cylinder boring machine, produced stronger and cheaper steam engines. At the same time, pollution was addressed through furnaces capable of more complete combustion, more efficient steam boilers, and safer engineering standards. Incremental improvements alone, however, did not make the steam engine scalable: the United Kingdom ultimately relied on complementary technologies, such as advances in metallurgy that produced better rails and the development of the mechanized factory system.
Thus, the steam engine did not singlehandedly achieve a technological revolution. Britain's success rested on its ability to harness the technology, improve it gradually, mitigate the problems it created, and combine it with other breakthroughs. This last factor was critical: the British Industrial Revolution was marked not by one technology alone, but by synergies among several supporting technologies. While today's technologies have advanced far beyond steam engines, the principles of cross-technological integration and diffusion remain just as critical.
U.S. investment in AI technologies has reached unprecedented levels. In 2024, U.S. corporate investment reached $252.3 billion, 13 times what it was just a decade earlier. Just four U.S. companies plan to spend over $350 billion on AI-related data centers in 2025 alone. This surge underscores how central AI has become to U.S. technological competitiveness. The Trump administration has reinforced this commitment through the AI Action Plan, continued implementation of the CHIPS Act, and the U.S. Department of Energy's sharp increase in funding for AI in scientific research. In July, the administration also issued an executive order to facilitate the development of AI data centers.
The United States is fortunate to have this AI investment, particularly because further improvements are needed to prevail in the AI race. Large language models (LLMs) have become indispensable to millions of consumers and businesses, but maximizing their utility requires steps such as reducing hallucinations and expanding their context windows. Further, it is unclear whether AI will attain superintelligence across a significant number of fields without combining LLMs with complementary disruptive innovations in AI.
Even as improvements in AI are necessary, so are efforts to manage its resource demands responsibly. AI already accounts for nearly 20 percent of global data center electricity demand, and some estimates suggest this share could double by the end of 2025, representing almost half of all data center consumption worldwide, excluding bitcoin mining. While precise figures remain uncertain, it is clear that AI's energy footprint is expanding at a pace unmatched by previous technologies.
It is also evident that current U.S. infrastructure cannot keep up with this rate of expansion: data centers already consume over 4 percent of national electricity, a figure that could rise to 12 percent by 2028.
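To put those shares in perspective with a rough back-of-envelope calculation, assuming total U.S. electricity consumption of roughly 4,000 terawatt-hours per year (a commonly used approximation, not a figure from the sources cited here): 4 percent corresponds to roughly 160 terawatt-hours of annual data center demand, while 12 percent would correspond to nearly 500 terawatt-hours, a tripling in only a few years.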
Infrastructure limits extend beyond electricity to technological bottlenecks. Connecting AI data centers enables them to share computational resources and data, but existing fiber routes can prove insufficient. Further, AI scaling is constrained by the supply of advanced chips, particularly graphics processing units (GPUs), whose fabrication is currently dominated by Taiwanese companies. This dependence leaves the United States in a strategically vulnerable position.
Thus, for the United States to stay competitive with China in AI over the next few years, it is essential to meet the infrastructure needs of AI development. Upgrading transmission infrastructure, streamlining permitting and siting systems, and accelerating interconnection can help the United States meet surging demand. Efforts are also needed in other areas, such as streamlining state and local approval of new data center facilities and fiber deployments and building on prior investments in domestic chip manufacturing. Without such initiatives, the rapid prioritization of AI risks running into bottlenecks across energy, land, connectivity, and hardware, as well as constraints yet to emerge.
At the same time, the United States should find ways to address any adverse effects of AI, while ensuring that those measures are well targeted. Regarding resource management, the U.S. tech industry has already invested tens of billions of dollars to ensure that the AI data center buildout draws on renewable sources, and these efforts appear to be accelerating. Additional measures to improve AI's energy footprint include encouraging demand flexibility so data centers can shift loads to periods of abundant renewable generation, as recent CSIS energy analysis has emphasized. Regarding risk, the Trump administration has chosen a targeted focus on the security risks of AI, such as cybersecurity and biosecurity; it should now focus on producing credible, accurate assessments of those risks.
Beyond AI, the United States should invest in disruptive technological innovations that will combine with AI to increase its capabilities and reduce its costs.
A prime example of such an innovation is quantum computing. Its potential for efficiency gains, optimization, and innovation could help address many of AI's energy and computational limits and allow the United States to spark the next wave of industrial transformation. Quantum computing should not be considered a substitute for AI, and it will yield important dividends separate and apart from AI. At the same time, it offers several advantages that could improve the AI systems the United States employs today. Quantum computing excels at optimization problems, making it a promising long-term tool for reducing the energy needs of AI. Specifically, quantum systems can solve certain optimization, simulation, and cryptography problems far more efficiently than classical computers. Quantum computing can also enhance AI and machine learning by generating more reliable data, at larger scale, about complex systems governed by the principles of quantum mechanics, enabling more efficient, accurate AI models with wide applications from finance to healthcare.
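To give a sense of what "far more efficiently" can mean in theory, two textbook results are often cited; both illustrate the potential gap rather than the capabilities of today's commercial hardware. In unstructured search, Grover's algorithm finds a marked item among N possibilities in roughly sqrt(N) quantum steps, versus on the order of N classical steps. In integer factoring, Shor's algorithm runs in polynomial time on a fault-tolerant quantum computer, versus super-polynomial time for the best known classical methods. Realizing advantages like these in practice depends on fault-tolerant hardware that does not yet exist at scale.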
Some leading technology firms have already begun developing the infrastructure needed to connect quantum processors with advanced AI systems. Rather than treating quantum computers as stand-alone machines, industry trends point toward hybrid architectures in which quantum processors work alongside classical supercomputers and AI accelerators. This approach allows each technology to perform the tasks it is best suited for, and it enables AI techniques to support quantum error correction, an essential step for making quantum systems stable and reliable at scale.
Although today's quantum computers still require specialized cooling systems, their computational power scales much faster than their energy use, meaning they are projected to be far more efficient than classical AI hardware as the technology matures. Quantum computing can also help identify obstacles in existing technology systems that cause excess energy consumption. One example is energy forecasting and optimization: when integrated with AI, quantum computing could optimize energy supply and demand by rapidly analyzing grid conditions, identifying potential bottlenecks, and recommending real-time adjustments to energy distribution.
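The scheduling logic behind this kind of load shifting can be illustrated with a deliberately simplified sketch. The short Python example below, built entirely on hypothetical placeholder numbers rather than any real grid data or vendor tool, chooses the hours in which deferrable compute jobs should run, favoring hours with high forecast renewable generation and spare grid capacity. At realistic scale, this becomes the kind of combinatorial optimization problem that hybrid quantum-classical solvers aim to tackle.

    # A toy illustration, not any vendor's API: schedule deferrable data center
    # jobs into the hours with the most renewable generation and grid headroom.
    # All numbers are hypothetical placeholders.
    import itertools
    import numpy as np

    renewables = np.array([0.9, 0.7, 0.2, 0.1, 0.6, 0.8])     # forecast renewable share, by hour
    grid_headroom = np.array([1.0, 0.8, 0.3, 0.2, 0.7, 0.9])  # forecast spare grid capacity, by hour
    jobs_needed = 3                                            # deferrable jobs, one per chosen hour

    # Cost of running a job in a given hour: lower when renewables and headroom are high.
    cost = (1.0 - renewables) + (1.0 - grid_headroom)

    # Brute-force search over hour combinations; at realistic scale this search space
    # explodes, which is where quantum-assisted optimization could eventually help.
    best_choice, best_cost = None, float("inf")
    for choice in itertools.combinations(range(len(cost)), jobs_needed):
        total = cost[list(choice)].sum()
        if total < best_cost:
            best_choice, best_cost = choice, total

    print(f"Run deferrable jobs in hours {best_choice} (total cost {best_cost:.2f})")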
In addition to improving existing AI systems, quantum computing could help solve currently intractable problems and enable the development of new technologies that address limitations in today's AI. These could include the discovery of new materials and drugs, advances in computational fluid dynamics and climate modeling, or the development of energy-saving technologies and more efficient transportation systems. Whatever the application, it is clear that quantum technologies could address some of the world's most pressing challenges with solutions previously considered out of reach for classical computers.
Quantum computing holds immense promise, but it also requires significant development. Given the technology's engineering and scaling constraints, scalable, fault-tolerant quantum systems are realistically achievable only in the latter portion of the decade.
For these reasons, the United States needs to invest resources in the development of quantum computing technologies. While the United States currently leads in private sector funding for quantum development, public underinvestment remains a strategic vulnerability: U.S. public investment amounts to only about a third of China's reported contributions.
The 2018 National Quantum Initiative Act (NQIA) was an important first step, establishing multiple national quantum information science research centers, the NSF Quantum Leap Challenge Institutes, and the Quantum Economic Development Consortium. This year, Congress also provided significant funding through the reconciliation process for the Quantum Benchmarking Initiative, a core component of the U.S. government's support for quantum research and development (R&D). But this momentum will stall without renewal. Congress should pass the National Quantum Initiative Reauthorization Act, which would provide approximately $2.7 billion in federal funding for quantum R&D, workforce development, and industry cooperation. This is a small investment relative to the federal budget, but it is crucial to maintaining U.S. leadership in emerging quantum technologies. The NQIA also enjoys bipartisan support and has been reinforced by the efforts of the CSIS Commission on U.S. Quantum Leadership, which underscored that the United States must significantly increase investments in quantum technologies to ensure technological and commercial competitiveness.
AI development is vital for U.S. technological competitiveness and merits significant public and private investment. However, such development should be pursued with policies that give the United States a sustainable foundation for innovation and secure an enduring edge in the global technology race. To do so, the United States should first implement a variety of interim policies, from permitting reform to green energy incentives, that enable it to keep pace with China over the next several years. To prevail in the long term, however, it will be necessary to support other innovations, such as quantum computing, that complement AI and enable the United States to maximize AI's disruptive impact.
Matt Pearl is the director of the Strategic Technologies Program at the Center for Strategic and International Studies (CSIS) in Washington, D.C. Kuhu Badgi is a program coordinator and research assistant with the Strategic Technologies Program at CSIS.
Commentary is produced by the Center for Strategic and International Studies (CSIS), a private, tax-exempt institution focusing on international public policy issues. Its research is nonpartisan and nonproprietary. CSIS does not take specific policy positions. Accordingly, all views, positions, and conclusions expressed in this publication should be understood to be solely those of the author(s).
© 2025 by the Center for Strategic and International Studies. All rights reserved.