“The United States is in a race to achieve global dominance in artificial intelligence. Whoever has the largest AI ecosystem will set global AI standards and reap broad economic and military benefits.”
– America’s AI Action Plan, July 25, 2025
That’s the U.S. government’s own language. An arms race.
Artificial intelligence is no longer framed as a research project or an economic opportunity. It is being cast as a struggle for survival and global power, a modern Manhattan Project.
Yet just last week, on Aug. 26, the Congressional Research Service released a Frequently Asked Questions memo designed to help lawmakers get on the same page about the basics: what a data center is, how many exist, and how much electricity data centers consume.
If even government institutions are still in the process of aligning their understanding, it’s clear that citizens will need to move quickly to understand what is happening and what it means for their daily lives.
The memo laid out in plain language what many assumed lawmakers already understood.
A data center is a specialized building that houses thousands of servers. There are about seven thousand worldwide, with the largest concentration in the United States, especially in Northern Virginia and Texas. In 2022, American data centers consumed about 176 terawatt-hours of electricity—roughly 4 percent of all U.S. demand, more than many entire states. Projections suggest an additional 35 to 108 gigawatts of demand by 2030. A mid-range estimate, 50 gigawatts, is enough to power every home in California.
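The scale of those numbers is easier to feel with a quick back-of-envelope check. The sketch below uses the memo's figures plus two assumptions of my own: total U.S. electricity demand of roughly 4,050 terawatt-hours in 2022, and an average household draw of about 1.2 kilowatts.

```python
# Back-of-envelope check of the data-center figures.
# US_TOTAL_TWH_2022 and AVG_HOME_KW are rough assumptions, not CRS numbers.

DATA_CENTER_TWH_2022 = 176   # from the CRS memo
US_TOTAL_TWH_2022 = 4050     # assumed total U.S. electricity demand, 2022

share = DATA_CENTER_TWH_2022 / US_TOTAL_TWH_2022
print(f"Data-center share of U.S. demand: {share:.1%}")

# How many homes could 50 gigawatts of continuous demand power?
AVG_HOME_KW = 1.2            # assumed average household draw (~10,500 kWh/yr)
homes_millions = 50e6 / AVG_HOME_KW / 1e6
print(f"Homes served by 50 GW: roughly {homes_millions:.0f} million")
```

The share comes out near the memo's 4 percent, and 50 gigawatts works out to tens of millions of homes, comfortably more than California's roughly 13 million households.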
The very fact that such a memo was necessary highlights a structural reality: the pace of technological build-out is outstripping the pace of legislative comprehension. If institutions themselves are still catching up, it underscores how important it is for citizens to get informed now, before the costs mount even higher.
While Congress is being briefed on “Data Centers 101,” the executive branch has been preparing all year for the AI race that is already underway:
On January 20, 2025, the White House declared a National Energy Emergency.
On April 8, an order was issued to strengthen grid reliability, with the Department of Energy (DOE) tasked to model how AI demand would reshape the grid.
Four months later, on July 2, DOE’s report warned bluntly: “Retirements plus load growth increase risk of outages by 100x. Status quo is unsustainable.”
Just weeks later, on July 23, a new order accelerated federal permitting of data centers, opening federal lands to construction. And on July 25, the White House released America’s AI Action Plan, framing AI as the next great geopolitical race.
Energy Secretary Chris Wright put it plainly: “We are taking a bold step to accelerate the next Manhattan Project—ensuring U.S. AI and energy leadership.” So on one side of our government, institutions are receiving crash courses on the fundamentals. On the other, the executive branch is already issuing a call to arms.
For many Americans, the gap between government priorities and local realities shows up in one place: the monthly electric bill. In Columbus, Ohio, reports show that households on standard utility plans saw increases of about $20 a month (roughly $240 a year) linked directly to AI data centers. In New Jersey, Pennsylvania, and Ohio this summer, bills jumped by $10 to $27 a month.
In Oregon last year, utilities warned regulators that consumers needed protection from rate hikes caused by data centers. And in the Mid-Atlantic, regulators cited data centers as one of the main reasons for projected 20 percent increases in household electricity costs by 2025.
The complaints about rising bills suggest something deeper: citizens are starting to connect the dots before Washington fully has. If households can feel the costs already, then citizens cannot wait for official briefings; they must demand clarity and prepare themselves.

Part of the confusion comes from the nature of artificial intelligence itself. To most people, AI feels intangible. It lives in the “cloud.” You type a question, get an answer, and never see the machinery behind it. No one sends you a receipt for the power you used to get your answer.
But AI is not weightless. It runs on football-field-sized data centers, packed with servers that must run day and night. These machines use staggering amounts of electricity and water to stay cool. A Google search consumes about 0.3 watt-hours of electricity. An AI chatbot query can use up to ten times more—around three watt-hours.
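Those per-query figures sound small until they are multiplied by real traffic. The sketch below uses the estimates cited above; the one-billion-queries-per-day volume is purely an illustrative assumption, not a reported figure.

```python
# Per-query energy estimates cited in the text (watt-hours).
GOOGLE_SEARCH_WH = 0.3   # rough estimate for one web search
AI_QUERY_WH = 3.0        # rough estimate for one AI chatbot query

ratio = AI_QUERY_WH / GOOGLE_SEARCH_WH
print(f"An AI query uses roughly {ratio:.0f}x the energy of a search")

# Illustrative scale-up: one billion AI queries per day (assumed volume),
# converted from watt-hours to gigawatt-hours.
daily_gwh = 1e9 * AI_QUERY_WH / 1e9
print(f"One billion queries a day: about {daily_gwh:.0f} GWh of electricity")
```

At that assumed volume, chatbot queries alone would draw a few gigawatt-hours every day, which is why the machinery behind the “cloud” shows up on regional grids.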