The Concerning Truth About America's AI Data Centers

It's no secret that artificial intelligence has a PR problem. As everyone tries to discern whether AI is good or bad, the rise of the technology has been fraught with controversy, from generative AI models stealing content to concerns over workers being replaced by machine intelligence. AI doesn't need any more negative press, so the latest news regarding AI data centers and their extensive power usage certainly isn't helping. These power-hungry facilities are already causing environmental damage that is only set to get much worse in the near future, with everything from carbon emissions to exorbitant water usage and noise pollution among the issues raised by AI data center construction.

The world is struggling to keep up with AI's rapid proliferation, and the extensive emissions produced by this emergent tech are just one part of that story. According to Business Insider, "giant warehouses" housing fields of servers are being built in the U.S. at a rate of more than two each week, some of which require as much power as an entire city to run. Traditional data centers are demanding enough, but the facilities built to support AI systems will be even worse. According to Data Center Frontier, whereas traditional data centers require around 30 megawatts of power per facility, AI data centers require around 200 megawatts to power their GPU clusters. Per Business Insider, at the server cabinet level, power consumption will have to climb from 5–10 kilowatts per cabinet to 70–100 kilowatts.
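To put those figures side by side, here's a minimal back-of-the-envelope sketch in Python. The constants are simply the numbers cited above, not measured data, and real-world loads vary widely from facility to facility.

```python
# Illustrative arithmetic only, using the per-facility and per-cabinet figures
# cited above (Data Center Frontier / Business Insider); actual loads vary widely.

TRADITIONAL_FACILITY_MW = 30        # typical traditional data center
AI_FACILITY_MW = 200                # AI data center powering GPU clusters

TRADITIONAL_CABINET_KW = (5, 10)    # per server cabinet today
AI_CABINET_KW = (70, 100)           # per cabinet projected for AI workloads

facility_multiple = AI_FACILITY_MW / TRADITIONAL_FACILITY_MW
cabinet_low = AI_CABINET_KW[0] / TRADITIONAL_CABINET_KW[1]
cabinet_high = AI_CABINET_KW[1] / TRADITIONAL_CABINET_KW[0]

print(f"Facility-level demand: ~{facility_multiple:.1f}x a traditional data center")
print(f"Cabinet-level demand: roughly {cabinet_low:.0f}x to {cabinet_high:.0f}x today's cabinets")
```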

That's a problem in a world where emissions and power usage are becoming increasingly pressing issues. Not only does there seem to be little concern among the companies building these centers, but regulations also seem to be disappearing rather than becoming more robust. Amid longer-term concerns about whether AI will one day overtake humans, the power demands of the technology are often overlooked, but they shouldn't be.

AI data center power demands are significant, and set to get much worse

Tech companies will take advantage of sites that provide enough tax breaks, water, and power to maintain large clusters of data centers, but this often means building the structures near residential areas. Loudoun County in northern Virginia hosts one of the largest concentrations of data centers in the world. The county's official website claims these server farms have helped diversify and bolster the local economy with new jobs and revenue, but that doesn't appear to be the full story.

As Business Insider's investigation showed, the largest data centers can consume as much as 2 terawatt-hours (2,000 gigawatt-hours, or 2 trillion watt-hours) of electricity per year. Meanwhile, a 2025 Data Center Power Report suggests that while these facilities currently account for 3% to 4% of U.S. electricity consumption, that share will increase to 8% to 12% by 2030. If that proves accurate, the authors predict an extra 35 gigawatts of power will be required, which equates to 35 billion watts. Aside from the emissions this will produce, the centers themselves have an immediate impact on the communities around them. These facilities typically run advanced air conditioning systems that not only draw even more power but also add to the already significant hum produced by the servers. This has disrupted the lives of residents across the U.S. who say they're unable to be in their own homes without feeling the constant rumble of nearby data centers.
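For readers keeping track of the units, the short sketch below converts that annual energy figure into an equivalent continuous power draw. It assumes round-the-clock operation over a standard 8,760-hour year and uses only the figures cited above.

```python
# Back-of-the-envelope conversion of the figures above; assumes the facility
# runs continuously over a standard 8,760-hour year.

HOURS_PER_YEAR = 8_760

annual_energy_twh = 2                        # largest facilities, per Business Insider
annual_energy_wh = annual_energy_twh * 1e12  # 1 TWh = 1 trillion watt-hours
average_power_w = annual_energy_wh / HOURS_PER_YEAR

print(f"2 TWh per year works out to an average draw of about {average_power_w / 1e6:.0f} MW")

extra_capacity_gw = 35                       # projected additional demand by 2030
print(f"35 GW of new demand is {extra_capacity_gw * 1e9:,.0f} watts of continuous draw")
```

In other words, a single 2 TWh-per-year facility draws an average of well over 200 megawatts around the clock, in line with the per-facility figures above.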

There's also the issue of water usage. According to a report from the Environmental and Energy Study Institute, larger data centers require up to 5 million gallons of water per day to run their cooling systems, and with more power-hungry AI centers being built, that figure looks set to increase. When you consider that many of these facilities are being built in drought-ravaged areas, the picture only looks bleaker.
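To get a rough sense of scale, the sketch below compares that daily figure to household water use. The 300-gallon-per-day household average is an assumed, commonly cited U.S. figure, not a number from the EESI report.

```python
# Rough scale check on the water figure above. The household average is an
# assumed, commonly cited U.S. figure, not a number from the EESI report.

DATA_CENTER_GALLONS_PER_DAY = 5_000_000
HOUSEHOLD_GALLONS_PER_DAY = 300          # assumed average U.S. household usage

households_equivalent = DATA_CENTER_GALLONS_PER_DAY / HOUSEHOLD_GALLONS_PER_DAY
annual_gallons = DATA_CENTER_GALLONS_PER_DAY * 365

print(f"Daily usage equals that of roughly {households_equivalent:,.0f} households")
print(f"Annual usage: about {annual_gallons / 1e9:.1f} billion gallons")
```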

Regulators should be getting ready for the AI power suck — but they're not

As concerns over climate change continue to increase, things only seem to be getting worse, and AI data centers are playing a big part. Business Insider built a map of facilities that had already been built or were approved to be built at the end of 2024, which ultimately comprised 1,240 sites, nearly four times the number from 2010. Meanwhile, the International Energy Agency's 2025 Energy and AI report details how the rapid advance of AI has resulted in a huge increase in power demand, with data center electricity consumption growing by roughly 12% per year since 2017. That's more than four times faster than the growth of total electricity consumption. All of this means more emissions, and with inadequate regulations in place, that's potentially disastrous for our imperiled climate.
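To see what that growth rate compounds to, here's a minimal sketch. The 2017 baseline and 2024 endpoint come from the figures cited above; everything else is simple compounding.

```python
# Compounding the ~12% annual growth in data center electricity consumption
# cited above (IEA 2025 Energy and AI report) over the 2017-2024 span.

GROWTH_RATE = 0.12
years = 2024 - 2017

multiplier = (1 + GROWTH_RATE) ** years
print(f"At 12% per year, consumption grows about {multiplier:.1f}x over {years} years")
```

That works out to consumption more than doubling over the period, which is part of what makes the lack of regulatory preparation so striking.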

While you would hope regulators are preparing to handle these issues, things actually seem to be moving in the opposite direction, especially in the U.S., which was responsible for 45% of global data center electricity consumption in 2024. In July 2025, President Donald Trump's administration moved to revoke the EPA's "Endangerment Finding," a 2009 declaration which stated that CO2 and other greenhouse gases pose a risk to public health and welfare. That same month, the EPA announced plans to rescind the Greenhouse Gas Reporting Program; if approved, companies in the U.S. would no longer have to report their greenhouse gas emissions each year.

Put simply, it doesn't look as though the exorbitant emissions of these AI data centers will face much regulation. Meanwhile, states are offering tax incentives to encourage companies to build more data centers. While we have no idea what will happen after the technological singularity, we do know that power-hungry data centers being allowed to pump out greenhouse gases unimpeded isn't going to improve the environment or ameliorate the worst effects of global warming.
