By Josh Claman
The AI revolution is here, with the promise to produce breakthroughs that can save and enhance billions of lives. But the brilliant gleam of that potential is preventing us from clearly seeing a huge concern — we may not have enough electricity to power the growing number of AI-focused data centers.
AI’s voracious appetite for power is driving an unprecedented demand for electricity. By some estimates, if a data center replaced traditional servers with those designed for AI, the power needed would increase 4x to 5x — equivalent to adding a nuclear power station.
Nvidia, the world’s largest maker of AI chips, is expected to sell 3.5 million units of its popular H100 processors this year. Together, those chips would consume more electricity per year than all of the households in Phoenix and more than some small countries like Georgia, Lithuania or Guatemala. And that’s just from one chipmaker.
AMD, Intel and others are also producing AI-optimized chips.
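For a sense of where comparisons like that come from, here is a rough back-of-envelope sketch; the per-chip wattage and utilization figures are assumptions for illustration, not numbers from this article.

```python
# Back-of-envelope estimate of annual energy use for a fleet of AI accelerators.
# Assumed figures (not from the article): roughly 700 W draw per H100 and about
# 61% average utilization, in line with published third-party estimates.

UNITS = 3_500_000          # H100s expected to ship in a year (from the article)
WATTS_PER_CHIP = 700       # assumed power draw per chip, in watts
UTILIZATION = 0.61         # assumed average utilization
HOURS_PER_YEAR = 24 * 365

# watt-hours -> terawatt-hours
energy_twh = UNITS * WATTS_PER_CHIP * UTILIZATION * HOURS_PER_YEAR / 1e12
print(f"Estimated annual consumption: {energy_twh:.1f} TWh")
# ~13 TWh per year, on the order of a small country's annual electricity use
```

Under those assumptions the fleet lands at roughly 13 terawatt-hours a year, which is why comparisons to entire small countries are not an exaggeration.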
To handle that growing demand, data center experts estimate a need for 18 to 30 gigawatts of new capacity over the next five to seven years in the United States alone, and our current infrastructure may not be equipped to handle this surge.
Connecting AI to the grid
Tech companies deserve credit for investing heavily in renewable energy, whether through power-purchase agreements with solar or wind farm operators or by purchasing renewable energy certificates that help fund new renewable generation.
But renewable energy isn’t a great fit for data centers, which need a consistent power source to stay running. Future possibilities might come from nuclear or geothermal sources, but neither is yet available to data centers at commercial scale. And even if we can generate enough power, upgrading transmission and distribution systems in time remains a significant challenge.
The potential adverse effects of the AI boom extend into the communities near these growing data centers. Residents there face disruptions to their quality of life, including construction noise, increased traffic and strain on local resources.
Groups in Northern Virginia have been vocal about the harmful effects of unrestrained data center growth and have pushed for more regulation, similar to that in Singapore and the Netherlands, both of which have imposed moratoriums on new data center builds.
The responsibility for building and maintaining efficient, sustainable data centers falls to their operators. Globally, data centers spend around 40% of their power on inefficient cooling infrastructure. As power becomes the constraining resource, every watt must be efficiently allocated to compute. If the industry doesn’t stay ahead of these concerns, regulators will undoubtedly step in.
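To see what that 40% overhead means in practice: if only 60% of a facility’s power reaches the IT equipment, its power usage effectiveness (PUE, total facility power divided by IT power) is about 1.67. The sketch below uses assumed numbers, including a hypothetical 100 MW grid connection and a PUE of 1.1 for liquid cooling, to show how much compute a fixed power allocation gains when cooling overhead shrinks.

```python
# Rough illustration (assumed numbers, not from the article): how much IT load a
# fixed grid allocation supports as cooling overhead shrinks.
# PUE = total facility power / IT equipment power.

def it_capacity_mw(facility_mw: float, pue: float) -> float:
    """Megawatts left for servers once cooling and other overhead are paid for."""
    return facility_mw / pue

FACILITY_MW = 100                            # hypothetical 100 MW grid connection
air_cooled = it_capacity_mw(FACILITY_MW, 1.67)   # ~40% of power lost to overhead
liquid_cooled = it_capacity_mw(FACILITY_MW, 1.1) # assumed figure for liquid cooling

print(f"Air-cooled (PUE 1.67): {air_cooled:.0f} MW of IT load")
print(f"Liquid-cooled (PUE 1.1): {liquid_cooled:.0f} MW of IT load")
# ~60 MW vs. ~91 MW: roughly half again as much compute from the same grid draw
```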
Governing growth
The speed of this AI-driven paradigm shift has clearly caught legislators and industry leaders flat-footed as they struggle to balance AI’s potential and the public backlash. More comprehensive oversight is urgently needed. The good news: there’s a precedent for that type of action.
For decades, factories polluted the air and water throughout America. Public concern began to spike in the 1960s after Rachel Carson’s book “Silent Spring” highlighted the widespread use of pesticides. Following environmental disasters like a massive oil spill in California and Cleveland’s Cuyahoga River catching on fire because of chemical contaminants, the government passed the National Environmental Policy Act, requiring federal agencies to assess the environmental effects of their actions and decisions. The Environmental Protection Agency was created less than a year later.
We don’t have decades to wait to assess the impact of AI. The current trajectory of AI’s power consumption is economically and environmentally unsustainable. It’s time for our own Environmental Protection Act for AI. Measures such as capping power usage effectiveness and overall power consumption, along with incentives for more sustainable practices such as liquid cooling, would be crucial steps in the right direction.
Precisely because AI will be so beneficial, we must make sure the infrastructure surrounding it is as efficient as possible. The success of AI should not come at the expense of our planet’s well-being. It’s time for stakeholders to prioritize sustainability and work toward a future where AI innovation is both groundbreaking and environmentally responsible. The stakes are high, and the time to act is now.
Josh Claman is the CEO of Accelsius, makers of direct-to-chip, two-phase cooling technology. An advocate for the power of transformative technology throughout his 30-year career, Claman has grown and repositioned businesses at Dell, NCR and AT&T.