The 'Achilles Heel Of AI' Is Energy. Fusion Is Needed To Solve The 'Energy Puzzle'

Sam Altman
CEO of OpenAI, former president of Y Combinator

Data centers are physical facilities that house computing resources, such as servers, storage systems, networking equipment, and other critical infrastructure.

Deep inside these buildings, stacks upon stacks of servers are arranged in rows, working in tandem toward one common goal.

Data centers designed to power the internet help store, manage, process, and distribute data.

Data centers for generative AI, on the other hand, require resources for training AI models, running inference, processing data, and more. The hardware in these facilities includes high-performance GPUs; advanced networking to enable fast data transfer between GPUs and storage systems; infrastructure that supports high compute intensity, parallel processing, and high data throughput; and specialized hardware like TPUs designed for deep learning and AI computations.

As a result, data centers for AI can consume a lot more energy than traditional data centers, like the ones that power internet search engines.

In fact, some calculations suggest that a data center for OpenAI's GPT-3.5, the model behind the product that started this generative AI trend, requires 15 times more energy than a traditional data center.

According to calculations, GPT-3, the model that underpinned the first public release of ChatGPT, had 175 billion machine learning parameters and was trained on 10,000 Nvidia V100 GPUs.
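To put that parameter count in perspective, a common rule of thumb from the scaling-laws literature estimates training compute as roughly 6 × parameters × training tokens. The 300-billion-token figure below is the training set size reported in the GPT-3 paper; both numbers should be read as order-of-magnitude inputs, not exact values.

```python
# Rough estimate of GPT-3's training compute using the common
# "6 * N * D" rule of thumb (N = parameters, D = training tokens).
# 300 billion tokens is the figure reported for GPT-3; treat both
# inputs as order-of-magnitude values.

N = 175e9   # model parameters
D = 300e9   # training tokens (reported for GPT-3)

total_flops = 6 * N * D
print(f"~{total_flops:.2e} FLOPs")  # roughly 3.15e+23 floating-point operations
```

That is on the order of 10^23 floating-point operations, which is why tens of thousands of GPUs running for weeks, and the energy to power them, are required.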

It's estimated that the AI required 936 megawatt-hours to power its hardware during training, equivalent to the energy needed to drive an electric car to the moon and back more than six times, a journey of more than 5.4 million kilometers.
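The comparison checks out with a few lines of arithmetic. The electric-car efficiency figure used below (about 0.17 kWh per kilometer) is an assumption typical of modern EVs, not a number from the article.

```python
# Sanity check of the article's comparison: 936 MWh of training energy
# expressed as electric-car driving distance. The EV efficiency figure
# (~0.17 kWh per km) is an assumed value typical of modern EVs.

training_energy_kwh = 936 * 1000      # 936 MWh in kWh
ev_efficiency_kwh_per_km = 0.17       # assumed EV consumption
moon_round_trip_km = 2 * 384_400      # average Earth-Moon distance, there and back

driving_range_km = training_energy_kwh / ev_efficiency_kwh_per_km
moon_trips = driving_range_km / moon_round_trip_km

print(f"{driving_range_km / 1e6:.1f} million km, ~{moon_trips:.1f} Moon round trips")
```

With that assumed efficiency, 936 MWh works out to roughly 5.5 million kilometers, or about seven round trips to the moon, consistent with the "more than six times" figure.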

Because the race is on and AI only continues to advance, the demand for energy keeps increasing.

As OpenAI and others race to make bigger, smarter, and more complex AI models, these companies need to satisfy their AI's hunger for electricity.

This sets up a thorny problem for an industry pitching itself as a powerful tool to save the planet: a huge carbon footprint.

According to Sam Altman, the CEO of OpenAI, the solution is the nuclear fusion reactor.

Altman, who has himself invested in the development of fusion, suggested that the futuristic technology, widely seen as the holy grail of clean energy, will eventually provide the enormous amounts of power demanded by next-generation AI.

"There’s no way to get there without a breakthrough, we need fusion," he said, responding to podcaster and computer scientist Lex Fridman, who asked him how to solve AI’s "energy puzzle."

Alongside that, Altman also suggested an increased focus on other sources of energy, mainly renewables.

Altman previously admitted during the Davos summit in Switzerland earlier in 2024 that the "Achilles heel of artificial intelligence is its energy consumption."

Nuclear fusion is a potentially revolutionary energy source that has been the subject of extensive research.

Inside nuclear fusion reactors, fusion energy is created when two light atomic nuclei combine to form a heavier nucleus, releasing a significant amount of energy in the process.

Unlike the more traditional fission energy, in which the nucleus of an atom splits into two or more smaller nuclei, releasing both energy and radioactive byproducts, fusion energy is much cleaner: not only does it pump no carbon pollution into the atmosphere, it also leaves no legacy of long-lived nuclear waste.

It even happens in nature, inside stars.

In the sun's core, hydrogen nuclei (protons) fuse to form helium, releasing vast amounts of energy as light and heat.
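The energy released per reaction can be verified from first principles. The workhorse reaction pursued in reactors on Earth fuses deuterium and tritium into helium-4 plus a neutron, and the yield follows from the mass defect via E = mc². The atomic masses below are standard published values; this is a back-of-the-envelope sketch, not reactor engineering.

```python
# Energy released by the deuterium-tritium fusion reaction,
# D + T -> He-4 + n, computed from the mass defect via E = mc^2.
# Masses are in unified atomic mass units (u); 1 u of mass
# corresponds to 931.494 MeV of energy.

m_deuterium = 2.014102   # u
m_tritium   = 3.016049   # u
m_helium4   = 4.002602   # u
m_neutron   = 1.008665   # u

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * 931.494

print(f"{energy_mev:.1f} MeV released per D-T fusion")  # ~17.6 MeV
```

About 17.6 MeV per reaction is millions of times the energy released per atom in chemical combustion, which is why fusion is so attractive as a power source.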

If humanity ever wants to solve AI's energy problem, Altman's suggested path is a breakthrough in nuclear fusion, because that approach offers a tantalizing vision of a clean, safe, abundant energy source.

OpenAI's long-term goal is reaching AGI (Artificial General Intelligence), a term for AI that is smarter than humans.

Altman said that future AI will require vast amounts of energy as its capabilities improve and it works to surpass human intelligence.

The only power source capable of meeting this, according to Altman, is nuclear fusion, which mimics the natural reactions that occur within the sun to produce energy.

"Energy is the hardest part," he said, adding that nuclear fusion is fundamental for solving the "energy puzzle" of providing the computing power to develop next-generation AI.

Fusion energy has made significant strides, with numerous breakthroughs and projects showing promise.

While Altman thinks long-term, there are reasons fusion energy is difficult to achieve, and why, after decades of research, practical and economically viable fusion power remains a challenging goal.

Replicating the conditions at the center of the sun is a huge challenge, and no existing technology can sustain them for practical power generation.

Some experts think that fusion is still a few years, if not decades away from being mastered and commercialized on Earth.

Besides OpenAI, Microsoft has also invested heavily in the development of nuclear reactors, specifically to power AI.

"AI will be a powerful tool for advancing sustainability solutions," said a spokesperson for Microsoft, which has a partnership with OpenAI.

The spokesperson also said that the company is "investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application."