According to Greg Osuri, founder of Akash Network, artificial intelligence is hitting an energy barrier, and as models grow, training could soon require as much power as a nuclear reactor.
In an interview with Cointelegraph's Andrew Fenton at Token2049 in Singapore, Osuri said the industry is underestimating how quickly computing demand is doubling. He noted that data centers already draw hundreds of megawatts, much of it generated from fossil fuels.
Osuri warned that this trend could trigger an energy crisis, driving up household electricity bills and adding millions of tons of new emissions each year.
“We’re approaching the point where AI is killing people,” he said.
How decentralization can alleviate AI power issues
On September 30, Bloomberg reported that AI data centers have driven a surge in electricity costs in the United States.
The report highlighted how data centers have contributed to rising energy bills for everyday households, with wholesale electricity costs skyrocketing 267% over five years in areas near data centers.
Osuri told Cointelegraph that the alternative is decentralization. Instead of concentrating chips and energy in a single mega data center, he said, distributing training across a network of small, mixed GPUs (from high-end enterprise chips to gaming cards in home PCs) can unlock efficiency and sustainability.
“Once the incentives are figured out, this takes off like mining,” he said, adding that home computers could eventually earn tokens by contributing spare computing power.
The vision resembles the early days of Bitcoin (BTC) mining, when ordinary users could contribute their processing power to the network and be rewarded in return. This time, the “mining” is training an AI model rather than crunching cryptographic puzzles.
Osuri said this could give everyday people a stake in AI's future while reducing costs for developers.
Related: Nansen announces AI agents for crypto traders, targeting fourth quarter autonomous trading
Not without challenges
While optimistic about the possibility, Osuri acknowledged that challenges remain. Training large models across a patchwork of different GPUs requires technical breakthroughs in software and coordination, a problem he said the industry is only beginning to crack.
“Around six months ago, several companies began demonstrating some aspects of distributed training,” Osuri said. No one is actually training a model together yet, he added, though this could change “by the end of the year.”
Another hurdle is designing a fair incentive system. “The hard part is incentives,” Osuri said. “How do you reward someone for contributing their computer to training? That’s a harder challenge to solve than the actual algorithmic techniques.”
Despite these obstacles, Osuri argued that distributed AI training is necessary. By spreading workloads across a global network, AI could ease pressure on energy grids, reduce carbon emissions and create a more sustainable AI economy.
Magazine: Growing numbers of users are taking LSD with ChatGPT: AI Eye
