Arm CEO Rene Haas cautions that if AI continues to get more powerful without boosts in power efficiency, datacenters could consume extreme amounts of electricity.
Haas estimates that while AI datacenters currently account for a modest four percent of US power consumption, he expects the industry to trend toward 20 to 25 percent usage of the US power grid by 2030, according to a recent report. He specifically lays blame on popular large language models such as ChatGPT, which Haas described as "insatiable in terms of their thirst." The Electricity 2024 report expects power consumption by AI datacenters around the world to be ten times what it was in 2022.
If Google were to switch its search engine entirely to AI software and hardware, its power draw would increase roughly tenfold, according to the report, requiring an extra 10 terawatt-hours of electricity per year. The Electricity 2024 report says government regulation will be necessary to keep the power consumption of datacenters in check. Ireland may even see a third of its electricity used by datacenters in 2026, and the strain there seems to be starting already, with Amazon Web Services servers in the country reportedly affected. Increasing efficiency, as Haas suggests, is one possible solution to the crisis, since it's hard to imagine datacenters reducing power by compromising on performance. But even if AI hardware and LLMs become more efficient, electricity usage won't necessarily go down. The saved energy could simply be used to expand computing capacity, keeping power draw the same.