Regular readers will either appreciate or hate that for roughly 20 months I’ve been decorating my articles and presentations with images generated by artificial-intelligence algorithms. I’m not going to address the full range of objections here; instead, I’ll do some basic math on the electricity use in the hope of getting a subset of those readers to give it up already.
Let’s pretend that OpenAI’s California team and the other almost entirely coast-based, liberal elite, climate-aware developers of LLMs and GIMs are complete idiots. Let’s pretend that they trained their LLMs and GIMs on US grid-average electricity, about 0.4 kg CO2e per kWh. How much carbon debt would accrue? Call it 1 to 10 GWh of electricity to train a model; at 0.4 kg per kWh, that’s 400 to 4,000 tons of CO2e.
So let’s assume that OpenAI and Microsoft were still pretty dim, and positioned all of that computing in an Azure data center that’s only about 45% better than average, at 0.22 kg CO2e per kWh. That 400 to 4,000 tons shrinks to 220 to 2,200 tons of CO2e, or 150 to 1,500 American drivers’ worth, to train the models.
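The training math above can be sketched in a few lines of Python. Note the 1–10 GWh training-energy range is an assumption inferred from the 400–4,000 ton figure at grid-average intensity, not a published number.

```python
# Back-of-envelope training emissions under the article's assumptions.
GRID_AVG = 0.4   # kg CO2e per kWh, US grid average
AZURE = 0.22     # kg CO2e per kWh, assumed Azure data center intensity

# ASSUMPTION: 1 to 10 GWh of training energy, expressed in kWh.
for kwh in (1e6, 10e6):
    grid_tons = kwh * GRID_AVG / 1000    # kg -> metric tons
    azure_tons = kwh * AZURE / 1000
    print(f"{kwh / 1e6:.0f} GWh: {grid_tons:,.0f} t at grid average, "
          f"{azure_tons:,.0f} t in the Azure scenario")
```

Running it reproduces the article’s ranges: 400 to 4,000 tons on the grid average, 220 to 2,200 tons in the cleaner data center.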
Oh, but wait, we aren’t finished. Now we have to actually run a query or generate an image. Remember how much energy that takes: 0.001 to 0.01 kWh per query. At 0.4 kg per kWh, that’s 0.4 to 4 grams per query. But remember, OpenAI runs its services on Microsoft Azure, and Microsoft is buying GWh of renewable electricity and tons of high-quality carbon credits.