Altman's Energy Defense: AI Consumption Compared to Human Habits

Published 7 hours ago · 3 minute read
Uche Emeka

OpenAI CEO Sam Altman recently addressed critical concerns surrounding the environmental impact of artificial intelligence, particularly focusing on water and energy consumption, during an event hosted by The Indian Express in India. Altman, who was in the country for a major AI summit, made strong statements challenging prevailing narratives about AI's resource footprint.

Altman unequivocally dismissed fears regarding AI's water usage as “totally fake,” clarifying that while it was a legitimate issue in the past due to evaporative cooling in data centers, current practices have rendered such concerns obsolete. He directly refuted claims circulating online, such as “Don’t use ChatGPT, it’s 17 gallons of water for each query,” labeling them as “completely untrue, totally insane, no connection to reality.”

While debunking water usage myths, Altman acknowledged that energy consumption presents a “fair” and valid concern, not on a per-query basis, but in total, given the world’s increasing reliance on AI technologies. In response to this growing demand, he emphasized the urgent need for a global transition “towards nuclear or wind and solar very quickly” to power AI infrastructure sustainably.

The discussion also touched on the broader context of data center resource usage, noting that tech companies are under no legal requirement to disclose their energy and water consumption. This regulatory gap has prompted scientists to conduct independent studies, and data centers have also been linked to rising electricity prices in some regions. Altman further responded to a specific claim, raised by the interviewer citing a conversation with Bill Gates, that a single ChatGPT query consumes the energy equivalent of 1.5 iPhone battery charges. He dismissed the estimate emphatically: “There’s no way it’s anything close to that much.”

Altman also argued that many comparisons of AI's energy usage are “unfair.” He particularly criticized those that weigh “how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query.” Offering a provocative counter-argument, he suggested that training a human is itself an immensely energy-intensive process, requiring “like 20 years of life and all of the food you eat during that time before you get smart.” He extended the thought to the vast evolutionary energy expended by billions of humans over history to develop intelligence and societal advancements.

Therefore, from Altman’s perspective, a more equitable comparison would be the energy required for ChatGPT to answer a question once its model is trained versus the energy a human expends to answer the same question. He posited that, by this metric, “AI has already caught up on an energy efficiency basis,” suggesting that AI inference is becoming as energy-efficient, if not more so, than human cognitive inference.
