
Politeness to ChatGPT Increases OpenAI's Energy Costs

Published 4 weeks ago · 3 minute read

OpenAI CEO Sam Altman has revealed that the use of polite phrases such as "please" and "thank you" in interactions with ChatGPT is costing the company tens of millions of dollars. This expenditure arises from the increased computational demands required to process these additional words, adding to the energy consumption of the AI system.

Altman addressed the issue on X (formerly Twitter) in response to a user's inquiry about the electricity costs associated with polite language. He humorously noted that these costs, though significant, are "tens of millions of dollars well spent." This remark sparked a broader discussion about why users feel compelled to be polite to a non-sentient AI.

A December 2024 survey indicated that 67% of American users are polite to AI assistants. Of these, 55% do so because they believe it is the right thing to do, while 12% are motivated by a concern that mistreating AI could have future repercussions. This reflects a growing trend of treating AI more like a conversational partner than a mere tool.

Engineer Carl Youngblood suggests that treating AIs with courtesy is a "moral imperative" and a form of personal development, arguing that callousness in daily interactions can erode interpersonal skills. Microsoft’s design manager Kurtis Beavers adds that respectful inputs can encourage more collaborative and refined responses from generative AI, suggesting that politeness can enhance the user experience.

The computational cost of processing polite language is not trivial. Every extra word is broken into tokens that the model must process, so each additional phrase increases the compute required, placing greater demand on data centers and their cooling systems and driving up electricity usage. For users paying for access, polite phrasing can also slightly increase costs, since some paid offerings bill by token usage, which is influenced by word count.
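As a rough illustration of that token overhead, the sketch below uses OpenAI's open-source tiktoken tokenizer to compare a terse prompt with a polite one. The example prompts and the choice of the cl100k_base encoding are assumptions for demonstration; this is not how OpenAI meters its own systems.

```python
# Illustrative sketch: count the extra tokens that polite phrasing adds.
# Assumes the tiktoken package and the cl100k_base encoding; prompts are made up.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

terse = "Summarize this article."
polite = "Could you please summarize this article? Thank you!"

terse_tokens = len(enc.encode(terse))
polite_tokens = len(enc.encode(polite))

print(f"Terse prompt:  {terse_tokens} tokens")
print(f"Polite prompt: {polite_tokens} tokens")
print(f"Extra tokens from politeness: {polite_tokens - terse_tokens}")
```

Each of those extra tokens is one more unit of work for the model, which is where the marginal energy cost comes from.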

Despite the costs, Altman's stance suggests that OpenAI prioritizes building intuitive, human-like experiences with its AI. He estimated in April 2025 that ChatGPT has close to 800 million weekly active users, about 10% of the global population. This surge in activity, driven by viral features, has significantly increased the platform's operational demands. OpenAI faces the challenge of balancing the growing popularity of ChatGPT with the environmental and financial strains of maintaining large-scale AI operations.

Debate continues over the actual electricity consumption of ChatGPT queries. A September 2023 research paper estimated that a single query requires around three watt-hours of electricity, while other data suggests a lower figure of approximately 0.3 watt-hours due to more efficient models and hardware. Nevertheless, the cumulative effect of millions of users employing polite language contributes noticeably to OpenAI's energy bill.
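To see how much the two per-query estimates diverge at scale, the back-of-envelope sketch below multiplies each figure by an assumed daily query volume. The one-billion-queries-per-day number is a hypothetical placeholder, not an OpenAI disclosure.

```python
# Back-of-envelope sketch under stated assumptions: scale per-query
# watt-hour estimates to a hypothetical daily query volume.
QUERIES_PER_DAY = 1_000_000_000  # assumed volume, for illustration only
ESTIMATES_WH = {"2023 paper (~3 Wh/query)": 3.0, "newer estimate (~0.3 Wh/query)": 0.3}

for label, wh_per_query in ESTIMATES_WH.items():
    daily_mwh = QUERIES_PER_DAY * wh_per_query / 1_000_000  # Wh -> MWh
    print(f"{label}: ~{daily_mwh:,.0f} MWh per day")
```

Under these assumptions the two estimates differ by a factor of ten, which is why the true cost of a few courteous words remains contested even as the aggregate bill grows.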

Social media responses to Altman's revelation have varied from humorous to cautionary, with some users joking about potential AI uprisings and others questioning the lack of solutions to reduce electricity costs on courtesy words. Despite these concerns, Altman maintains that the investment in user-friendly AI is worthwhile.
