The Brief
OpenAI CEO Sam Altman said polite prompts to ChatGPT come with real energy costs.
He estimated that courtesy adds up to “tens of millions of dollars” in electricity.
AI designers say polite prompts also tend to produce more respectful, collaborative responses.
LOS ANGELES – Being polite to ChatGPT might feel like good manners — and according to OpenAI CEO Sam Altman, it’s also a surprisingly expensive habit.
In a post on X (formerly Twitter), Altman revealed that saying “please” and “thank you” to ChatGPT costs the company “tens of millions of dollars” due to the increased computational and energy load required to process polite, often more complex, prompts. Still, he called the expenditure “well spent,” adding cryptically, “You never know.”
Why does politeness cost more when talking to AI?
By the numbers
At the core of ChatGPT and similar AI tools are large language models (LLMs), which rely on thousands of high-performance GPUs to function.
These graphics processing units operate in vast data centers, where they handle billions of user queries daily. According to OpenAI and hardware partner NVIDIA, this setup is essential for the kind of real-time language generation users expect from tools like ChatGPT.
Each AI-generated response comes at a cost, both financially and environmentally. A 2023 study from the University of Washington and the Allen Institute for AI estimated that a single interaction with an LLM can consume up to 0.14 kilowatt-hours (kWh) of electricity, roughly the energy needed to keep 14 LED lightbulbs (about 10 watts each) lit for an hour.
Multiplied across the millions of interactions these systems handle every day, that per-query figure adds up to substantial energy consumption.
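For a rough sense of scale, the back-of-the-envelope sketch below combines the 0.14 kWh upper-bound estimate cited above with an illustrative daily query volume and electricity price; the volume and price are assumptions for the sake of the example, not figures from OpenAI.

```python
# Back-of-the-envelope energy cost estimate (illustrative assumptions, not OpenAI figures)
ENERGY_PER_QUERY_KWH = 0.14      # upper-bound per-interaction estimate cited above
QUERIES_PER_DAY = 10_000_000     # hypothetical daily query volume (assumption)
PRICE_PER_KWH_USD = 0.10         # rough electricity price per kWh (assumption)

daily_kwh = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY
daily_cost_usd = daily_kwh * PRICE_PER_KWH_USD
annual_cost_usd = daily_cost_usd * 365

print(f"Daily energy:  {daily_kwh:,.0f} kWh")      # 1,400,000 kWh
print(f"Daily cost:   ${daily_cost_usd:,.0f}")      # $140,000
print(f"Annual cost:  ${annual_cost_usd:,.0f}")     # about $51 million
```

Under those assumptions the annual electricity bill lands in the tens of millions of dollars, the same order of magnitude Altman described.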
Globally, data centers already account for about 2% of total electricity usage, according to the International Energy Agency (IEA). And with AI use rapidly expanding, the IEA warns that figure could rise significantly if more energy-efficient technologies aren’t adopted.
Does politeness actually matter to AI?
What they’re saying
Microsoft Copilot designer Kurtis Beavers has emphasized the value of respectful prompts, telling Microsoft WorkLab that politeness helps generate “respectful, collaborative outputs.”
FILE – The ChatGPT app displayed on a smartphone. (Photo by Silas Stein/picture alliance via Getty Images)
“When it clocks politeness, it’s more likely to be polite back,” Beavers noted, adding that tone-setting can influence workplace efficiency and user experience with AI tools.
A 2024 survey found that 67% of Americans regularly use courteous language with chatbots. Among them, 55% say they do so because it’s ethically appropriate, while 12% joked that it might one day spare them in the event of an AI uprising.
What’s next
As generative AI becomes more embedded in everyday work and communication, companies may look for ways to make interactions more efficient — or perhaps even adjust pricing or settings based on usage patterns. For now, saying “please” may cost a bit more, but the industry seems to agree it’s worth the tone it sets.
The Source
This report is based on public statements made by OpenAI CEO Sam Altman on X and supporting analysis from Microsoft WorkLab, a Microsoft publication focused on AI in the workplace. Additional data was drawn from energy-use estimates published by academic researchers and the International Energy Agency, and from a 2024 consumer survey on chatbot etiquette.