
OpenAI CEO Sam Altman admitted that simple courtesies such as saying “please” and “thank you” to his company’s ChatGPT bot are surprisingly expensive, costing the firm substantial sums in electricity.
Altman made the revelation after a user on the social media platform X asked about the financial impact politeness toward AI might have on OpenAI’s operating costs.
Altman replied that the cost was “tens of millions of dollars well spent,” adding cryptically: “You never know.”
Chatbots such as ChatGPT run on large language models (LLMs), which depend on extensive computational infrastructure hosted in data centers.
These models require thousands of high-performance GPUs (graphics processing units) to operate efficiently.
The GPUs perform vast amounts of parallel processing to interpret prompts and generate responses in real time.
Powering these data centers demands enormous amounts of electricity.
It’s estimated that generating a single AI-written response, such as a brief email or paragraph, can consume up to 0.14 kilowatt-hours (kWh) of energy, equivalent to keeping 14 LED bulbs lit for one hour.
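The bulb comparison can be verified with quick arithmetic. A minimal sketch, assuming a typical 10 W LED bulb (a wattage the article does not state):

```python
# Back-of-the-envelope check of the article's claim: 0.14 kWh per AI response
# equals 14 LED bulbs running for one hour.
RESPONSE_KWH = 0.14    # claimed energy per AI-generated response
LED_BULB_WATTS = 10    # assumed wattage of one LED bulb (not from the article)
HOURS = 1              # duration of the comparison

# Convert kWh to watt-hours, then divide by one bulb's consumption over an hour.
bulbs_equivalent = (RESPONSE_KWH * 1000) / (LED_BULB_WATTS * HOURS)
print(bulbs_equivalent)  # 14.0
```

At 10 W per bulb the figures line up exactly; a different assumed wattage would shift the bulb count proportionally.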
When scaled across billions of interactions every day, the cumulative energy usage becomes significant.
Globally, data centers already account for about 2% of total electricity consumption.
With the rapid expansion of AI applications and growing demand for generative AI services like ChatGPT, experts warn that this figure could rise sharply in the coming years.
While some might view courteous interactions with chatbots as unnecessary, several AI experts argue that politeness genuinely improves the quality of AI interactions.
Kurtis Beavers, a director on the design team for Microsoft Copilot, has advocated for respectful prompts, stating they “help generate respectful, collaborative outputs.”
According to Beavers, polite phrasing doesn’t merely reflect good manners but actively influences how the AI replies, setting a more constructive and professional tone for the interaction.
“When it clocks politeness, it’s more likely to be polite back,” according to Microsoft WorkLab, a digital publication produced by the software giant that is dedicated to the integration of AI in the workplace.
Politeness toward AI has become increasingly common.
A 2024 survey found that roughly 67% of American users regularly use courteous language when interacting with chatbots.
Within that group, a majority (55%) believe politeness is simply the right thing to do, while another 12% humorously indicated that their polite language serves as insurance against a potential AI revolt.
