AI data centres are notoriously power-inefficient, with each ChatGPT and Gemini query adding to a person's carbon footprint. OpenAI is now keen to play down the energy requirement. Frank ...
Morning Overview on MSN
How much energy does each ChatGPT prompt really use?
Every time someone types a question into ChatGPT, a small but measurable amount of electricity is consumed in distant data centers. The figure for a single prompt sounds tiny, yet at global scale it ...
Sam Altman says a ChatGPT prompt uses "0.34 ...
Sam Altman said the average ChatGPT query consumes about one-fifteenth of a teaspoon of water, on top of the 0.34 watt-hours of electricity needed to power the chatbot.
Altman previously said the average ChatGPT query consumes as much energy as a lightbulb uses in a couple of minutes. But generating realistic video clips with more advanced models like Sora 2 is ...
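For a rough sense of how that per-prompt figure compounds, here is a back-of-the-envelope sketch: the 0.34 Wh value comes from Altman's quote above, while the daily prompt volume is an illustrative assumption rather than a number reported in these articles.

```python
# Back-of-the-envelope scaling of the per-prompt energy figure.
# 0.34 Wh is Altman's quoted number; the daily prompt volume is an
# illustrative assumption, not a figure from these reports.

WH_PER_PROMPT = 0.34               # watt-hours per ChatGPT prompt
PROMPTS_PER_DAY = 1_000_000_000    # assumed 1 billion prompts per day

daily_mwh = WH_PER_PROMPT * PROMPTS_PER_DAY / 1_000_000   # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1_000                      # MWh/day -> GWh/year

print(f"Daily energy:  {daily_mwh:,.0f} MWh")   # ~340 MWh under this assumption
print(f"Annual energy: {annual_gwh:,.0f} GWh")  # ~124 GWh under this assumption
```

Under that assumed volume the total comes to roughly 340 MWh per day, which is why a figure that "sounds tiny" per prompt still matters at global scale.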
Chatbots can be overly agreeable. To get less agreeable responses, ask for opposing viewpoints, multiple perspectives, and a ...
ChatGPT could soon embrace an open standard that will let users create custom one-word shortcuts for repetitive tasks, and ...
(CNN) — OpenAI is partnering with Broadcom to design and develop 10 gigawatts of custom AI chips and systems, a build-out that will draw as much electricity as a large city. The move ...
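For scale, a quick sketch of what a continuous 10-gigawatt draw amounts to over a year; the per-household consumption figure is an approximate assumption, not taken from the CNN report.

```python
# Rough conversion of a continuous 10 GW draw into annual energy and
# an equivalent number of homes. The ~10,500 kWh/year household figure
# is an approximate assumption, not from the CNN report.

CAPACITY_GW = 10
HOURS_PER_YEAR = 24 * 365
HOUSEHOLD_KWH_PER_YEAR = 10_500    # assumed average annual household use

annual_twh = CAPACITY_GW * HOURS_PER_YEAR / 1_000          # GWh -> TWh
equivalent_homes = annual_twh * 1e9 / HOUSEHOLD_KWH_PER_YEAR

print(f"Annual energy at full utilisation: {annual_twh:.1f} TWh")
print(f"Roughly equivalent to {equivalent_homes / 1e6:.1f} million homes")
```

At full utilisation that works out to roughly 87.6 TWh per year, on the order of several million homes, which is the sense in which the report compares it to a large city.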