Each query typed into a large language model (LLM) like ChatGPT requires energy and produces CO2 emissions, but the amount released depends on the model, the subject matter, and the user, according to scientists at Germany's Hochschule München University of Applied Sciences.
The researchers compared 14 models and found that complex answers result in more emissions than simple ones, and that models providing more accurate answers also produce more emissions.
The more an AI model "thinks," the more carbon it emits: reasoning models can produce 50 times more emissions than concise ones. LLMs break text into tokens -- words or parts of words -- which are converted into strings of numbers the model processes to come up with an answer.
This conversion and the subsequent computation produce CO2 emissions, according to the researchers, who measured and compared the CO2 emissions of a variety of trained LLMs using a set of standardized questions.
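To make the token step concrete, here is a minimal sketch in Python. It uses the open-source tiktoken tokenizer as a stand-in (the study's models each use their own tokenizers), and the grams-per-token factor is an illustrative placeholder, not a figure from the research.

```python
# A minimal sketch of how a prompt becomes tokens, and how a token count
# could be mapped to an emissions estimate. The tokenizer (tiktoken) and
# the grams-per-token factor are stand-in assumptions, not values from
# the Hochschule Muenchen study.
import tiktoken

ASSUMED_G_CO2E_PER_TOKEN = 0.005  # illustrative placeholder, not measured data

def estimate_query_emissions(prompt: str) -> float:
    enc = tiktoken.get_encoding("cl100k_base")  # a common GPT-style encoding
    tokens = enc.encode(prompt)                 # words/word pieces -> integer IDs
    return len(tokens) * ASSUMED_G_CO2E_PER_TOKEN

if __name__ == "__main__":
    prompt = "Explain why reasoning models emit more CO2 than concise ones."
    print(f"{estimate_query_emissions(prompt):.3f} g CO2e (illustrative)")
```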
The researchers suggest that users can limit the CO2 emissions caused by AI by becoming more aware of what they type into a chatbot and by adjusting how they personally use the technology.
The study, published in Frontiers in Communication, examined 14 large language models ranging from seven billion to 72 billion parameters, asking each the same 1,000 questions across a range of subjects. Reasoning-enabled models generated an average of 543.5 “thinking” tokens per question, compared with just 37.7 tokens per question for more concise models. The more tokens used, the higher the emissions.
For DeepSeek R1 -- with 70 billion parameters -- answering 600,000 questions would create CO2 emissions equal to a round-trip flight from London to New York. Qwen 2.5 -- at 72 billion parameters -- can answer more than three times as many questions, about 1.9 million, with similar accuracy and the same emissions.
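For readers who want to see the arithmetic, the ratios implied by these reported figures are easy to reproduce. The short Python sketch below uses only the numbers quoted above and treats the flight-equivalent emissions as an abstract unit rather than a measured quantity.

```python
# Back-of-the-envelope ratios from the figures reported in the article.
# The absolute emissions of the London-New York flight are left abstract;
# only the relative comparisons are computed.

reasoning_tokens_per_q = 543.5   # average "thinking" tokens per question
concise_tokens_per_q = 37.7      # tokens per question for concise models
print(f"Token overhead of reasoning models: "
      f"{reasoning_tokens_per_q / concise_tokens_per_q:.1f}x")

# Questions answerable for one flight's worth of CO2 emissions
deepseek_r1_questions = 600_000   # DeepSeek R1, 70B parameters
qwen_2_5_questions = 1_900_000    # Qwen 2.5, 72B parameters, similar accuracy
print(f"Qwen 2.5 answers {qwen_2_5_questions / deepseek_r1_questions:.1f}x "
      f"as many questions for the same emissions")
```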
LLMs are not the only AI-driven platforms responsible for substantial emissions. Carbon accounting specialist Greenly, in its latest study, estimated the growing ecological cost of music streaming, focusing on Spotify's service.
Spotify’s growing use of AI contributes to carbon emissions. Greenly has found that although Spotify includes a section on its carbon footprint in its annual report, the data has been incomplete since 2021.
To estimate how much carbon Spotify’s service emits, Greenly took the company’s last complete dataset, from 2021, and adjusted it to reflect Spotify’s growth in users between 2021 and 2025. By Q1 2025, the streaming company had an estimated 678 million users -- a 67% increase from 406 million in 2021.
Based on this approach, Greenly estimates that Spotify will emit 187,040 tonnes of CO2e in 2025 -- roughly 12 times the most recent carbon footprint of Vatican City. This marks a 67% increase from 112,000 tonnes in 2021, with emissions averaging roughly 1.04g of CO2e per hour of listening.
Greenly estimates that the average user emits around 276g of CO2e per year listening to Spotify. Spread across hundreds of millions of listeners, those grams add up to a significant global total.
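Greenly's approach is essentially a scaling exercise, and the figures quoted above can be reconstructed from one another. The sketch below uses only the numbers in this article; the implied listening volume at the end is a derived figure, not a number Greenly reports.

```python
# Reconstructing Greenly's scaling estimate from the figures in the article.
# Method: take the last complete 2021 footprint and scale it by user growth.

users_2021 = 406e6            # Spotify users, 2021
users_2025 = 678e6            # estimated Spotify users, Q1 2025
footprint_2021_t = 112_000    # tonnes CO2e, 2021

growth = users_2025 / users_2021              # ~1.67 (a 67% increase)
footprint_2025_t = footprint_2021_t * growth  # ~187,000 tonnes CO2e
# (the article's 187,040 t corresponds to a rounded 67% growth factor)

per_user_g = footprint_2025_t * 1e6 / users_2025  # tonnes -> grams, per user
print(f"Estimated 2025 footprint: {footprint_2025_t:,.0f} t CO2e")
print(f"Per user per year: {per_user_g:.0f} g CO2e")  # ~276 g

# Implied listening volume if emissions average 1.04 g CO2e per hour
implied_hours = footprint_2025_t * 1e6 / 1.04     # ~180 billion hours
print(f"Implied listening: {implied_hours / 1e9:.0f} billion hours")
```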
Video carries an even larger footprint. In 2024, Spotify began rolling out video clips for certain tracks -- a shift toward a significantly more energy-intensive format. Streaming video for an hour can generate up to 55g of CO2e, more than 50 times as much as an hour of audio streaming.
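The per-hour gap between the two formats follows directly from Greenly's figures; a minimal check, pairing the 55g-per-hour video figure with the roughly 1.04g-per-hour audio average quoted earlier:

```python
# Comparing the per-hour footprints of video and audio streaming,
# using the figures quoted from Greenly's study.

video_g_per_hour = 55.0    # up to 55 g CO2e per hour of video streaming
audio_g_per_hour = 1.04    # average g CO2e per hour of audio listening

ratio = video_g_per_hour / audio_g_per_hour
print(f"An hour of video emits ~{ratio:.0f}x as much CO2e as an hour of audio")
```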
Today, the feature is limited to Spotify Premium subscribers, who numbered about 268 million as of Q1 2025. According to Greenly, if most users continue to listen passively -- with the app running in the background and the screen locked -- interest in video content may remain limited.