Commentary

Is AI Poised To Become A Top Carbon Culprit?

Microsoft’s new artificial intelligence (AI) model, Phi-2, is a small language model compact enough to run on devices. The company said it can outperform other language models from companies such as Meta and Google -- but not without a consequential tradeoff: its potential carbon footprint.

Phi-2 is a 2.7 billion-parameter language model that has demonstrated “state-of-the-art performance,” Microsoft says, compared with other base models on complex benchmarks that measure reasoning, language understanding, math, coding and common-sense abilities.

Microsoft released the technology in the Microsoft Azure AI Studio’s model catalog, meaning that it is available now for researchers and developers looking to integrate it into third-party applications.

The focus is on training data to improve model performance, which translates into better performance for ad servers and other platforms built on the technology.

Phi-2, Microsoft's smallest language model, is said to deliver improved common-sense reasoning and language understanding at higher speeds.

Phi-2 is similar to Gemini Nano, the AI model Google unveiled earlier this month alongside its latest AI chips and hypercomputer -- a version of the company's most advanced model that can also run on small devices.

A study published in October estimates the AI industry could consume as much electricity as a country the size of the Netherlands by 2027, as Microsoft, Google, OpenAI and others develop more sophisticated AI systems.

As hypercomputing and other sophisticated technologies are increasingly used to query, search and serve ads, the advertising industry will need to make additional concessions or changes to reduce carbon emissions if it wants to reach and maintain net-zero goals.

More energy is required as the technology processes information at greater speed. As more data is used and stored, advanced cooling and data systems are required. All of this processing and storage results in higher carbon emissions.

The amount of carbon emissions released into the atmosphere will continue to increase unless companies take additional measures to reduce it. 

The quality of the training data plays a critical role in model performance, per Microsoft. The training data contains synthetic datasets created specifically to teach the model common-sense reasoning and general knowledge, spanning science, daily activities, and theory of mind, among other areas. It is augmented with web data filtered for educational value and content quality.

Some experts are leery of the contributions made by AI. Many believe AI can help solve climate challenges, but the technology’s enormous use of energy -- along with its carbon emissions, e-waste, and water consumption -- can overshadow the benefits.

One ChatGPT query, for example, can generate 100 times more carbon than a regular Google search, according to data cited by Niklas Sundberg, author of the article “Tackling AI’s Climate Change Problem.” 

Training and running AI systems require a great deal of computing power and electricity, which is likely one reason why Bill Gates invested in building a new nuclear power plant in Kemmerer, Wyoming.

“OpenAI’s GPT-3 model is estimated to have used the equivalent of 120 average U.S. households’ annual energy consumption,” Sundberg writes. “An average data center, critical infrastructure for AI, consumes the equivalent of heating 50,000 homes yearly.”
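For a rough sense of scale, here is a back-of-envelope sketch of the first figure. It assumes an average U.S. household uses roughly 10,500 kilowatt-hours of electricity per year (broadly in line with recent U.S. Energy Information Administration averages; the exact value varies by year and region):

```python
# Back-of-envelope check on the "120 average U.S. households" comparison.
# Assumption: ~10,500 kWh of electricity per household per year (approximate;
# actual figures vary by year and region).
avg_household_kwh_per_year = 10_500
households = 120

total_kwh = avg_household_kwh_per_year * households
print(f"{total_kwh:,} kWh ≈ {total_kwh / 1_000:,.0f} MWh ≈ {total_kwh / 1_000_000:.2f} GWh")
# Output: 1,260,000 kWh ≈ 1,260 MWh ≈ 1.26 GWh
```

By that rough math, the comparison puts GPT-3’s training energy on the order of 1.3 gigawatt-hours.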

Sundberg proposes that AI technologists adopt practices that promote sustainability and actively seek opportunities to reduce the environmental footprint created by the technology.

Sundberg, a board member of SustainableIT.org and chief digital officer at Kuehne+Nagel, a global transport and logistics company, details best practices for sustainable AI in his book, which outlines the three R’s: relocate, rightsize, and re-architect.

In his article, Sundberg discusses environmental costs related to complex large language models (LLMs), data and storage processing, energy sources, water consumption, and hardware.

He also believes companies need to improve sustainability through data management. In his article, he cites 2022 estimates that the world generated 97 zettabytes, or 97 trillion gigabytes, of data that year -- a figure projected to rise to 181 zettabytes by 2025. All that data requires energy, and all that energy increases carbon footprints.
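To put those units in perspective, here is a quick conversion using decimal (SI) prefixes, where one zettabyte is 10^21 bytes and one gigabyte is 10^9 bytes:

```python
# Zettabytes to gigabytes, using decimal SI prefixes.
# 1 ZB = 10**21 bytes and 1 GB = 10**9 bytes, so 1 ZB = 10**12 GB (one trillion GB).
GB_PER_ZB = 10**12

for zettabytes in (97, 181):
    gigabytes = zettabytes * GB_PER_ZB
    print(f"{zettabytes} ZB = {gigabytes:,} GB ({gigabytes // 10**12} trillion GB)")
# Output:
# 97 ZB = 97,000,000,000,000 GB (97 trillion GB)
# 181 ZB = 181,000,000,000,000 GB (181 trillion GB)
```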

Most of this data is used once and never again -- yet it gets saved on servers that take up space and consume huge amounts of electricity, he explains.

AI’s energy demands will only increase, and Sundberg is not the only one with a cautionary tale. A recent study by Alex de Vries, a Ph.D. candidate at the VU Amsterdam School of Business and Economics and founder of the research company Digiconomist, forecasts that by 2027 Nvidia's new AI servers could consume more than 85.4 terawatt-hours of electricity annually -- rivaling the annual electricity use of countries such as Sweden and Argentina.
