Tuesday, December 16

Investigating environmental consequences, energy consumption of AI


(Kaylen Ho/Daily Bruin staff)


The incorporation of artificial intelligence in college curriculums raises concerns over the technology’s environmental consequences.

MIT News reported that generative AI is currently experiencing a “gold rush” because of its rapid growth and increasing popularity among consumers. However, the computational power used to train and perfect generative AI models requires large amounts of energy, leading to increased carbon dioxide emissions that negatively affect the environment.

AI’s high energy consumption stems from two main sources: the process of generating answers to user queries and the training of the technology.

Jason Cong, a distinguished computer science professor and director of the UCLA Center for Customizable Domain-Specific Computing, said the internal processes that AI uses to produce accurate predictions consume large amounts of energy.

“People like you and I keep asking ChatGPT questions, so it will use a fixed model to answer those questions,” Cong said. “But remember that they have, let’s say, 500 million models to answer your question, they still need to go through every single parameter to do the computation and give you an answer … so that’s where the energy sources come in.”

AI’s main internal operations rely on data centers, which are large facilities of technological infrastructure that store and process information to train and carry out AI’s high-intensity workloads. These data centers store hardware such as servers, storage systems and networking equipment designed for advanced AI and machine learning tasks.

The data centers supporting AI's complex computations require extensive energy and cooling resources – so much so that, in 2023, Pennsylvania State University reported that data centers accounted for around 4.4% of total U.S. electricity consumption.

As AI technology becomes more popular, technology companies will invest in their own energy supplies without much consideration of the environmental consequences, said Eric Fournier, the research director at the California Center for Sustainable Communities within UCLA's Institute of the Environment and Sustainability.

Fournier said AI companies are building new data centers to meet the computing demands of their software. Since these new data centers could overwhelm local utilities' power supplies, he said he predicts major tech companies will start constructing energy-generating facilities for their own use. He added that the lack of regulation regarding private energy production could lead to unintended environmental consequences.

“It is possible that you could see a situation where you have companies that just really are in a race to get to the next best model … and thus start building as much power generation as they can, and not really caring so much about the environmental, the carbon emissions, impact of that,” Fournier added.

Similarly, training AI software demands large amounts of data, which in turn consumes significant energy to process and refine.

In order to train generative AI models like ChatGPT to answer user queries with accuracy, the software studies data sets from the internet, according to the OpenAI website. These data sets consist of a collection of audio, images and text from the web, which the model draws on to make accurate predictions when generating responses.

Cong said the training of AI models differs from training standard search engines like Google, as AI models learn by correcting past mistakes and improving their responses to avoid repeating wrong answers in their outputs.

Unlike AI, search engines are trained using software that scans existing content to make it searchable, rather than using that content to generate new and original responses, according to the International Business Machines Corporation website.

As a result, Goldman Sachs found that ChatGPT uses nearly 10 times as much electricity to generate a response as an equivalent Google search. NPR found that each ChatGPT query uses enough energy to power a light bulb for approximately 20 minutes.

Beyond energy overconsumption, AI systems also consume immense amounts of water, compounding their environmental impact.

According to a 2025 study by researchers at UC Riverside and the University of Texas at Arlington, AI's water usage comes from three sources: on-site data center cooling, off-site electricity generation and indirect water use for server manufacturing.

The study estimated that the GPT-3 model consumes 500 milliliters of water, or around one bottle of water, for every 10 to 50 questions answered. The researchers also found that training the GPT-3 model alone used 5.4 million liters of water to cool a data center, the equivalent of 10.8 million bottles of water.

As a result of high water usage in the training and running of AI operations, the same study projected that AI-related water withdrawal will rise to between 4.2 and 6.6 billion cubic meters globally by 2027, which is more than half of the United Kingdom's total annual water withdrawal.

The environmental impacts of AI become increasingly concerning as AI usage grows among college students, according to a 2025 OpenAI report.

The report found that over one-third of college-aged students use ChatGPT, with one-quarter of them using the site for schoolwork.

Nicole Rod, a first-year chemical engineering student, said she frequently turns to ChatGPT for academic help.

“I do use ChatGPT, probably every day because I do study every day,” Rod said. “I use it to help understand materials when I’m stuck and professors aren’t there to help me out.”

Institutions have also facilitated the increasing prevalence of AI in education.

UCLA has partnered with OpenAI and Google to encourage students and staff to ethically engage with AI for learning and research. According to UCLA Digital & Technology Solutions, UCLA was the first university in California to implement OpenAI’s ChatGPT in its operations and provide students with access to AI technologies such as Google’s Gemini chat.

While colleges increase opportunities for students to engage with AI, Rod said it is important to ensure students are aware of the potential environmental impacts of using such systems.

“I wish there was more information about how ChatGPT is affecting it (the environment), considering that a lot of people use it,” she said.

As universities integrate AI into classrooms, their faculty have also taken on a role in addressing its growing environmental impacts. Cong said several departments at UCLA are working to reduce the environmental impact of AI by developing more energy-efficient algorithms, designing infrastructure that uses less electricity and creating public policy that advocates for energy efficiency.

Cong said his research team is focused on improving the efficiency of AI models, noting that his team’s recent research projects improved the energy efficiency of a new AI model by a factor of 50 by introducing more advanced long-term memory libraries to increase efficiency and decrease energy consumption.

As students continue to utilize AI tools as part of their daily routines, Cong said these hidden environmental costs should be highlighted. Despite the environmental challenges, he added he is hopeful about the future of AI technology.

“I will encourage everyone to don’t be afraid of AI and embracing it,” Cong said. “It’s exciting technology, and whether we like it or not, is going to be with us, so we may just make the best use of that for the benefits of mankind.”

