A Smarter Way To Train AI

November 7, 2024


Read time: <5 mins

A less wasteful way to train large language models, such as the GPT series, finishes in the same amount of time while using up to 30% less energy, according to a new study from the University of Michigan. The approach could save enough energy to power 1.1 million U.S. homes in 2026, based on Wells Fargo's projections of AI power demand. It could also cut into the International Monetary Fund's prediction that data centers could account for 1.2% of the world's carbon emissions by 2027, along with the water demands that come with that energy use.
