Up to 30% of the Power Used to Train AI Is Wasted

November 7, 2024


Read time: <5 mins

Imagine cutting the energy used to train AI by up to 30% without sacrificing performance, a saving that could instead power millions of homes. U-M researchers have developed a method that optimizes how AI training tasks are distributed across processors, so far less of that power is wasted. With AI's growing environmental footprint, this could be a major step toward making powerful models greener and more accessible. Could this innovation reshape the future of AI, or are there hurdles left to clear before it becomes the industry standard?
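
The article does not describe how the method works internally. Purely for intuition, here is a minimal sketch of one way distributed training can waste power: when work is split across processors in a pipeline, fast stages finish early and sit idle at full clock speed while waiting for the slowest stage. The sketch below slows the non-critical stages just enough that they finish on time. The stage names, timings, 1980 MHz default clock, and linear slow-down model are all hypothetical illustrations, not the researchers' actual technique.

```python
# Hypothetical illustration only; not the U-M method.
# Assumes a pipeline-parallel training job where some GPU stages finish
# early each step and idle at full power while waiting for the slowest stage.

from dataclasses import dataclass


@dataclass
class Stage:
    name: str
    step_time_s: float        # measured time this stage needs per training step
    max_freq_mhz: int = 1980  # assumed default (maximum) GPU clock frequency


def plan_frequencies(stages: list[Stage]) -> dict[str, int]:
    """Slow non-critical stages so they finish just in time,
    trimming wasted energy without lengthening the training step."""
    critical_time = max(s.step_time_s for s in stages)  # slowest stage sets the pace
    plan = {}
    for s in stages:
        # A stage with slack can run proportionally slower and still finish
        # within the critical stage's time (simplified linear model).
        scale = s.step_time_s / critical_time
        plan[s.name] = int(s.max_freq_mhz * scale)
    return plan


if __name__ == "__main__":
    stages = [
        Stage("embedding", 0.12),
        Stage("transformer_blocks_0_15", 0.20),
        Stage("transformer_blocks_16_31", 0.20),
        Stage("lm_head", 0.09),
    ]
    for name, freq in plan_frequencies(stages).items():
        print(f"{name}: run at ~{freq} MHz")
```

In this toy setup only the slowest stages keep their full clock speed; the lighter stages are throttled, which is why the overall step time is unchanged even though less energy is drawn.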
