Training and running A.I. at scale requires server farms filled with thousands of computer servers. It takes a massive amount of electricity — and, in most cases, a lot of carbon emissions — to power these operations.
In June 2019, researchers at the University of Massachusetts, Amherst found, for instance, that training and running a natural language processing A.I. model — used to process and manipulate human speech and text — can emit more than 626,000 pounds of carbon dioxide. That’s almost five times the carbon dioxide the average car emits over its lifetime. Now, a new paper proposes a way to reduce those emissions.
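The "almost five times" comparison checks out with back-of-the-envelope arithmetic. A minimal sketch, assuming the roughly 126,000-pound lifetime figure for an average American car (including fuel) that the UMass study used as its baseline:

```python
# Back-of-the-envelope check of the "almost five times" comparison.
# Figures: 626,000 lbs of CO2 for training the NLP model (UMass study,
# rounded), and ~126,000 lbs of CO2 for an average car's lifetime,
# including fuel -- the baseline figure is an assumption drawn from
# the study, not stated in this article.
model_emissions_lbs = 626_000
car_lifetime_lbs = 126_000

ratio = model_emissions_lbs / car_lifetime_lbs
print(f"{ratio:.1f}x the lifetime emissions of an average car")  # ~5.0x
```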
The paper, published by researchers from the Massachusetts Institute of Technology (MIT) in April, outlines a new technique for training and running a neural network — a set of algorithms loosely modeled on the human brain, used for natural language processing and for interpreting other types of data. The researchers say their method produces about 1/1,300 of the carbon emissions required to train and run the neural networks in use today.
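Taken at face value, the 1/1,300 figure can be combined with the UMass estimate above to see what it would imply. A rough sketch — pairing the two studies' numbers this way is an illustration, not a claim made by either paper:

```python
# Illustrative only: applying MIT's claimed 1/1,300 reduction to the
# UMass estimate of 626,000 lbs of CO2 for one NLP model.
baseline_lbs = 626_000
reduction_factor = 1_300

reduced_lbs = baseline_lbs / reduction_factor
print(f"~{reduced_lbs:.0f} lbs of CO2")  # roughly 482 lbs
```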
“I was pretty surprised by the amount of CO2 emissions that modern deep neural networks [a specific type of neural network] have to [use],” Song Han, PhD, assistant professor of electrical engineering and computer science at MIT, tells OneZero.
One reason for the high CO2 emissions is that while nearly every modern device has a computer chip running it — from refrigerators to smartphones to data center servers — these chips are all different, with varying computing power and use cases.
To get A.I. algorithms on each of these devices, software-makers have to build different versions of the same algorithm to handle…