Training a Neural Network Can Emit More Than 600,000 Pounds of CO2. But Not for Long.

A new technique for training and running a neural network, proposed by researchers at MIT, has a much smaller carbon footprint

Drew Costley
OneZero


Illustration: Jaedoo Lee

Training and running A.I. at scale requires data centers filled with thousands of servers, and powering these operations takes a massive amount of electricity — which, in most cases, means a lot of carbon emissions.

In June 2019, researchers from the University of Massachusetts, Amherst found that training and running a natural language processing A.I. model — a type of model used to process and manipulate human speech and text — can emit more than 626,000 pounds of carbon dioxide. That's almost five times the amount of carbon dioxide the average car emits over its lifetime. Now, a new paper proposes a way to reduce those emissions.
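The "almost five times" comparison can be checked with back-of-the-envelope arithmetic. The ~126,000-pound figure for a car's lifetime emissions (including fuel) used below is the estimate cited in the UMass Amherst study; treat it as an assumption rather than a figure from this article.

```python
# Rough arithmetic behind the "almost five times" comparison.
model_emissions_lbs = 626_000  # CO2 from training and running the NLP model
car_lifetime_lbs = 126_000     # assumed average car lifetime CO2, incl. fuel

ratio = model_emissions_lbs / car_lifetime_lbs
print(f"{ratio:.1f}x the average car's lifetime emissions")  # ≈ 5.0x
```

The exact ratio is about 4.97, which rounds to the "almost five times" figure quoted in the study's press coverage.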

The paper, published by researchers from the Massachusetts Institute of Technology (MIT) in April, outlines a new technique for training and running a neural network, or a set of algorithms loosely modeled after the human brain that are used to perform natural language processing and interpret other types of data. They say their method uses about 1/1,300 of the carbon emissions it takes to train and run the…


Drew Costley is a Staff Writer at FutureHuman covering the environment, health, science and tech. Previously @ SFGate, East Bay Express, USA Today, etc.