
Training a Neural Network Can Emit More Than 600,000 Pounds of CO2. But Not for Long.

A new technique for training and running a neural network, proposed by researchers at MIT, has a much smaller carbon footprint

Drew Costley
OneZero
4 min read · May 15, 2020


Illustration: Jaedoo Lee

Training and running A.I. at scale requires server farms filled with thousands of computer servers. It takes a massive amount of electricity — and, in most cases, a lot of carbon emissions — to power these operations.

In June 2019, for instance, researchers from the University of Massachusetts Amherst found that training a single natural language processing A.I. model — software used to process and manipulate human speech and text — can emit more than 626,000 pounds of carbon dioxide. That's almost five times the amount of carbon dioxide the average car emits over its entire lifetime. Now, a new paper proposes a way to reduce those emissions.
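The "almost five times" comparison can be sanity-checked with quick arithmetic. The per-model total below comes from the UMass Amherst study; the average car's lifetime figure of roughly 126,000 pounds (including fuel) is an assumption drawn from that same study, not stated in this article:

```python
# Figures: ~626,155 lbs of CO2 to train the NLP model studied by the
# UMass Amherst researchers, vs. an assumed ~126,000 lbs of CO2 emitted
# by the average American car over its lifetime, fuel included.
model_lbs = 626_155
car_lifetime_lbs = 126_000

ratio = model_lbs / car_lifetime_lbs
print(f"Training emits about {ratio:.1f}x a car's lifetime emissions")
```

The ratio comes out just under five, matching the article's framing.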

The paper, published by researchers from the Massachusetts Institute of Technology (MIT) in April, outlines a new technique for training and running a neural network, a set of algorithms loosely modeled after the human brain and used to perform natural language processing and interpret other types of data. They say their method produces about 1/1,300 of the carbon emissions it takes to train and run the…


Published in OneZero

OneZero is a former publication from Medium about the impact of technology on people and the future. Currently inactive and not taking submissions.

Written by Drew Costley

Drew Costley is a Staff Writer at FutureHuman covering the environment, health, science and tech. Previously @ SFGate, East Bay Express, USA Today, etc.
