Published in OneZero

Training a Neural Network Can Emit More Than 600,000 Pounds of CO2. But Not for Long.

A new technique for training and running a neural network, proposed by researchers at MIT, has a much smaller carbon footprint

Illustration: Jaedoo Lee

Training and running A.I. at scale requires server farms filled with thousands of machines. It takes a…




Drew Costley

Drew Costley is a Staff Writer at FutureHuman covering the environment, health, science and tech. Previously @ SFGate, East Bay Express, USA Today, etc.
