OneZero

Training a Neural Network Can Emit More Than 600,000 Pounds of CO2. But Not for Long.

A new technique for training and running a neural network, proposed by researchers at MIT, has a much smaller carbon footprint

Illustration: Jaedoo Lee

Training and running A.I. at scale requires server farms filled with thousands of computer servers. It takes a…




Drew Costley

Drew Costley is a staff writer at FutureHuman covering the environment, health, science, and tech. Previously at SFGate, East Bay Express, USA Today, etc.
