OpenAI Wants to Move Slow and Not Break Anything

The Sam Altman-led company is gradually rolling out more and more powerful versions of its A.I. to study how it’s misused

Dave Gershgorn
Published in OneZero
Aug 21, 2019 · 4 min read


Credit: OpenAI

In February, artificial intelligence research firm OpenAI announced that it had created four versions of an algorithm that could generate text that was comprehensible to humans.

The A.I. analyzed 8 million examples of news articles and other writing on websites, and not only figured out the rules of the English language but also learned how to string together words and sentences in a way that gives them meaning.

The resulting algorithm doesn’t take poetic license, of course. Each generated word is just a mathematical approximation of whatever the program thinks should come next based on every other word it’s seen. (For example, type “it was the best of times” and it will nearly always respond with “it was the worst of times.”) But the sentences make sense. The algorithm has analyzed the essence of human writing and recreated the underlying connective tissue.
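That next-word loop is easy to see in code. Below is a minimal sketch, assuming Hugging Face's open-source transformers library, whose "gpt2" identifier points at the small released checkpoint; it decodes greedily, taking the single most likely word at each step, so it illustrates the mechanics rather than reproducing OpenAI's exact setup:

```python
# A sketch of next-word prediction with the released GPT-2 checkpoint,
# using Hugging Face's transformers library (not OpenAI's own tooling).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # the small released model
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "it was the best of times"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    for _ in range(10):                      # extend the prompt by ten tokens
        logits = model(input_ids)[0]         # a score for every word in the vocabulary
        next_id = logits[0, -1].argmax()     # greedily take the most likely next token
        input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(input_ids[0]))
```

Greedy decoding always takes the top-scoring word; swapping the argmax for sampling is what gives public demos their more varied text, but either way each word is just that mathematical approximation of what should come next.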

OpenAI chose to release only the smallest and least powerful version of the algorithm following its announcement. That version has 124 million parameters, the moving pieces in the artificial neural network that generates the text. The more parameters, the more relationships the algorithm can gauge between words. OpenAI said that the potential for misuse was too high to just drop its highest-performing algorithm on GitHub. A few months passed without a catastrophic incident, so in May, OpenAI released the medium version of GPT-2, with 355 million parameters.
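For a concrete sense of those sizes, the parameter count of a released checkpoint can be tallied directly. A short sketch, again assuming Hugging Face's transformers library, where "gpt2" and "gpt2-medium" are its names for the small and medium releases:

```python
# Counting the trainable parameters of the released GPT-2 checkpoints.
from transformers import GPT2LMHeadModel

for name in ("gpt2", "gpt2-medium"):  # Hugging Face's small and medium identifiers
    model = GPT2LMHeadModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")
```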

And on Tuesday, OpenAI released the third, and second-largest, of the four versions of the A.I. The artificial neural network has 774 million parameters, more than six times as many as the first version the company released. This means that researchers using the algorithm to build generative text-suggestion apps, chatbots, or even auto-complete for code now have a better model to work with.

The community around this technology moves incredibly fast: A website that made it easy to generate text with the GPT-2 algorithm has already been updated to the 774 million parameter model.

If you make something that can be used as a…


Dave Gershgorn

Senior Writer at OneZero covering surveillance, facial recognition, DIY tech, and artificial intelligence. Previously: Qz, PopSci, and NYTimes.