GPT-3 Is an Amazing Research Tool. But OpenAI Isn’t Sharing the Code.

Some A.I. experts warn against a lack of transparency in the buzzy new program

Dave Gershgorn
OneZero


[Image: OpenAI’s company logo. Credit: OpenAI]

For years, A.I. research lab OpenAI has been chasing the dream of an algorithm that can write like a human.

Its latest iteration on that concept, a language-generation algorithm called GPT-3, has now been used to generate such convincing fake writing that one blog written by it fooled posters on Hacker News and became popular enough to top the site. (A telling excerpt from the post: “In order to get something done, maybe we need to think less. Seems counter-intuitive, but I believe sometimes our thoughts can get in the way of the creative process.”)

While OpenAI has released its algorithms to the public in the past, it has opted to keep GPT-3 locked away.

OpenAI has been able to achieve such a powerful algorithm because of its access to massive amounts of computing power and data. And the algorithm itself is bigger than any that’s come before it: The largest version of GPT-3 has 175 billion parameters, the adjustable values the algorithm tunes during training to make more precise predictions. GPT-2 had 1.5 billion.
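As a rough illustration (this is not OpenAI’s code, and the layer sizes are made up), a model’s parameter count is just the total number of learned weights it contains. A single dense layer mapping n inputs to m outputs, for instance, holds n×m weights plus m biases, and the reported sizes of GPT-3 and GPT-2 imply a scale-up of more than 100x:

```python
# Hypothetical sketch: counting parameters in a simple dense layer.
# "Parameters" are the learned values a model adjusts during training.
def dense_layer_params(n_in: int, n_out: int) -> int:
    # n_in * n_out weights, plus one bias per output unit
    return n_in * n_out + n_out

# A toy two-layer network, 1024 -> 4096 -> 1024 (sizes chosen for illustration):
toy_total = dense_layer_params(1024, 4096) + dense_layer_params(4096, 1024)

# Scale comparison using the figures reported in the article:
gpt3_params = 175_000_000_000   # largest GPT-3
gpt2_params = 1_500_000_000     # GPT-2
scale_factor = gpt3_params / gpt2_params  # roughly 117x
```

Real language models stack many such layers (plus attention and embedding weights), which is how the totals climb into the billions.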
