An A.I. Wrote This Story on Life in the Time of Coronavirus. It’s Eerie.
OneZero, as you may have seen in our publication’s tagline, is all about the undercurrents of the future — the technological and scientific forces that are carrying us into tomorrow. Artificial intelligence, of course, is a major part of that future: It shapes how we communicate, how businesses operate, how Instagram markets sketchy discount palazzo pants in your feed…
So, we decided to see if an A.I. could fill in as a OneZero columnist.
At the suggestion of OneZero senior writer Dave Gershgorn, we’ve been testing weekly short fiction pieces scribed by GPT-2, a text-generating artificial intelligence algorithm originally built by OpenAI. We give the algorithm a sentence from a real OneZero article (you’ll see them below in bold), and the algorithm iteratively generates what it thinks the next word should be. GPT-2 learned which words often follow other words by analyzing a dataset of 8 million webpages. We’re calling this project GPT-210. (Get it?)
To help the A.I. generate longer stories, sometimes we’ll insert the last full sentence it wrote back into the GPT-2 program and stitch the two parts together, but each word is algorithmically generated. Occasionally, we also snip out junk words — because the program is trained on news websites, sometimes it’ll spit out “ADVERTISEMENT ADVERTISEMENT ADVERTISEMENT” in the middle of the text, for example. (Hey, every writer needs an editor.)
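For readers curious about the mechanics, the loop described above can be sketched in a few lines. This is a toy illustration only: a tiny hand-built bigram table (our own invention, not OpenAI's model or API) stands in for GPT-2, which scores every word in a 50,000-plus-word vocabulary at each step instead of picking from a short list.

```python
# Toy sketch of the iterative next-word loop described above.
# A hand-built bigram table stands in for GPT-2's learned
# probabilities; the real model considers its entire vocabulary
# at every step.
import random

BIGRAMS = {
    "the": ["pavement", "algorithm", "future"],
    "pavement": ["is"],
    "is": ["lined"],
    "lined": ["with"],
    "with": ["cars"],
    "algorithm": ["generates"],
    "generates": ["the"],
}

def generate(prompt, n_words, seed=0):
    """Starting from a human-written prompt, repeatedly append
    whichever word the (toy) model predicts should come next."""
    rng = random.Random(seed)
    words = prompt.lower().split()
    for _ in range(n_words):
        candidates = BIGRAMS.get(words[-1])
        if not candidates:  # dead end: no known continuation
            break
        words.append(rng.choice(candidates))
    return " ".join(words)

print(generate("The pavement", 4))
```

The "stitching" trick works the same way: take the last sentence the loop produced, treat it as the new prompt, and run the loop again.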
These have appeared exclusively on our home page, and we wanted to give new readers a chance to enjoy entries they missed. Here are the first six GPT-210 entries. We’re fascinated by what the A.I. was able to come up with, and we bet you will be, too.
(Note that in some cases, GPT-2 scraped some proper nouns from the public internet. One of the stories also describes a scene involving gun violence — we’ve placed it at the end of the series in case you’d rather not read it, although it’s not graphic. To reiterate, the stories are completely fictional.)
Life in the Time of Coronavirus
The pavement is lined with cars, and for some reason I feel as though I’m alone. These past few months, since the election, have been unending. My office sits in one of Trump’s most distressed locales: close enough to Manhattan, but far enough away to feel like an alien in my own country. I can’t find my kids on any social media or in any of my apps. Their phone alarms go off for imaginary visitors, and the playground is scruffy and the sidewalks are dirty. The Trump portrait on my wall, borrowed from a neighbor, looks shabby.
(Farther down the street, on the other side of the house, there is a huge painting of Obama looking at a phone.)
Along the community’s main artery, dozens of houses show signs of front-yard repairs: The curb has been cut in half and the basketball hoop had to be shifted to the other side. There are no flowers in sight.
Nostalgia for the iPod
Apple may have discontinued the last of the click-wheel iPods years ago, but Pichi is part of a growing community of tinkerers giving the devices new life. Instead of flipping through e-books or playing music on them, these people have taken control of the appliance and built their own OS called “BlurayBorg.”
Some have succeeded, while others have proven to be less than productive. The app uses a modified Android ROM called Tizen, and gives the iPad’s touch screen a leg-up on iOS with the ability to control the device’s webcams and GPS by simply tapping on them. And if that sounds like the opposite of what you would expect from a click-wheel computer, well, it sort of is.
The former is regarded as an archaic, flaky relic; the latter is slowly making a comeback as a semi-functional replacement.
‘These Algorithms Aren’t Magic’
“Irving wants to be the birthplace of facial recognition,” Marye Durrant, the city manager, told the Star Tribune. “When we sign this contract, we will be the first city to do it.”
But it turns out that at least one city in Colorado has already started using it.
Reno Police launched a pilot program in February. According to KOLO, a local television station, officers in the city’s narcotics section can now use a face-recognition software system to check drivers’ licenses against driver-record databases to arrest people they suspect of driving while the influence of drugs.
Police departments around the country are looking to replace old-school paperwork with algorithms to speed up investigation and prosecution of drug offenses, according to that Dallas Morning News article.
“These algorithms aren’t magic. They’re statistically based,” Anthony Murdock, an associate professor at the University of Pennsylvania’s School of Engineering and Applied Science, told the paper. “But these algorithms work very well when you’re looking at very old data, and that’s a lot of our criminal justice system.”
An A.I. Writes About A.I.
Unlike most facial recognition tools used by law enforcement, Clearview AI scrapes a vast amount of data from the open web, gathering photos for individuals across racial, socioeconomic, and geographic spectrums who are not usually targeted by law enforcement dragnets.
The A.I. also collects data on the identities and motives of individuals who are not flagged by law enforcement.
By identifying a randomly sampled group of people, Clearview AI has the potential to rapidly compile a substantial dataset of complete or partial “personality profiles.” By downloading a database of data, Clearview AI can quickly and accurately identify someone from a photo of their face, or that they wear a certain sweater, just from the raw photo, and it takes less than a second. It’s fast enough to distinguish whether a particular individual is wearing
In many countries, law enforcement would be prohibited from collecting photos of individuals in public without a warrant.
For its part, the RCMP has confirmed that facial recognition has been used in Canada, but insists that’s all it is. “This is strictly an automated component of Clearview,” says Corporal Paul Harkins, the chain of command for Toronto Police. “It does not engage in any non-revenue generating activity.”
E-waste is a global problem. A global problem, I mean, but a pretty global one. Nobody feels really uncomfortable talking about waste globally. People tend to put their finger on a particular issue when they think, “Well, that’s the one issue that is most important to me.” And we tend to talk a lot about issues that are global and immediate.
And it’s, well, it’s true that something needs to be done, but if you don’t make a concerted effort to clean up after yourselves, as a community, you’re going to run out of the resources to do it. Some people have trouble grasping that. I really don’t know why that is.
A Nightmare on Video
The video opens on a young girl with shiny hair and a big smile sitting at a table with a box in front of her. She has nothing and as she reaches for the package, an old man appears, stands on one foot, and then smiles as if to say “let me help you.” The old man walks to the girl, opens the box, and shows her what he’s brought. It is a box, with yellow tin foil wrapped around it. She opens it up, and inside she finds a couple of things. On top of the box is a big new toy, one she identifies as a “dinosaur.”
She shakes her head as she sits in silence.
When she opens it, the girl inside pulls a lamp from the box and lights it. Then she opens a mirror, and lets the light flood through it. A boy comes into the room and points a gun at her.
The guy shoots her and then runs out. The girl turns to the mirror and the metal bed of the mirror appears to be moved. She turns the other way and is surprised to see her family members coming to the door.
When she sees her parents, she runs to them. They ask where they’ve been.
You can read more about how computers learn human language here.