The Future of A.I. Isn’t Quite Human

On the red carpet or in the esports arena, the next generation of celebrities could be artificial

Illustration: Blake Kathryn

Nothing about Shudu is real, exactly. She doesn’t have a physical form, and until now, she has existed only on our screens. She lives, so to speak, mostly on Instagram, where she models clothes virtually tailored to her computer-generated body.

But last month, Shudu stepped beyond our phones and onto the red carpet at the British Academy of Film and Television Arts Awards (BAFTAs). Unfazed by flashing cameras and rendered in holographic form, Shudu, wearing a glittering dress designed by Swarovski, posed with celebrities like Regina King and Salma Hayek.

Shudu is a kind of idealized representation of a woman working in the fashion industry, a calculated fabrication by her creator, photographer Cameron James-Wilson. (Technically, Shudu doesn’t have a gender. She’s the image of a black woman created by a white man, a visual media product with about as much of a gender identity as a glossy magazine page. Still, we’re using female pronouns in this story for clarity.) At the BAFTAs, Shudu became the visual representation of an A.I. program that parsed celebrities’ clothes and scanned a database to find affordable alternatives. Folks at home could then interact with a Shudu chatbot online to talk fashion.

At first glance, an A.I. brought to life on the red carpet may feel jarring. But A.I. is already operating in many aspects of our lives: It controls your Facebook news feed, it helps make your salad, and it opposes you in video games. And while a fleet of Protoss carriers gliding across a choke point in StarCraft II may appear less “real” than Shudu in her gown on an actual red carpet — or the virtual avatars created by Facebook and spotlighted in a Wired feature last week — the cutting-edge work happening behind the scenes in these virtual worlds may say quite a bit more about an emerging universe of the almost-human, where the line between person and machine blurs.

After all, Google wouldn’t spend upwards of $500 million on nothing. Its DeepMind subsidiary uses advanced machine learning to mimic and surpass human play styles in games, but that’s nothing compared to what’s coming.

“This is not going away,” Morgan Young, the CEO and co-founder of Quantum Capture who worked on Shudu’s BAFTAs project, tells OneZero. “This is just the beginning of how powerful characters can be when they’re combined with A.I.”

Up to this point, there’s always been a human element behind these artificial personalities. A human team posts to Shudu’s Instagram account, and there were limits to her interactivity at the BAFTAs. The same goes for other digital characters, like Hatsune Miku, a Japanese “vocaloid” pop star that has sold out real-life arenas, even in the United States.

That’s because it’s not easy to give a robot or A.I. a genuine, humanlike personality. Take the robot Sophia from Hanson Robotics. She’s programmed to acknowledge, understand, and interact with humans. But despite Hanson’s compelling marketing — Sophia was even granted Saudi citizenship — the robot’s conversation is stilted and feels programmed. Try to watch her interview on Today without cringing in secondhand embarrassment.

Meanwhile, DeepMind is making advances that are convincing enough to remind onlookers of human ways of thinking. One of the company’s top A.I. programs, AlphaStar, has defeated professional StarCraft II players at their own game. Sure, A.I. has played — and beaten — people at other competitive games before, but StarCraft II is a different beast. It may be a video game, but its complex systems require players to manage both short- and long-term goals while demonstrating the pure mechanical skill necessary to outplay an opponent. To play the game well requires overseeing entire armies, balancing resources, and managing multiple battlefronts at the same time. DeepMind has said that StarCraft II represents a “grand challenge” in A.I. research, which is why the company took it on.

Gaming itself is a multibillion-dollar industry with its own class of influencers, like the Overwatch League’s pro gamers or Twitch streamer Tyler “Ninja” Blevins. And AlphaStar’s adaptive learning technique displayed its own convincing gameplay style — not quite human, but not clearly artificial either. While Shudu may never pass as a living, breathing person, an A.I. on the other side of a competitive video game could be a different story: If it plays like a human and presents like a human, a computer-generated esports influencer may just be the next big thing.

AlphaStar learned how to play StarCraft II using a deep neural network trained with both supervised and reinforcement learning. The supervised stage taught the network to imitate human play using StarCraft II replays previously released by the game’s developer, Blizzard Entertainment. But that got AlphaStar only so far — the resulting agent could beat StarCraft II’s built-in “Elite” A.I., making it roughly the equal of a gold-level human player. (Gold is where the majority of StarCraft II players rank, but it’s pretty low on the overall ladder: The highest rank is grandmaster, four tiers higher than gold, which is itself preceded, naturally, by silver and bronze.)

DeepMind then used these supervised agents to seed a new learning process that simulated how a human player gets better at StarCraft II — by playing on the ladder. Of course, AlphaStar’s ladder was populated not by humans but by earlier versions of the program, called “agents,” that built on one another’s progress in a training league. Each agent racked up the equivalent of up to 200 years of “real-time StarCraft play,” according to DeepMind.
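
DeepMind’s full system is far more elaborate, but the basic recipe the company describes — imitate human replays first, then improve against a growing pool of frozen past versions — can be sketched in a few lines of code. The toy below is a minimal, hypothetical illustration: it swaps StarCraft II for a rock-paper-scissors-style game of build orders, and every name in it (Agent, play, the “rush,” “expand,” and “tech” strategies) is invented for this sketch rather than taken from DeepMind’s actual code.

```python
# Toy, hypothetical sketch of the two-stage recipe described above:
# (1) imitate "human replays," then (2) improve by playing a league of
# frozen earlier versions of the agent. Illustrative only; it bears no
# resemblance to DeepMind's real AlphaStar implementation.
import random
from collections import Counter

STRATEGIES = ["rush", "expand", "tech"]                       # stand-ins for build orders
BEATS = {"rush": "tech", "tech": "expand", "expand": "rush"}  # rush beats tech, and so on

def play(a, b):
    """Return 1 if strategy a beats b, -1 if it loses, 0 on a mirror match."""
    if a == b:
        return 0
    return 1 if BEATS[a] == b else -1

class Agent:
    """A policy here is just a probability of picking each strategy."""
    def __init__(self, probs):
        self.probs = dict(probs)

    def act(self):
        r, cumulative = random.random(), 0.0
        for strategy, p in self.probs.items():
            cumulative += p
            if r <= cumulative:
                return strategy
        return STRATEGIES[-1]

# Stage 1: supervised imitation of "human replays" (here, fake frequency data).
fake_replays = ["rush"] * 50 + ["expand"] * 30 + ["tech"] * 20
counts = Counter(fake_replays)
imitator = Agent({s: counts[s] / len(fake_replays) for s in STRATEGIES})

# Stage 2: a league seeded with the imitator; each generation plays against
# every frozen snapshot so far, then freezes a new snapshot into the league.
league = [imitator]
current = Agent(imitator.probs)

for generation in range(5):
    # Score each pure strategy against the whole league, then nudge the current
    # agent's policy toward whichever strategy scored best (a crude update).
    scores = {s: sum(play(s, opponent.act()) for opponent in league for _ in range(200))
              for s in STRATEGIES}
    best = max(scores, key=scores.get)
    current.probs = {s: 0.9 * current.probs[s] + (0.1 if s == best else 0.0)
                     for s in STRATEGIES}
    league.append(Agent(current.probs))   # freeze a snapshot into the league

print("final policy:", {s: round(p, 2) for s, p in current.probs.items()})
```

The point of the frozen league, in both this toy and DeepMind’s description of its system, is to keep the newest agent from forgetting how to beat the strategies its predecessors used.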

It’s a level of training that no human player could even dream of acquiring, which makes AlphaStar particularly daunting. In January, the program beat two professional players — Grzegorz “MaNa” Komincz and Dario “TLO” Wünsch — in a series of 10 show matches, using a version of the game that allowed the program to “see” more of the map than a human could.

Typically in StarCraft II, players have to move a camera (and their attention) to various parts of the map; you can’t see everything in the game at once. AlphaStar played on what DeepMind called a “zoomed out” version of the map, so its vision wasn’t restricted to what was on screen at any given moment. MaNa did, however, beat a later version of the program that was forced to play as a human does, moving a camera around the map.

Technically, the goal for AlphaStar wasn’t for the program to play like a human. Though AlphaStar recorded lower actions per minute than human players, its mechanical skill stunned commentators who broadcast the match as it “microed” units — controlling pieces of a StarCraft II army individually rather than as a group — beyond human capability.

In other words, AlphaStar isn’t necessarily trying to pass as a human, just beat one.

“Systems such as DeepMind’s AlphaStar aren’t designed to mimic a human expert’s strategy or mental model,” Ayanna Howard, an A.I. and robotics expert at Georgia Tech’s School of Interactive Computing, tells OneZero. “Rather, it discovers alternative strategies based on its own abilities, which are designed around its massive processing capabilities and speed of computations.”

It’s a stunning achievement on its own, but it represents something much larger than an A.I. merely being really good at a video game. AlphaStar demonstrates an A.I.’s ability to adapt and manage goals — two activities that are also pretty important in the game called life. In a match of StarCraft II, AlphaStar presents as a human player well enough; both MaNa and TLO commented that the program had a “very human” style of play, just with superhuman mechanics. In that regard, it’s already passing as a human.

“[It’s] the ability to adapt to unexpected situations in real time, to learn from imperfect information, to balance short-term wins against longer-term higher payoffs, to name just a few of the skills necessary to win in games such as StarCraft II,” Howard says.

Those abilities have applications beyond just playing StarCraft really, really well — which would explain Google’s investment. “Practical applications that require these skills would be, for example, in learning conversational strategies across different cultures for a customer service chatbot or learning how to grasp new household objects for a home-care robot,” Howard says.

As demonstrated by AlphaStar, passing as a human doesn’t require a physical form. You don’t need a robot that looks like Sophia or a model like Shudu. According to Howard, “We, as humans, can develop relationships with others through texting, talking, and virtual communications without ever having ‘seen’ the other human’s face.”

But despite AlphaStar’s convincing game against MaNa, the A.I. left out one of the smallest but most inherently human qualities of the game: “GG.” That acronym stands for “good game,” but it’s more than just a compliment to your opponent. A “GG” message signifies that the game is over; once a match is past the point of no return, the losing player concedes by typing those letters and leaving the game.

AlphaStar can’t do this. (Not yet, anyway.) To put it in dramatic terms, the program can’t self-destruct. Instead, MaNa, who beat a version of AlphaStar just once in the January 24 show match, had to complete the tedious task of destroying every building on the map to earn the win. AlphaStar fought until it could fight no more.

That’s what StarCraft II players would call “BM,” or bad mannered, because it drags the game out pointlessly. Of course, AlphaStar doesn’t know it’s BM — it’s just playing the game the only way it knows how. Tweaking this kind of thing will be imperative if A.I. programs like AlphaStar are to move past simple programming and into the realm of digital personality.

It’s the kind of work Young is doing at Quantum Capture. A former Ubisoft game developer, Young started the company to focus on creating high-quality interactive digital characters using A.I. Its characters span industries including health and medicine, hospitality, and fashion retail.

Of course, there’s also the entertainment aspect. Gaming influencers like Ninja or Felix “PewDiePie” Kjellberg rival traditional celebrity reach; you can see why a company might want a digital influencer of its own — and that it owns — to represent a game.

Young can foresee a world where we celebrate and play against digital esports players, essentially advanced versions of current A.I. Maybe it’s something that happens in virtual reality, giving players the chance to interact with a digital esports pro — aspirational but unattainable, a superstar whose limits are beyond those of any human. A digital esports influencer like Shudu could go beyond Instagram. Whereas Shudu met fans on the red carpet, an esports influencer meets gamers where they are: in video games.

“The ability to interact with people in VR is really a realistic, powerful thing,” Young says. “Having a deeper understanding of how people interact is something DeepMind has excelled at, and those interactions are more believable when they’re experiencing it with CGI, for instance.”

Esports fans can already meet up in virtual spaces. On Sansar, a social VR platform, esports organization Fnatic has its Meta BUNKR, a digital hangout where fans can watch streamed matches and play games. Dramatic lighting makes it feel like a nightclub, with neon logos marking the space. It’s not hard to imagine a virtual reality esports player using a space like this to meet its own dedicated fan base.

A digital esports player may not feel weird or foreign in just a few years. Young says digital characters could become like companions. Like Siri, but with a face — and hopefully with much better A.I.

“These intelligent digital avatars are going to be more in tune with life than ever before,” Young says. “We’re going to start relying on them the same way everyone relies on Google for their email and traffic [alerts].”

At the very least, they could teach us a thing or two about gaming.
