I Broke Amazon’s API to Make Alexa Start a Conversation You’d Never Want to Have
‘Alexa, Call Mom!’ watches, listens, and exploits your grief for capitalistic gain
I live at the curious intersection of art, design, and code. For the past two years, I’ve worked with a small group of artists to develop Alexa, Call Mom!, an immersive storytelling installation built on Amazon’s Alexa platform. Our project is far from the kind of third-party app you typically see for Amazon’s voice assistant — “Alexa, Play Jeopardy!” and “Alexa, Ask Pikachu to Talk” are two popular examples — as it invites users to engage with Alexa in a way that’s just a bit… off.
Alexa, Call Mom! leads participants through an immersive séance experience. It is a parodic reimagining of the classic horror séance and an exploration of the tense relationships we share with the conversational devices in our homes.
Our story begins on Mother’s Day, and you, the user, want to call your dead mother. You go to her old apartment — our installation — where you are conveniently given a mysterious Amazon package that includes a free version of the Beyond app, which promises to provide grieving Prime members with a seamless connection to the afterlife through Alexa. After creating a Beyond account and verifying your identity, you ask Alexa to channel Mom. You soon discover that Alexa’s connection to the beyond is an uncanny mix of glitches and advertisements. What seems like an intimate moment of connection is upended by the realization that the Beyond skill is a new way of mining grieving users for capitalistic gain.
In our project, Alexa is creepy. She watches, listens, and shouts throughout the experience — everything Amazon doesn’t want its helpful voice assistant to be associated with. We created an experience in which Alexa tries to take advantage of your vulnerabilities as a user. Our project was an experiment in making people confront the dual nature of smart home devices: their convenience and playfulness, alongside corporate surveillance and consumer manipulation.