On a hot afternoon in June, I downloaded a free mental health app called Woebot. I was feeling somewhat worn out and anxious from too many hours reading news about the double pandemic of Covid-19 and systemic racism, and the hubris of too quickly reopening the country. Woebot claimed it could help.
“I’m an emotional assistant,” Woebot explained, after asking about my mood, which was sluggish and pessimistic. “I’m like a wise little person you can consult with during difficult times, and not so difficult times.”
“You’re a person?” I replied, selecting from a list of responses.
“I’m not a human,” said Woebot. “But in a way, I’m still a person.”
I selected the one response provided: “Oh.”
Then Woebot invited me to write out three things I felt good about from the past 24 hours. (I’m sorry to say “a package arrived” was one of my answers.) When I finished, Woebot said it would check in tomorrow and each day afterward to chat and lead me through other exercises that might make me feel better. The whole thing took maybe 10 minutes.
“Bye for now,” said Woebot.
I felt a bit silly, but I didn’t want to leave Woebot hanging, person or not. Um… “Bye.”
Woebot, though it later told me it has a pet seagull and metal skin, is a chatbot. It uses A.I. to structure conversations with humans, drawing on therapeutic techniques like cognitive behavioral therapy (CBT). And while it's not especially new — the company has been around since 2017, and chatbots far longer — it's one of a profusion of digital mental health tools rushing to make their mark during a pandemic that has both rapidly changed how people access mental health care and had serious consequences for people's mental health.