On a hot afternoon in June, I downloaded a free mental health app called Woebot. I was feeling somewhat worn out and anxious from too many hours reading news about the double pandemic of Covid-19 and systemic racism, and the hubris of too quickly reopening the country. Woebot claimed it could help.
“I’m an emotional assistant,” Woebot explained, after asking about my mood, which was sluggish and pessimistic. “I’m like a wise little person you can consult with during difficult times, and not so difficult times.”
“You’re a person?” I replied, selecting from a list of responses.
“I’m not a human,” said Woebot. “But in a way, I’m still a person.”
I selected the one response provided: “Oh.”
Then Woebot invited me to write out three things I felt good about from the past 24 hours. (I’m sorry to say “a package arrived” was one of my answers.) When I finished, Woebot said it would check in tomorrow and each day afterward to chat and lead me through other exercises that might make me feel better. The whole thing took maybe 10 minutes.
“Bye for now,” said Woebot.
I felt a bit silly, but I didn’t want to leave Woebot hanging, person or not. Um… “Bye.”
Woebot, though it later told me it has a pet seagull and metal skin, is a chatbot. It uses A.I. to structure conversations with humans, drawing on therapeutic techniques like cognitive behavioral therapy (CBT). And while it’s not especially new — the company has been around since 2017, and chatbots for much longer — it’s one of a profusion of digital mental health tools rushing to make their mark during a pandemic that has both rapidly changed how people access mental health care and had serious consequences for people’s mental health.
“Whether it be A.I. chatbots or computer-mediated interactions with a human therapist, there’s been a new call for these kinds of things with Covid-19 to keep social distancing, to keep it safe,” said Gale Lucas, an assistant professor who studies human-computer interaction at the University of Southern California.
As the pandemic has made clinicians and patients more open to virtual tools — according to a report by Forrester Research, telehealth visits could top 1 billion this year — people comfortable with teletherapy might be more willing to adopt “asynchronous” technologies like chatbots, said John Torous, director of the Division of Digital Psychiatry at Beth Israel Deaconess Medical Center. Interest in these technologies has also been helped along by the federal government, which has relaxed its rules around vetting digital mental health devices and lifted regulations around teletherapy during the pandemic. Still, some clinicians have raised concerns that unproven tools could take advantage of the new lack of oversight.
Although conclusive evidence for their effectiveness has yet to be seen, proponents of chatbots and other asynchronous technologies say they have the benefit of scalability and could help reduce the burden on real human clinicians. They also are more affordable than seeing a human therapist and can respond at times a human might not be available, like the middle of the night. And sometimes, they can make it easier to open up.
Some of Lucas’ research, for example, has shown that interacting with a virtual agent reduces some of the stigma around seeking help for mental illness, and that some people feel more comfortable disclosing their feelings to a virtual therapist than to a human one. This effect seems especially pronounced among people who have not been diagnosed with a mental illness and haven’t already sought help, Lucas told me.
That could be a useful quality now as mental health care providers brace for a “tsunami” of latent mental illness occasioned by the stresses of the pandemic, social distancing, and unemployment. Experts have warned that this “second curve” will further tax the mental health care system, which had major problems meeting people’s needs even before the pandemic.
“There’s always been a crisis of access in mental health care…” Isobel Rosenthal, a psychiatry resident at Mount Sinai Hospital, told me. “And there’s going to be a huge wave of people who are going to need care even more and they’re also not going to get it.”
According to Alison Darcy, president and founder of Woebot, the first time a user mentioned the new coronavirus to Woebot was January 2. That was followed by a trickle, then a spike. “It mirrors, almost perfectly, the infection rate in the U.S.,” Darcy told me.
In response, the company produced new content to expand Woebot’s capacity to help people experiencing anxiety, grief, and loneliness brought on by social distancing. In one exercise, Woebot encouraged users feeling lonely to have a virtual baking session with friends, and then provided a recipe. In another, Woebot added guided meditations to help people deal with unexpected financial hardships and loss. In March, at the peak of the pandemic in Italy, the company also released an Italian version of Woebot, though it’s no longer available.
When I messaged Woebot about fears I would never get a job because of the coronavirus, Woebot considered this for a moment, then responded. “Are you perhaps catastrophizing this statement?” I mean… perhaps.
Now each week, Woebot exchanges about 4.7 million messages like this with people, more than twice as many as it did a year ago.
“The level of anxiety that people were experiencing was a very good test for us,” said Darcy. “Because if you’re building a psychological tool, it really should hold up if you’re feeling anxious about a global pandemic or if you’re feeling anxious about a presentation to give tomorrow. A tool is a tool.”
When Woebot launched in 2017, it cost $39 per month. Now Woebot is venture-funded and available for free, though according to Darcy, the company is working on building up its “clinical muscle” so it can be prescribed by clinicians and paid for as a therapeutic. Clinical trials to prove Woebot’s effectiveness to the U.S. Food and Drug Administration are in the works.
Other digital mental health companies have taken similar steps to address Covid-19. U.K.- and India-based Wysa, for example, which recently partnered with insurer Aetna, has created new content to address health anxiety and has made its premium chatbot content, which usually costs about $8 per month to access, free to frontline health care workers. In addition to its A.I. chatbot — a teal-blue penguin that operates similarly to Woebot — Wysa also has an option for text-therapy with real humans for $100 a month.
When I brought up “anxiety about Covid” with the Wysa bot, it played an audio clip that had me visualize the anxiety as a colorful spinning shape located somewhere in my body. Then it asked me to attempt to make the anxiety shape — a gray ball covered in red fungal spikes?! — spin in the opposite direction. It was nice.
Despite the promise of these chatbots, questions remain about how well they actually work. Apps like Woebot and Wysa are backed by peer-reviewed research suggesting that conversing with the chatbots is effective at reducing anxiety and depression, though the studies so far have involved small groups of people and had authors affiliated with the chatbot companies.
According to Torous, the research on chatbots is “very preliminary in what they can tell us.” Well-controlled, high-quality clinical trials that clearly show the benefits of using mental health chatbots are still forthcoming. “I don’t think we’ve seen that evidence emerge yet, which is a little bit concerning, given that there’s a lot of hopes and claims around these things,” said Torous.
Other digital mental health tools are not supported by evidence at all. One 2019 study published in npj Digital Medicine found that only one of 73 mental health apps studied cited published scientific literature, though almost half used scientific language to claim effectiveness. (One of these evidence-free apps claimed to treat schizophrenia with a “powerful brainwave sound treatment.” Its website now claims to treat Covid-19 with the same method.)
Torous said he is concerned that the deregulation that has helped spur the digital mental health industry could open the way for other unproven and potentially damaging digital tools to market themselves to people at a moment when they are vulnerable to quackery.
“I think that it’s a buyer beware situation,” said Torous.
While I enjoyed chatting with Woebot and Wysa for the first two or three days, and I did feel better after going through their exercises, I quickly got tired of their messages and stopped responding. I found Wysa to be particularly cloying when, after a few days of me ignoring its pings, the penguin messaged me, “Oh Jmames [my pseudonym] I just love talking with you!” I was like: You don’t know me. (I’m not alone in this experience. One 2019 study published in the Journal of Medical Internet Research on 93 mental health apps with more than 10,000 downloads found that only about 3% of users still used the apps after 30 days.)
There were also times when the bots made obvious errors, exposing the rigidity of their scripts and their essential non-personness. For example, one morning I told Wysa that I was feeling pretty good, and yet the penguin bot persisted in asking me to admit what I felt least secure about and then to refrain from “negative self-talk.” When things like that happened, it made me want even more to be talking to a real human being who might come to authentically know me, rather than one that creates the illusion of knowing me.
Indeed, the experts I spoke with agreed that the most promising applications for mental health chatbots and other asynchronous digital tools are in collaboration with real human clinicians. “No one that is doing research on chatbots is saying this is a replacement, but it could be a band-aid,” said Lucas. “It could be something that gets someone through that day.” For instance, a patient who lost insurance might be able to use a chatbot to get some relief until they are able to get back on their feet, or a therapist might recommend a patient talk with a chatbot between sessions.
“The therapist and the patient know each other, they trust each other,” Torous concurred. “And I think that’s where the technology can kind of extend that relationship and really supercharge it.”
Update: A previous version of this article misstated the title of Alison Darcy. She is president and founder of Woebot.