Virtual Therapists Help Veterans Open Up About PTSD

An artificially intelligent therapist named Ellie helps members of the military open up about their mental health.

When US troops return home from a tour of duty, each person finds their own way to resume daily life. But every one of them completes a written survey called the Post-Deployment Health Assessment (PDHA). It’s designed to evaluate service members’ psychiatric health and ferret out symptoms of conditions like depression and post-traumatic stress, so common among veterans.

But the survey, designed to give the military insight into the mental health of its personnel, can wind up distorting it. Thing is, the PDHA isn’t anonymous, and the results go on service members’ records—which can deter them from opening up. Anonymous, paper-based surveys could help, but you can’t establish a good rapport with a series of yes/no exam questions. Veterans need somebody who can help. Somebody who can carry their secrets confidentially, and without judgment. Somebody they can trust.

Or, perhaps, something.

"People are very open to feeling connected to things that aren't people," says Gale Lucas, a psychologist at USC's Institute for Creative Technologies and first author of a new, Darpa-funded study that finds soldiers are more likely to divulge symptoms of PTSD to a virtual interviewer—an artificially intelligent avatar, rendered in 3-D on a television screen—than in existing post-deployment health surveys. The findings, which appear in the latest issue of the journal Frontiers in Robotics and AI, suggest that virtual interviewers could prove to be even better than human therapists at helping soldiers open up about their mental health.

“Most people would assume these things are in conflict with each other—that you can’t have anonymity and rapport at the same time,” Lucas says. But a virtual interviewer can offer both. A few years ago, Lucas and her colleagues paired hundreds of test subjects with Ellie, an embodied AI designed to conduct verbal interviews. Participants sat alone in a room with the virtual therapist, who appeared and communicated via a television screen. Ellie would begin with general questions like “Where are you from?” to build rapport; gradually proceed to more sensitive, clinical queries, like “How easy is it for you to get a good night’s sleep?”; and finish with mood-boosting questions, like “What are you most proud of?”
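For the curious, here’s a rough sketch of how a three-phase script like Ellie’s might be sequenced in code. This is a toy Python illustration, not the USC team’s actual software; the class name and question lists are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class InterviewScript:
    """Three-phase interview: rapport-building, clinical, mood-boosting."""
    rapport: list[str] = field(default_factory=lambda: [
        "Where are you from?",
        "What do you like to do to relax?",
    ])
    clinical: list[str] = field(default_factory=lambda: [
        "How easy is it for you to get a good night's sleep?",
        "Have you noticed any changes in your mood lately?",
    ])
    closing: list[str] = field(default_factory=lambda: [
        "What are you most proud of?",
    ])

    def questions(self):
        # Rapport first, sensitive material in the middle,
        # and a positive note to end on.
        yield from self.rapport
        yield from self.clinical
        yield from self.closing

if __name__ == "__main__":
    for q in InterviewScript().questions():
        print(q)
```

The ordering is the point: easing in before asking hard questions, and never ending an interview on a clinical low.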

But Ellie is no brainless bot. Unlike, say, Eliza, the 1960s computer program designed to respond to users with non-directive questions, Ellie uses machine vision and voice analysis to interpret test subjects’ verbal and facial cues and respond supportively. For example, Ellie not only knows how to perform sympathetic gestures, like nodding, smiling, or quietly uttering “mhm” when listening to a sensitive story—she knows when to perform them. Psychologists call these kinds of sounds and gestures backchannels. When interspersed appropriately throughout an interaction, they can help build rapport and elicit sharing.
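To make the timing idea concrete: a crude, rule-based trigger might fire a backchannel when the speaker pauses after a sustained stretch of talk. The Python sketch below is purely hypothetical (the thresholds, function name, and speech-flag input are all assumptions; a real system like Ellie’s draws on much richer prosodic and visual cues):

```python
import random

def backchannel_cues(frames, pause_after=1.0, min_speech=2.0):
    """Yield (time, gesture) backchannel events from per-frame speech flags.

    frames: iterable of (timestamp_sec, is_speaking) pairs.
    Fires one backchannel when a pause of `pause_after` seconds
    follows at least `min_speech` seconds of continuous speech --
    a crude stand-in for the cues a real system would track.
    """
    speech_start = None  # when the current stretch of speech began
    pause_start = None   # when the current pause began
    fired = False        # only one backchannel per pause
    for t, speaking in frames:
        if speaking:
            if speech_start is None:
                speech_start = t
            pause_start, fired = None, False
        else:
            if speech_start is not None and pause_start is None:
                pause_start = t
            if (pause_start is not None and not fired
                    and pause_start - speech_start >= min_speech
                    and t - pause_start >= pause_after):
                yield t, random.choice(["nod", "smile", "mhm"])
                fired = True
                speech_start = None

# Example: 10 Hz frames -- 3 seconds of speech, then a pause.
frames = [(i / 10, i < 30) for i in range(60)]
print(list(backchannel_cues(frames)))  # one cue, ~1 s into the pause
```

Even this toy version captures the core constraint: a backchannel delivered at the wrong moment reads as interruption, not empathy.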

Ellie's capacity for subtle and supportive engagement reveals fascinating things about humans, and how we choose to guard our secrets. Lucas and her colleagues told half their test subjects they’d be interacting anonymously with a virtual therapist. The other half were deceived into thinking there was a person pulling Ellie's strings. In the end, the participants who thought they were talking with the virtual therapist alone were significantly more likely to open up. For civilians, at least, just removing the idea of human presence led to more fruitful clinical sessions.

To see if Ellie could help soldiers reveal their PTSD symptoms, Lucas and her colleagues recruited soldiers recently returned from Afghanistan. As in the previous study, Ellie began each interview with rapport-building questions and ended with positive, mood-boosting ones. But this time, Ellie’s clinical questions were geared toward symptoms of PTSD, specifically. Questions like:

Can you tell me about any bad dreams you’ve had about your experiences, or times when thoughts or memories just keep going through your head when you wish they wouldn’t?

In the end, test subjects reported significantly more PTSD symptoms in their interviews with Ellie than they did on their official PDHA surveys. But the service members also divulged more to Ellie than they did on an anonymized version of the PDHA. That suggests a system like Ellie could provide a real service to members of the military. "Getting people to admit they have symptoms is an important step in helping them realize they’re at risk—and getting them treatment," Lucas says. "With a virtual interviewer, you don’t have to ruin your career to begin seeking help."


When I ask experts unaffiliated with Lucas' study whether virtual therapists have a role to play in the future of clinical psychology, their answers are unambiguous: "Certainly, absolutely," says Lynn Bufka, who oversees research efforts related to psychological practice and policy at the American Psychological Association. "My concern is: OK, we have the technology for a virtual interviewer. They're identifying people who have distress. Now that we've identified their symptoms, how do we ensure they get the treatment they need?"

The US government already struggles to deliver psychological aid to its veterans. According to a study conducted by the RAND Center for Military Health Policy Research, fewer than half of returning veterans requiring mental health services receive treatment. The system is already backlogged; virtual therapists like Ellie could wind up adding to the pileup.

But they could also relieve some of the burden. "If virtual interviewers can consistently perform as well or better than a human interviewer, it might help with our treatment capacity down the road," Bufka says. "Because now, human therapists who might have spent a lot of their time on the assessment side can spend more time on the intervention and treatment side."

Lucas says that's exactly what she and her team have in mind. "We don’t want to replace therapists," she says. "We want to get more therapists to the people who need them." Already she and her colleagues are conducting experiments that compare Ellie against human interviewers, to pick apart which aspects of anonymity and rapport-building are most effective at eliciting openness from interview subjects.

One of the more fascinating studies will compare Ellie's performance to that of a human therapist who, like a priest hearing confession, cannot see the test subject. The confessional model would appear to resolve the same anonymity/rapport paradox as Ellie—and to a certain extent it does. But a blind interviewer can't read gestures and facial expressions the way Ellie can, which limits their ability to pick up on emotional cues and leverage those observations to greater therapeutic effect.

This, says Lucas, is Ellie's most promising feature. "Again and again, I'm seeing the power of the virtual agents to tease out information that traditional methods just don't, and that power seems to stem from the fact that it's just a computer," she says. That's the thing about AI therapists like Ellie: They can help you, but they can't judge you.