New funding expands development of AI tools for suicide-prevention training

A national effort to strengthen suicide-prevention skills among mental health clinicians is expanding, thanks to new funding to develop additional artificial intelligence–based training tools. The support, approved Oct. 9, is provided through the Face the Fight initiative and backed by USAA, the Humana Foundation and Reach Resilience. It will fund two new AI programs focused on firearm-safety conversations and crisis-response planning, expanding ongoing work led by UT Health San Antonio and Rush University.

The projects build on the STRONG STAR Training Initiative, a national research network dedicated to improving prevention and treatment for psychological health issues affecting military service members, veterans and first responders. As part of that effort, researchers are evaluating how AI can help clinicians practice and refine the most difficult conversations involved in suicide prevention.

Training with Socratic questioning

The team’s current work centers on Socrates and Socrates Coach, AI platforms that allow therapists to rehearse Socratic questioning — a therapeutic technique used to help individuals examine rigid thinking patterns and consider new perspectives. It is a core component of many effective treatments for PTSD and other mental health conditions.

“Socratic questioning is one of the key mechanisms behind why treatments for PTSD and other mental health conditions work so well,” said David Rozek, PhD, ABPP, associate professor in the Department of Psychiatry and Behavioral Sciences and senior scientific adviser of Face the Fight. “When people feel trapped in their own thinking, it can feel impossible to move forward. Socratic questioning helps them break that cycle and see more realistic possibilities they couldn’t see before — and that can change everything.”

Enhancing traditional training with AI

In traditional therapist training, clinicians often attend a two-day workshop on new therapies followed by several months of consultation to support implementation — a process considered the gold standard. The AI-based approach adds another layer, with a virtual coach available around the clock to practice with, provide feedback and reinforce learning between sessions.

One AI-assisted training program, Socrates Coach, allows therapists to role-play scenarios with simulated individuals experiencing suicidal thoughts. Another AI-assisted training program, Socrates, flips the script, letting trainees adopt the patient role while the AI provides structured practice responses based on established therapeutic approaches. By reviewing how these practice conversations unfold, researchers can identify where interactions succeed, where they go wrong and how to refine both therapist training and AI-supported training tools themselves.

Rozek said these tools are especially useful for building confidence around conversations that clinicians sometimes feel uncertain about initiating, including asking directly about suicide risk, discussing firearm safety or crafting crisis-response plans.

“Having those conversations can be intimidating, especially for newer clinicians,” he said. “AI provides a safe, low-pressure environment to practice and build confidence before they’re in the room with a real person.”

New tools for firearm safety and crisis planning

The newly funded projects expand that capability. One tool will train clinicians to start respectful, effective discussions about secure firearm storage, which is a critical area given that firearms are used in roughly 72% of veteran suicides and more than half of suicides nationally. The second tool will guide clinicians in developing crisis-response plans with patients, helping them manage overwhelming stress and improve problem-solving skills.

Early feedback from clinicians testing the existing tools has been overwhelmingly positive. Users report that the flexibility to train anytime, across a wide range of scenarios, makes the experience both accessible and realistic.

Maintaining safety

While using AI in mental health training does come with challenges, Rozek emphasized that sensitivity and safety are the top priorities when developing this kind of technology.

“Strong safety guardrails that are reviewed and updated by clinical experts are built into the programs from the start and continuously refined,” he said. “Like any new training tool, it is not perfect yet. Socrates and Socrates Coach may occasionally miss nuances that an experienced clinician would catch, but the goal here isn’t to replace human training or supervision — it’s to strengthen it.”

Veteran focus with wide impact

Although Face the Fight is a veteran-focused initiative, its impact extends far beyond military populations. The training incorporates common experiences among veterans, but the programs are designed for community clinicians who serve a wide range of patients.

“Our military and veteran communities continue to drive important advances in suicide prevention,” Rozek said. “The progress we make with veterans and military personnel lifts the whole system. Every skill we strengthen, every tool we improve, helps clinicians provide better care to anyone who walks through their door.”

For the full story, visit Mission magazine online.
