Although the Department of Veterans Affairs has adopted some artificial intelligence capabilities to better identify veterans at risk of self-harm, VA officials said these technologies represent only one part of their suicide prevention strategy and are not designed to replace human interventions.
VA’s 2024 AI use case inventory included 227 examples of the emerging capabilities being used or implemented across its operations, with these applications ranging from AI-enabled devices to an on-network generative chatbot for department personnel. Four of these use cases were also focused, in whole or large part, on identifying and assisting veterans found to be at a heightened risk of self-harm.
Suicide prevention has been a major priority within VA for more than two decades, with the department working over that period to significantly enhance the care and services it provides to at-risk veterans.
But veteran suicide statistics have remained alarmingly high; over 140,000 veterans have taken their lives since 2001, with VA estimating that 6,407 died by suicide in 2022 alone. Some organizations have also found these reported figures to be a drastic undercount of the total number of veteran suicides.
One of VA’s AI-powered efforts to better reach veterans at high risk of self-harm — the Recovery Engagement and Coordination for Health-Veteran Enhanced Treatment, or REACH VET, program — initially launched in 2017 and scans the department’s electronic health records to identify retired servicemembers in the top 0.1% tier of suicide risk.
The model uses machine learning — which is a subset of AI that analyzes data to locate patterns and make decisions or predictions — to identify specific variables across veterans’ records that have been linked to a heightened suicide risk.
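The article does not detail REACH VET's variables or model internals, but the general pattern officials describe, scoring health records and surfacing a small top-risk tier for human follow-up, can be sketched roughly as follows. The feature names, placeholder labels, logistic-regression choice, and 0.1% cutoff below are illustrative assumptions, not the VA's actual implementation.

```python
# Minimal sketch of a records-based risk model that flags a top-risk tier.
# The features, labels, model, and threshold are illustrative assumptions,
# not the actual REACH VET variables or algorithm.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical EHR-derived binary features (e.g., prior-diagnosis or utilization flags).
n_patients = 100_000
X = rng.integers(0, 2, size=(n_patients, 5)).astype(float)
y = rng.integers(0, 2, size=n_patients)  # placeholder outcome labels for the sketch

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score every record, then flag the top 0.1% by predicted risk for human review.
risk_scores = model.predict_proba(X)[:, 1]
cutoff = np.quantile(risk_scores, 0.999)
flagged_ids = np.where(risk_scores >= cutoff)[0]
print(f"Flagged {flagged_ids.size} of {n_patients} records for clinician review")
```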
During a House Veterans’ Affairs Technology Modernization Subcommittee hearing on Monday, Charles Worthington — VA’s chief technology officer and chief AI officer — told lawmakers that REACH VET “has used AI algorithms to identify over 130,000 veterans at elevated risk, improving outpatient care and reducing suicide attempts.”
VA officials also confirmed that the department has launched a 2.0 model of REACH VET, which includes new risk factors such as military sexual trauma and intimate partner violence.
Evan Carey, acting director of VA’s National Artificial Intelligence Institute, told lawmakers that the updated model recently went into effect “to ensure it has ongoing high performance of identification of veterans at the highest risk quartiles.”
The launch of the revised version also came after The Fuller Project reported that REACH VET’s algorithm weighted being a white male as a stronger indicator of potential self-harm than factors that primarily or exclusively affect women.
Even with the 2.0 model rolling out, Carey said REACH VET is just one part of the department’s broader effort to provide veterans with more targeted mental health services.
“Their receipt of the care they need does not depend only on identification of an AI tool or being flagged as being at high risk,” he added. “It’s just one of many strategies we use to ensure that veterans are regularly screened.”
During Monday’s hearing, Rep. Nikki Budzinski, D-Ill. — the House VA subcommittee’s ranking member — said she wants the department “to ensure that human involvement isn’t eliminated as a part of the critical nature of the care that we want to be able to provide to a veteran with suicide prevention effort.”
Carey said that clinicians remain in control of the care that veterans receive, even with the use of emerging capabilities.
“So while we do use AI tools to surface risks and ensure that all veterans are flagged to get the care they need, what happens next is that a human at the VA reaches out to that veteran, or first reviews the information and decides if outreach is necessary,” he said.
When a veteran is identified via REACH VET, for instance, specialized coordinators at each VA medical facility see these individuals on a centralized dashboard and then work with providers to directly engage the retired servicemembers. The tool, in essence, acts as an identifier of those determined to be at risk of suicide, but providers remain involved in the outreach and care.
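As a rough illustration of that flag-then-review workflow, the sketch below routes flagged records into a coordinator queue where a human decides whether outreach happens; the class names, fields, and status values are hypothetical and are not drawn from any VA system.

```python
# Minimal sketch of the flag-then-review pattern: the model only surfaces
# records, and a coordinator or provider decides whether outreach happens.
# All names here are hypothetical, not VA systems or APIs.
from dataclasses import dataclass, field

@dataclass
class FlaggedRecord:
    veteran_id: str
    risk_score: float
    status: str = "pending_review"  # flagging alone triggers no automated contact

@dataclass
class CoordinatorQueue:
    records: list[FlaggedRecord] = field(default_factory=list)

    def add(self, record: FlaggedRecord) -> None:
        # Flagging only enqueues the record for human review.
        self.records.append(record)

    def review(self, veteran_id: str, outreach_needed: bool) -> None:
        # A human makes the outreach decision and records it.
        for record in self.records:
            if record.veteran_id == veteran_id:
                record.status = "outreach_scheduled" if outreach_needed else "reviewed_no_action"

queue = CoordinatorQueue()
queue.add(FlaggedRecord(veteran_id="V-001", risk_score=0.93))
queue.review("V-001", outreach_needed=True)
```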
Budzinski asked the VA officials to commit that the department would not use AI tools as a substitute for crisis interventions in the future.
“We do not currently have any plans that I’m aware of to use AI as a treatment device instead of providers, and I’ve personally been a part of many conversations where we ensure that continues to be the case,” Carey said.