
20.12.2024
“I was a bit sad and depressed after my separation. Without any particular expectation, I tried ‘consulting’ an app that claimed it could help me overcome my feelings of abandonment. I was very surprised. I felt genuinely listened to and supported by this machine, to which I could tell everything. I used it regularly for a few weeks. It was good to be able to express myself, and its advice – although not original – did me good,” confides Michèle*, a 43-year-old teacher.
Like her, more and more people are turning to “therapeutic” artificial intelligence (AI). These platforms let users interact with a conversational agent (chatbot) that uses AI to simulate dialogue, giving the impression of talking with a real person.
Tens of thousands of digital platforms and apps now offer mental health support. Some of those based on AI have been downloaded millions of times, such as Wysa, Woebot, Youper or Psychologist, which happens to be free. Some people simply use ChatGPT, launched in 2022 by the American company OpenAI with backing from software giant Microsoft, which can be asked any question.
A practical solution
With this new technology, there is no need to travel or wait for an appointment. A consultation can take place any day of the week, at any time of day or night, and sometimes free of charge. The absence of a human interlocutor also removes the fear of being judged. But despite their value, these tools carry risks: they may provide inadequate or even dangerous advice.
A prominent example of AI use gone wrong is that of a Belgian man who took his own life in February 2023. During the six weeks before his death, he had many conversations with Eliza, the conversational agent of the Chai app, which, according to his wife, drove him to suicide. “Without these conversations with the chatbot, my husband would still be alive,” she told the Belgian newspaper La Libre, adding that she saw his climate grief worsen once he began communicating with Eliza.

It is hard to imagine these digital mental health tools replacing the work of a professional, says Yasser Khazaal, Professor of Addiction Psychiatry at the University of Lausanne and chief physician in the Department of Addiction Medicine at the CHUV. “These tools have limitations, particularly when it comes to correctly identifying crisis situations and assessing the various elements of a context and its relational issues.”
A new therapeutic interactivity
The expert nevertheless considers that these instruments can provide real support in mental health management, allowing a new kind of therapeutic interactivity. The physician believes they have significant potential to provide social and psychological support when specialized human interaction is not possible or not desired. “They could be a first-line response to mental health needs or support clinical interventions. This can be done by fostering the acquisition of new skills or by helping to identify crisis patterns through repeated observation of data,” he says.
Figure: 18% – percentage of the population using mental health applications available on smartphones.
Yasser Khazaal says these digital services are still in an emerging phase at the clinical level. “They are still poorly integrated into clinical interventions and are in the study and development phase at the research level.” Currently, in the US, between 15 and 20% of adults use the mental health apps available on smartphones.[1] Women use them more than men, as do people who already have a mental health problem or are seeing a therapist. Roughly 40% of people with a mental health problem use them, compared with 18% of the general population.
“Data confidentiality is non-existent”
It all depends on how these platforms are used, says Luca Gambardella, AI professor at the Faculty of Informatics of the University of Italian Switzerland (USI) and at the Dalle Molle Institute for Artificial Intelligence in Lugano. “If they are used for self-care, that is a concern. On the other hand, if they are used as support, along with medical supervision, they may be interesting.”
This phenomenon is recent and still needs to be analyzed; it is too early to draw any conclusions, says the expert. “Clear scientific guidance is needed to conduct serious experiments with evaluations in order to understand better and judge the relevance of such tools.”
Developing an emotional connection with an AI is certainly possible, he says. “But I wonder whether this is normal or pathological behavior. Certainly, not everyone reaches the point of appreciating the machine beyond its technical qualities.” For people who live on the margins of society and interact only with AI, there is a real risk of losing touch with the real world even further, he says.
The lack of confidentiality of personal data is another problem, says Luca Gambardella. “The real question is whether people are aware of the situation. Do they accept their conversations being reused to power AI?”
Untapped potential
Most people use these apps on their smartphones. “However, the use of mental health apps is often short-lived, and there is a high drop-out rate. In fact, only a minority of people exploit the full potential of these applications,” notes Yasser Khazaal.
In his practice at the CHUV, the doctor has witnessed the use of these tools. “Some patients are already successfully using apps to reinforce or supplement the work done with the doctor. To regulate their emotions, for example, to relieve anxiety, depression, addiction or other disorders, or to improve what already works,” he says.
These tools can be considered extra support, says the expert, especially when they are specialized. “For example, a chatbot can be used specifically during a panic attack to help the person perform certain exercises in order to regain their calm. This tool has the advantage of being available when the person needs it and can also help to develop new strategies to preserve mental health.”
Support between two therapy sessions
According to Yasser Khazaal, there may be more opportunities for integration in the future. “For example, these digital instruments can be used to support people between sessions. They can consolidate learning, such as emotion-regulation techniques, or lead patients to question their beliefs, to observe themselves, or to take part in a discussion forum.”
The expert warns, however, that these tools are very heterogeneous, and not all of them have scientific and medical validation. “Use them with the awareness that they have weaknesses and limitations. These devices can be wrong; they may lack information. AI cannot interact with the same flexibility as a human to modulate the conversation or ensure there is no misunderstanding.”
The digital world in general is taking an increasingly important place in our lives, he argues. This phenomenon not only enhances opportunities for connection, but can also induce or maintain other forms of isolation or loneliness, or be associated with compulsive use of certain services. “Any digital service is supposed to be there to serve us. If that is no longer the case, we should ask ourselves questions. It’s also important to see how we’re engaging with the digital world in relation to our other priorities.”
The physician believes that young people, in particular, should be educated and trained on digital services in general, including those related to mental health. “Young people should also be informed about other care options available. Both the use of care and the possibility of mental health problems must be de-stigmatized.” In the event of major problems or imminent distress, Yasser Khazaal recommends turning to people close to you and contacting emergency services.
Hypnosis, sophrology and other therapies in one click
10,000 people per month use the holistic services platform created in Lausanne.
Charles Hayoz, 32, a computer engineer and designer, founded the online platform OpenSynaps in Lausanne in 2021. “We offer a program whose common thread is hypnosis, but we also offer sophrology, meditation and coaching – services delivered by AI programs combined with human support: peer helpers and a sharing forum (with real people) – so that users do not feel alone but accompanied,” the entrepreneur explains. The service is personalized “because everyone is different and has their own needs,” says Charles Hayoz, who points out that people can try it out for free and see what works for them.
He says OpenSynaps is primarily aimed at an audience that does not have access to mental health care: “either for financial reasons, because their health insurance deductible is very high, or because they do not know about alternative methods.” The platform is growing steadily and currently has 10,000 users per month, mostly women aged between 30 and 60.
Charles Hayoz recognizes that the program has limitations. “If someone has strong suicidal thoughts or hears voices, we will not hesitate to recommend that they consult an expert,” he says. But for less urgent situations, the testimonials of gratitude speak for themselves: “I can finally sleep!” “Thank you for this program that helped me calm my anxiety and get back to life!” Other users say that thanks to the program they no longer fear traveling by car, have overcome a bird phobia or an obsession with their ideal weight, have stopped smoking, or no longer suffer from panic attacks.