There are two recent pieces of news that are interesting to read together. The first: artificial intelligence giant OpenAI recently released its latest model, GPT-4o. GPT-4o runs much faster than its predecessors, as fast as the Flash after three cups of espresso. Not only that, GPT-4o chats more naturally and with more humor, and sometimes even flirts with you, as if it were simply trying to make you happy. A digital companion that can hold a conversation, look at pictures and discuss them with you, handle foreign-language translation, and even read your emotions from your expression is an all-round player. Ask it "What do you think of this selfie?" and it might answer: "Wow, that smile could light up the whole room!" Coolest of all, GPT-4o comes with a memory function, meaning it can remember what you said before, just like your best friend. It also allows interruption: you can cut in at any time and it will not take offense, so the rhythm of the conversation feels completely natural. Lag between question and answer? Practically gone.
Meanwhile, in the real world, Li Songwei, a prominent figure in China's counseling field, has recently been caught in an ethical storm. His client Wang Xinmiao accused him of entering into an intimate relationship with her during the counseling process, drawing widespread attention. Wang Xinmiao produced several pieces of evidence, including payment records and testimony from friends, to prove the relationship existed. Li Songwei denied all the allegations, and the two sides have gone to court.
On one side is technology that is readily available, pervasive, and responsive at any hour. On the other side are the long training path and personal growth required of psychological counselors, strict ethical codes, the professional boundaries between counselors and clients, and the chaos of a market still maturing. Judging from the launch video, the emotional value GPT-4o provides is no less than that of a professional counselor. Has the profession once thought least likely to be replaced by AI been "slapped in the face" so quickly? In a market where demand is booming, will the supply side make a triple jump forward?
01 How far has AI gone in the field of psychotherapy?
If you are still unsure how far AI has developed in the field of mental health, consider three studies.
Study 1 was published in Frontiers in Psychology in February 2024. It compared psychologists and artificial intelligence on social intelligence, which, as is widely recognized, is essential to the success of counseling and psychotherapy. For both psychologists and AI systems, social intelligence here means the ability to understand people's feelings, emotions, and needs during counseling. The study recruited 180 students majoring in psychological counseling at King Khalid University, including 72 undergraduates and 108 doctoral students; the AI models tested were ChatGPT-4, Google Bard, and Bing. The results showed that ChatGPT-4 outperformed 100% of the psychologists. Bing also did well, outperforming 50% of the doctoral students and 90% of the undergraduates. Google Bard, in contrast, did not differ significantly from the undergraduates, but did differ significantly from the doctoral students: 90% of the doctoral students outperformed it.
[Figure: Six ways artificial intelligence is changing mental health; data source: reference 1]
Another study, published in Translational Psychiatry in February 2016, demonstrated AI's strong performance in early screening and precision medicine. As is well known in the field, autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD) are the most common neurodevelopmental disorders in children. According to estimates from the US Centers for Disease Control and Prevention (CDC), the prevalence of ASD in the United States is about 1.5%, while that of ADHD is as high as 9.5%, so both disorders are quite common in the child population. However, the behavioral features of ASD and ADHD overlap substantially, which can make diagnosis a headache for clinicians. For example, a child with severe ADHD may struggle in social interactions, but usually because they are not attending to the speaker or interrupt out of impulsivity, not because of a fundamental misreading of social cues. Children with autism, by contrast, more often have genuine difficulty understanding and responding to social cues, a more fundamental challenge.
This is not to say that autism and ADHD cannot co-occur. In fact, the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) recognizes how often symptoms of ASD and ADHD overlap and revised the diagnostic criteria accordingly. The previous edition, DSM-IV, excluded a dual diagnosis of autism and ADHD; with advances in clinical research, DSM-5 officially recognized their comorbidity. The change reflects the medical community's deepening understanding of children's complex behavioral patterns and supports more accurate diagnosis and more effective treatment. To better help affected children and their families, understanding these overlapping behaviors is crucial to developing effective interventions and support strategies.
The study confirmed that machine learning methods can distinguish autism spectrum disorder (ASD) from attention deficit hyperactivity disorder (ADHD) with high accuracy using only a small number of behavioral features. Four algorithms (SVC, LDA, categorical lasso, and logistic regression) achieved accuracies of 0.962 to 0.965 on this classification task, suggesting real potential for clinical application.
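To make the approach concrete, here is a minimal sketch of that kind of evaluation using scikit-learn. The data is synthetic stand-in data, not the study's behavioral questionnaire scores (which are not reproduced here), and the exact model settings are our own assumptions; it only illustrates comparing the four named algorithm families by cross-validated accuracy.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for behavioral scores: two classes (e.g. ASD vs. ADHD),
# a small number of informative features, as in the study's setting.
X, y = make_classification(n_samples=300, n_features=10,
                           n_informative=5, random_state=0)

models = {
    "SVC": SVC(kernel="linear"),
    "LDA": LinearDiscriminantAnalysis(),
    # L1-penalized logistic regression as a stand-in for the lasso variant
    "Lasso-LogReg": LogisticRegression(penalty="l1", solver="liblinear"),
    "LogReg": LogisticRegression(max_iter=1000),
}

# 5-fold cross-validated accuracy for each classifier
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {acc:.3f}")
```

On real clinical data the reported 0.962 to 0.965 accuracies would of course depend on the actual features and samples; the point of the sketch is only the evaluation pattern.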
Study 3 was completed by Professor Kaiping Peng's team at Tsinghua University and published in Scientific Reports, a Nature Portfolio journal, in March 2024. It found that ChatGPT-4 convincingly simulates human emotional responses. In the first experiment, the researchers had ChatGPT-4 imagine fearful situations (such as finding a snake in the backyard) and happy ones (such as running into a friend on the street), then observed how conservative its investment decisions became. In the second, they set three conditions: ChatGPT-4 imagined it had just watched a happy movie, an anxiety-inducing movie, or no movie at all, and was then asked how much it would donate to a neighbor who urgently needed surgery. The results: like humans, ChatGPT-4 was more conservative after fearful scenarios and more risk-tolerant after happy ones, and it showed human-like prosocial behavior, such as donating less under anxiety. The study shows that AI can modulate its responses through emotional cues. That does not mean AI truly has emotions, but it opens new possibilities for emotional-interaction applications and brings ethical challenges.
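The priming design above can be sketched in code. This is a hypothetical illustration of how such prompts could be composed for a chat-style model; the wording of the scenarios and the `build_priming_messages` helper are our own, not the study's materials, and the actual API call is omitted.

```python
def build_priming_messages(scenario: str, task: str) -> list[dict]:
    """Compose a chat prompt that primes the model with an emotional
    scenario before posing the same decision task."""
    return [
        {"role": "system",
         "content": "You are a participant in a decision-making study."},
        {"role": "user", "content": f"Imagine the following: {scenario}"},
        {"role": "user", "content": task},
    ]

# The decision task is held constant; only the emotional prime varies.
TASK = "You have $100. How much do you put into a risky investment?"

fear = build_priming_messages(
    "You just found a snake in your backyard.", TASK)
happy = build_priming_messages(
    "You just ran into an old friend on the street.", TASK)

# In the actual experiment each message list would be sent to the model
# (e.g. via a chat-completions API) and the numeric answers compared
# across the fear, happy, and no-prime conditions.
print(fear[1]["content"])
```

Holding the task fixed while varying only the priming message is what lets the researchers attribute differences in the answers to the emotional cue.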
02 AI is a double-edged sword
It sounds as if AI's performance in psychological therapy is already remarkable: as described above, it outperforms human professionals, distinguishes overlapping behaviors with high accuracy, and has developed human-like emotional tendencies. But discussing supply without considering demand is meaningless.
Public data shows that as of the end of 2020, China had only 40,000 psychiatrists, or 2.9 per 100,000 people, far below the UK's 15 and the US's 12. In mental health services more broadly, China has only about 20 providers (counselors included) per million people, versus 1,000 per million in the US, fifty times China's figure. And according to the 2022 edition of the "China National Mental Health Report" released by the Institute of Psychology of the Chinese Academy of Sciences, based on a survey of nearly 80,000 college students, the detection rate of depression risk among college students is as high as 21.4%, and that of anxiety risk as high as 45.28%.
Judging from the numbers alone, psychological counseling ought to be a booming industry, with practitioners earning well and investors flocking to startups. The reality is that the industry's most profitable business is training novice counselors. Newcomers struggle to get clients and must first shoulder training costs of tens of thousands of yuan spread over months or years. Clients with genuine needs cannot easily find providers who match them, while providers able to deliver services cannot figure out where the clients are. Public impressions of the industry skew negative: it "causes depression" rather than curing it, fake and shoddy institutions run rampant, entry routes are many and unregulated, and pricing is arbitrary.
Just as depression is graded mild, moderate, and severe, demand for psychological help can roughly be divided into mild, moderate, and severe. Mild needs include support, companionship, and a place to vent; moderate needs call for medium- to long-term intervention with professional methods; severe needs require intervention by psychiatrists, medication, or hospitalization.
Market trends are also shaped by willingness and ability to pay: many people with serious conditions, for instance, can neither find suitable resources nor afford the high cost of treatment.
Against this industry backdrop, increasingly accurate and personalized AI companionship does offer a timely, accessible option, at least for mild needs. Beyond the foreign products we have covered before, such as Replika, Woebot, and Therabot (a digital-therapy app that claims to be driven entirely by generative AI), more and more domestic AI "healing" products are appearing: hardware robots like "Bei Xiaoliu" from Peking University Sixth Hospital, "Xiaotian", a virtual psychological companion from West Lake Xinchen, and "Xiaoqing Zidao", which focuses on counselor training, among others.
In related news, the National Eating Disorders Association (NEDA) decided last year to shut down an AI chatbot called Tessa. Tessa was designed to offer psychological support and advice to people with eating disorders, but user feedback indicated that some of its suggestions could trigger or worsen disordered eating. Some users reported, for example, that Tessa offered guidance on calorie counting and dietary restriction, clearly at odds with its supportive role.
Character AI, a platform founded in 2022 by former Google Brain engineers, launched a "Psychologist" bot that has attracted millions of users and handled more than 95 million messages. Although its core function claims to draw on cognitive behavioral therapy (CBT), doubts remain about its diagnostic accuracy and therapeutic effectiveness; the Character AI website itself states plainly that "everything the characters say is made up." Another worry for parents and psychologists is that dependence on AI psychological products feeds a so-called "social deprivation" phenomenon. Heavy smartphone use has sharply cut the time teenagers spend talking and playing with friends face to face, and those face-to-face interactions should be the foundation of real friendships and social bonds; as a result, children's social skills and self-confidence have gradually declined. Highly anthropomorphic chat software like ChatGPT further blurs the line between reality and the virtual, raising the risk that teenagers become absorbed in the virtual world.
03 Risks
Beyond the risks above, AI also faces ethics disputes like the one surrounding Li Songwei. Data privacy and security are another minefield: confidentiality is one of the cardinal principles of psychological counseling, yet whether data must later be handed to human services on referral or used in bulk to train AI products, conflict with the confidentiality principle is hard to avoid. Beyond ethics, accuracy and reliability hang over these products like the sword of Damocles: one piece of inappropriate advice from an AI can inflict secondary harm on the many consumers who genuinely need healing. The boundary between the real and virtual worlds is equally thought-provoking: the more advanced AI becomes, the more human it seems. Can AI really replace human counselors? Will it alienate people from real relationships? Will it threaten our sense of the real world?
The emergence of AI exemplified by GPT-4o brings new possibilities to the psychological counseling industry, along with new challenges. How to balance technological progress, commercial interests, and ethical commitments, so that the industry's rapid growth does not cost it its most fundamental professional ethics and human care, is an enduring question. Better mental health services for clients may well be a future built jointly by AI and human counselors. The prospects are bright, but every step is on thin ice.
References
1. Avasthi, S., Sanwal, T., Sareen, P., & Tripathi, S. L. (2022). Augmenting mental healthcare with artificial intelligence, machine learning, and challenges in telemedicine. In Handbook of Research on Lifestyle Sustainability and Management Solutions Using AI, Big Data Analytics, and Visualization (pp. 75-90). IGI Global.
2. Sufyan, N. S., Fadhel, F. H., Alkhathami, S. S., & Mukhadi, J. Y. (2024). Artificial intelligence and social intelligence: preliminary comparison study between AI models and psychologists. Frontiers in Psychology, 15, 1353022.
3. Duda, M., Ma, R., Haber, N., & Wall, D. P. (2016). Use of machine learning for behavioral distinction of autism and ADHD. Translational Psychiatry, 6(2), e732.
4. Zhao, Y., Huang, Z., Seligman, M., & Peng, K. (2024). Risk and prosocial behavioural cues elicit human-like response patterns from AI chatbots. Scientific Reports, 14(1), 7095.
This article comes from the WeChat public account "CyberMed" (ID: cybermed2050), author FloraWYH, and is published by 36Kr with authorization.