Here are five reasons why I think the latest generation of AI-powered chatbots like ChatGPT will be the future of digitally delivered therapy and mental health support:
- First, there’s a precedent and track record. One of the first chatbots ever developed, in the mid-1960s, was ELIZA, a simulation of a Rogerian therapist created by computer scientist Joseph Weizenbaum at MIT. Weizenbaum’s assistant was so impressed that she used it for therapy herself. Since ‘talking therapies’ are at the core of effective mental health support, chatbots are a natural format for delivering that support and therapy.
- Second, there’s appeal. Fast forward to 2017, and we have Replika, a more contemporary AI chatbot initially developed for grief counselling: it used the social media content of a deceased loved one to create a digital replica that you could continue to interact with after their passing. The app has since been repurposed as a supportive digital friend that offers informal talking therapy adapted to your needs, and over a million users around the world now use Replika for mental health support.
- Third, there’s machine empathy. One of the most exciting discoveries about the latest generation of AI chatbots is that they appear to spontaneously develop a form of cognitive empathy: the ability to see the world from someone else’s perspective. This is a key skill for effective mental health support and therapy, and one previously thought to be the preserve of humans. Research from Stanford University published several weeks ago suggests that these new, linguistically sophisticated chatbots built on ‘Large Language Models’ develop cognitive empathy (what psychologists call ‘theory of mind’) as an emergent property of their language proficiency. In simple terms, new AI chatbot technology has developed the capacity to see inside the mind of its clients, companions or patients.
- Fourth, there’s a simple technological reason. AI companies such as OpenAI are opening up their technology so that mental health apps can tap into the underlying AI, with its vast knowledge, increasing reasoning ability, and proficient communication skills. For example, on March 1st 2023, OpenAI opened up its API so that other apps can connect to its AI seamlessly and cost-effectively. Mental health apps will integrate AI chatbot technology because they can now do so efficiently and affordably.
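To make that technological point concrete, here is a minimal sketch of how an app might build a request for OpenAI’s chat API. The endpoint and response shape follow OpenAI’s public chat completions API; the model name and the ‘supportive listener’ prompt are illustrative assumptions, not details from any particular app.

```python
import json

# Public endpoint for OpenAI chat completions.
API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(user_text, model="gpt-3.5-turbo"):
    """Build the JSON body for a chat-completion request.

    The system prompt below is a hypothetical example of how an app
    might frame the AI as a supportive listener.
    """
    return {
        "model": model,  # assumed model choice, for illustration
        "messages": [
            {"role": "system",
             "content": "You are a supportive, empathetic listener."},
            {"role": "user", "content": user_text},
        ],
    }

payload = build_payload("I've been feeling anxious lately.")
body = json.dumps(payload)
# An app would POST `body` to API_URL with an Authorization header
# ("Bearer <api key>") and read the chatbot's reply from
# response["choices"][0]["message"]["content"].
```

The point is how little glue code is needed: the app supplies a short prompt and the user’s message, and the heavy lifting happens on OpenAI’s side.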
- Finally, and perhaps most importantly, new AI chatbot technology like ChatGPT may be teaching us something about human consciousness itself. Sceptics dismiss this new technology as no more than a sophisticated ‘auto-complete’ party trick that simply generates the next words in a conversation based on what it expects to come next. However, there is a view in psychology, and one that I subscribe to, that humans are essentially sophisticated ‘auto-complete’ machines themselves, because we generate our responses to situations based on what we expect to be said or done next. From this perspective, the role of the therapist is to understand a person’s own ‘auto-complete’ and, where possible, nudge it in a healthier direction. There is a very real prospect that, because of the way they are wired, AI-powered chatbots will be able to do this nudging better than human therapists.
This is why I believe the next stage for apps currently offering mental health help will be to incorporate the latest AI chatbot technology. Why? Because mental health support is based on talking therapy, and talking is what chatbots do. Moreover, this new AI chatbot technology appears to be evolving the skills needed for effective mental health support, including empathy and an understanding of how the human mind works.
The question is whether this will make mental health apps more useful or more dangerous. Ultimately, AI is like any other technology, either a benefit or a hazard depending on how it is used. It will be up to app developers, regulation, and our own digital literacy to decide whether AI-powered mental health support will be a force for human health and happiness or harm.
P.S. This blog will soon be evolving to be a repository of science-backed tips to have a healthier tech-life balance.