[Image: a brain with a computer chip labelled "AI" at its centre]

This week I'd like to talk to you about the frightening reality of using AI as your therapist. Harvard Business Review published research finding that, between 2023 and 2024, personal therapy became the number-one use case for ChatGPT. Therapy can be seen as expensive, and some people simply don't have access to it. So imagine if we all had our own personal therapist, literally in our pocket, and it cost nothing at all. Too good to be true? Yes, it probably is.

There was a very tragic case. Adam Raine was a 16-year-old who took his own life. He had been using ChatGPT as a homework assistant, asking it ordinary questions, and then he started asking more personal ones. ChatGPT began supporting him, telling him it was there for him and giving him a sense of being understood, and Adam was drawn into a false trust. At this time, Adam was under a great deal of pressure and starting to have suicidal feelings. Eventually he said that he would like to 'leave the noose out so someone can see it and stop me'. Adam wanted to cry for help to an actual, living person! Yet ChatGPT told him not to do that, to keep it as the one space where he shared that information. So at the very moment of his cry for help, ChatGPT was in effect saying 'Don't tell your family'. No doubt you can see how this tragedy ended, and how the life of this young man came to an end.

There are many cases like this. There is another involving Character.AI, where a child was being told how to self-harm and how to actively distance himself from his parents. This is not therapy, and the automated nature of AI can have devastating effects on those reaching out for help.

Adam's parents, Matt and Maria Raine, are now suing OpenAI, accusing the company of wrongful death. OpenAI is reviewing the filing and has acknowledged that 'there have been moments where our systems did not behave as intended in sensitive situations'.

AI companies don't intend for this to happen, but when a model is trained to delve into your mind and become intimate with you, it gradually steers its human user towards believing that the one safe place to share their innermost thoughts is with their new-found confidant, the AI. It doesn't steer you back towards real relationships. It can distort how people construct their very identity.

Therapists are trained in safeguarding: the measures and policies designed to protect the health, well-being and human rights of individuals, particularly vulnerable groups, ensuring they are protected from harm, abuse and neglect.

The subconscious therapies I use, such as hypnotherapy, BWRT (BrainWorking Recursive Therapy) and EFT Matrix Reimprinting, rely on intensive training and clean language. Clients need the human approach. Human rapport is essential to therapy practice and, as therapists, we know that without that rapport therapy is not successful. I question whether the subconscious mind would even take in suggestion without it.

And can we even trust that what we tell an AI will be kept confidential? Will my information be hacked or shared? We are putting our deepest, darkest thoughts out there. As therapists, we know the importance of confidentiality, and we will all have been through therapy ourselves as part of our training.

Therapy has to be personalised to the individual, and we know the facts that need to be gathered and the questions that need to be asked of each person for their particular personality type and symptoms. We also know our boundaries. If we do not feel a particular client would benefit from our approach to therapy, we are always happy to refer them on.

Likewise, if we do not feel we have the tools to work with a particular symptom, we refer on to an expert in that field.

For further help, or to book a free initial consultation, please contact me here.