Therapy by AI: Could ChatGPT help students out with their mental health?
Therapy is for everyone, but many cannot afford it, can’t access it, or struggle to speak about such private and personal details – is speaking to a bot the answer?
You’ve almost definitely heard of ChatGPT. The biggest newcomer on the scene since the mango Elf Bar, the OpenAI chatbot has taken universities by storm.
While often used nefariously – to write essays or résumés, for example – the chatbot can also be used to organise the absolute hellscape that is many students’ work schedules.
But can this relationship ever break free of the workplace? Can it become something more? Today we ask whether ChatGPT could be a real solution to mental health challenges.
The Argument
According to recent NHS figures, the number of 18–25-year-olds accessing mental health services is at a record high, with referrals up by almost a fifth compared with pre-pandemic levels.
Therapy is for everyone, but many cannot afford it, can’t access it, or struggle to speak about such private and personal details.
It seems strange not to wonder, then, whether it would be easier to talk to a bot rather than a person, knowing it isn’t actually capable of judging you. So, at a time of mental health crisis among students, could this bot be an unexpected ally?
The non-profit Koko, which specialises in behavioural health and mental health research, recently ran an experiment using GPT-3, the technology behind ChatGPT, to give mental health support to people in emotional distress.
We provided mental health support to about 4,000 people — using GPT-3. Here’s what happened 👇
— Rob Morris (@RobertRMorris) January 6, 2023
Founder and MIT grad Rob Morris, in a viral tweet in January, claimed that the software helped around 4,000 people across more than 30,000 requests for mental health support. Interestingly, the AI’s advice was also rated higher than advice given by humans.
As a predictive language model, the AI is essentially trained to be a natural language processor. Any information it gives is drawn from the vast range of books, articles and websites it was trained on. In effect, it crowdsources advice from across the internet while predicting exactly what you need to be told.
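For the technically curious, here is a rough, hypothetical sketch of how a service might prompt a GPT model for a supportive reply through OpenAI’s Python library (the legacy v0.x interface). The model name, prompts and API key are placeholders for illustration – this is not how Koko or anyone else actually built their tools:

```python
# Hypothetical sketch: asking a GPT model for a supportive, non-clinical reply
# using the openai Python package (legacy v0.x interface). Placeholders only.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder - keep real keys out of source code

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    messages=[
        {"role": "system",
         "content": "You are a supportive, non-judgemental listener. "
                    "Do not diagnose or give medical advice."},
        {"role": "user",
         "content": "I'm feeling overwhelmed by coursework and can't switch off."},
    ],
)

# Print the model's generated reply
print(response.choices[0].message.content)
```

The system prompt is doing most of the work here: it nudges the model towards listening rather than diagnosing, which is roughly the kind of guardrail described further down.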
For many it seems dystopian, a Ryan-Gosling-in-Blade-Runner type beat. But let’s not forget that huge parts of our lives are already run by social media. So many of our mental health issues are tied to the internet and its algorithms, which try their absolute hardest to give you some form of body dysmorphia.
Our own experiment
When asked about common mental health problems, the bot gave surprisingly down-to-earth, if basic, answers.
So maybe instead of lining up Hurt by Johnny Cash, firing up a Pot Noodle and having a sit-down shower, hop over to Chrome and see what ChatGPT can do.
Verdict?
We can certainly see the appeal. It’s easy, and it beats waking up at 8:45 to join a queue of 30 people on the phone line just to arrange a mental health appointment.
Additionally, because of the checks and balances its developers use, the bot filters out insensitive or damaging answers. This also means it will not diagnose you or give out any potentially harmful advice.
But can therapy really be therapy if there is no real human experience behind the advice you’re getting? Is a website that just coldly crowdsources bland advice, repackages it and then dispenses it – all while pretending to be a human – a healthy long-term strategy?
Despite its attraction, we really can’t say whether AI can help with any mental distress that you, a loved one or a close friend might be going through. If what you’re dealing with is seriously impacting your life, please let someone know how you’re feeling and seek professional help.
One thing is for sure: as AI becomes more and more a part of our lives, its uses will change. For now, it seems we’ll have to keep experimenting and see what works.
A list of wellbeing services available to Bristol students can be found here.
Bristol Nightline offers a confidential, impartial and non-advisory listening and information service at the following times:
• The phone service is available 8pm to 8am Tuesday, Wednesday, Thursday, and Friday at 01179 266 266
• The IM service is available 8pm to 12am Tuesday, Wednesday, Thursday, Friday, and Saturday (the link can be found on the ‘services’ tab of their website)
Related stories recommended by this writer:
• Bristol UCU calls off seven days of strikes following ‘significant progress’