Manchester students’ overuse of AI is destroying their minds, and something must change
Since when did we need a chatbot to tell us how to sign off an email?
In November 2022, OpenAI launched a demo version of ChatGPT. This was seven years after the company had been founded and four years after the first GPT model was released. It was also one month after I began my undergraduate degree.
At that point, the idea of ChatGPT growing in the way it now has seemed ridiculous. People were still calling it “ChatPGT” and “ChatGCP” and asking it questions like “what’s your favourite Taylor Swift song?” The idea of being able to get away with using an AI chatbot to do your uni work for you seemed as far-fetched as flying pigs.
But, there’s no question about it, ChatGPT is the hottest thing on campus right now, and I’ve never noticed its impact more than I have since starting my Master’s course at Manchester Metropolitan University.
I’ve seen students use AI to feed them the answers to their coursework, analyse their texts with their situationships, write emails to their landlords and give its “opinion” on their relationship problems. It all seems harmless at first: a way to speed up your to-do list and have a laugh – but the truth is, the overuse of ChatGPT by students will likely lead to a decline in future graduates’ abilities to do simple, communication-based tasks.
And with a HEPI study showing that 92 per cent of UK university students use AI, it’s gone far beyond a few one-off questions to speed up your research.
I’ll admit, I tried ChatGPT out a few times during my undergraduate degree before opting to finish my degree without the chatbot. The jump from high school to university is daunting; the compulsory readings are longer, the phrasing more complicated, and sometimes it feels like you’ll never understand what a 200-year-old scholar is trying to say. The allure of ChatGPT, a website that can simplify your least favourite academic text to a high school level, even to a primary school level, is addictive.
Think of all the time you’ll save on work! You’ll never cry over Descartes or Bachelard – it’s the SparkNotes of higher education. The day I gave in to ChatGPT would be the day I’d never have to understand anything again. And therein lies the problem, and the reason I’m so wary of AI.
It doesn’t come without its benefits – it can speed up tasks that would previously have taken weeks to complete, and can bring together large collections of information in an instant. But it needs to be used alongside young professionals’ own understanding, rather than as a replacement for doing something you can do just as well as AI can.

Take writing emails to your landlord – it’s something ChatGPT can do very easily, but it’s also something that should only take you ten minutes to do yourself. I learnt how to sign off an email in primary school, where I regularly wrote mock letters for school projects. But somehow, by the time students hit university, they’ve lost the ability to do what they could do at eight years old, and will spend the same ten minutes inputting their issues into a chatbot.
What does this say for the lives of future professionals? They’ll spend their careers inputting instruction after instruction into ChatGPT instead of bothering to communicate with colleagues and clients, and soon enough, they won’t even be needed. They’ll make themselves more and more obsolete with every question they ask.
A recent survey by The Manchester Tab found that 60 per cent of Manchester-based students who participated use AI to help them with their university studies, and 64 per cent use it to help them summarise or understand academic texts. University is the time when you learn to understand more difficult essays and broaden the ways you can communicate your research – but instead, ChatGPT encourages early resignation, with students giving up as soon as they’re handed a task. They don’t have to figure it out; ChatGPT can do it for them.
But in the process, overuse of AI by students in the name of “efficiency” is destroying their critical thinking skills.
And this isn’t just me shouting into the void – a 2025 research article found a “strong negative correlation” between AI tool usage and critical thinking skills, because of the connection between AI usage and cognitive offloading – meaning that AI’s automation of “routine tasks” reduces people’s ability to “engage in cognitively demanding tasks.”

The more you ask ChatGPT to summarise something, the less likely you’ll be able to complete truly mentally demanding tasks. The whole point of university is to become a more experienced, more skilled professional – yet students are constantly sending themselves backwards.
In the recent survey, Manchester students were asked what they’re using ChatGPT for. Responses ranged from asking it to write them a rap song to using it as a “makeshift therapist” – one student described the chatbot as their “go-to rantperson.” Another used the chatbot to craft a break-up with their now-ex. On top of this, a YouGov survey revealed that 88 per cent of UK students use AI to explain concepts to them.
In short, ChatGPT now completes every unwanted task a student could have, right down to awkward conversations. And while it’s unreasonable to expect people’s usage of AI to drop right down to zero, there has to be some kind of decline.
It’s not just AI; it’s students’ blind trust in AI. When you’re using a chatbot as a therapist, you come to trust what it has to say – and if it feeds you wrong information, you have no way of knowing.
A 2025 New York Times article said that OpenAI, Google and DeepSeek systems “are generating more errors [as time passes]” in a phenomenon known as AI “hallucinations.” On one test, newer AI systems had a 79 per cent hallucination rate.
We’re already starting to see the impact of this – a UK Supreme Court case was thrown out in the past year after 18 out of 45 citations made by the lawyers were found to be fake. They had used ChatGPT to write their arguments, and it had destroyed all of their credibility.
AI has the potential to manipulate students heavily – when such a high percentage of students in Manchester use it during a period that should be one of strong academic growth, it stunts their ability to fact-check and think for themselves.
Just the other week, an AI overview incorrectly told me that Gillian Keegan was the UK Secretary of State for Education, when in fact it is Bridget Phillipson.

A Google search response from 23rd October 2025
This is not a call to get rid of AI altogether – it can be used sparingly and with a grain of salt, and it has incredible advantages in many people’s lives. If used alongside efforts to further your own skills, it can be a valuable asset to human life.
Comedian Chris McCausland recently said that AI has drastically improved his autonomy and independence, describing his surroundings to him and helping him with tasks that sighted people take for granted. These are the advancements we need to see with AI, not students using it as a “get out of jail free” card for their mental load.
You don’t need to stop using AI – you need to start working alongside it, and do some of the mental work yourself.
In the aforementioned YouGov survey, 45 per cent of UK students said their university had not taught them the skills needed to use AI well and avoid its pitfalls. However, AI can’t be used in place of human advancement, or as a way to give up on a task before you even try, or it will breed classes of graduates who can’t do half the things in their job descriptions.
AI can have its uses – it has helped many people get advice, and complete tasks they couldn’t do before. Manchester Metropolitan has launched courses that combine AI with traditional subjects like geography and linguistics, so for those students who are willing, AI can be used as a tool to further their own knowledge rather than avoid it.
Across the board, the level of AI education within universities has to increase in order to safeguard students’ critical thinking skills, and students need to start taking that education on board, to stop us putting ourselves out of a job through sheer laziness.
When contacted for comment, a Manchester Metropolitan University spokesperson said: “At Manchester Met, we are embracing the potential benefits of generative AI, while taking a responsible and ethical approach to its use by focusing on developing students’ AI literacy, promoting informed and critical engagement, and designing assessments that are resilient to misuse. We are aware of the challenges posed by AI, and address these through clear guidance while also embedding critical thinking in our programmes to empower students to make informed decisions. Our goal is to prepare students to use AI effectively and ethically in their future careers.”
The University of Manchester and Google were contacted for comment.
Featured image via Canva