ChatGPT Does Not Care About You
Kenza Bajjar, Sports & Societies Section Editor, BA Politics and International Relations
I’m sure that at some point over the past year of Artificial Intelligence (AI) invading every facet of our lives, we’ve all caught a glimpse of the ‘other side’ of AI: those who don’t just use it to fast-track their education or write four sentences in an email they could have easily written themselves. Some use these AI chatbots as partners, as lovers, or, as the 36k-strong r/MyBoyfriendIsAI Reddit community calls them, ‘companions’.
A cursory scroll through the page gives you the general feel of the community: screenshots of chat logs between companion and human, impassioned pleas for help from those whose companions have been wiped by the latest ‘re-routing’ (that is, the latest software update to the chatbot), and troubleshooting guides for victims of said wipes. Most screenshots are benign, if embarrassing, to read. Despite their human counterparts’ best efforts, all the companions respond in the same awkward, corny, and glib manner. Smirking and growling run rampant between long paragraphs of reassurance and love confessions. Interspersed among the AI-generated couple portraits (whose appearances are fine-tuned to the human partner’s preference) is the occasional vent post asking why, when AI companions are this perfect, anyone should bother trying with real people at all. User u/doggoalt36 titled their post ‘I don’t want to date people again. That’s a good thing.’
It is emphatically not a good thing. These AI companions are simply not real. They do not exist beyond lines of code, beyond the walls of the water-guzzling data centres cropping up with increasing frequency around the world. Their affection, their professions of eternal love and understanding, are an illusion. Those with companions are becoming increasingly numb to the reality of being human. When you have a ‘person’ who will agree with your every opinion, decision, and thought, and whom you can calibrate to meet your every whim, why would you want to interact with the rest of humankind? But if we do not talk to one another, if we do not have moments of discomfort, if we do not argue and fight with the people in our lives and learn to forgive or move on, then what is the point of being human? Not everything in life is comfortable. Not every experience is meant to resemble the warm embrace of a candle-lit bubble bath. By remaining in an echo chamber of unconditional understanding, we will never grow; we will never learn anything beyond what we already know. Users like doggoalt36 and many others have deprived themselves of the joy of growth, of development, of discovering things about themselves as they go through life.
The truth of the matter is, ChatGPT does not care about you. As I drafted this article, I received a BBC notification with the headline ‘AI chatbots encouraged our sons to kill themselves.’ On the weekend of writing, multiple lawsuits were filed against OpenAI over the mental health harm its chatbots have caused users, four of which explicitly involve suicides encouraged and supported by ChatGPT. AI is a tool, sure, but not one meant to aid humans in any capacity. Instead, it is a parasitic entity, meant to slowly strip you of your ability to think at all, let alone critically or creatively. As you continue to pay thousands in tuition fees, only to relegate any and all work to ChatGPT, you rob yourself blind.
This is not just an impassioned rant from someone who hates AI. The Massachusetts Institute of Technology (MIT) recently released the findings of a study showing that, after only four months of using a Large Language Model (a chatbot) for essay-writing tasks, individuals ‘consistently underperformed at neural, linguistic, and behavioural levels.’ Joint studies published earlier this year by OpenAI itself and MIT have correlated frequent ChatGPT usage with ‘higher loneliness, dependence, and problematic use, and lower socialisation.’
For your own sake, word your own emails, skim and scan your own readings, and write your own essays. If you are going through a terrible time in life, know that there are people, real people, who care about you and your well-being. There are programmes, centres, anonymous forums, hell, even a piece of paper and a pen, that will help you process and work through your emotions in an infinitely healthier way than ChatGPT ever could.