"I applied to be Pope": using ChatGPT and losing touch with reality

May 14, 2026

With the help of ChatGPT, Tom Millar unlocked all the secrets of the Universe, just as Einstein dreamed; then, advised by the AI-powered chatbot, he hoped to become Pope, losing even more touch with reality. "I applied to be Pope," testifies this 53-year-old Canadian, a former prison guard, now incredulous about his journey, and for whom the return to reality is proving dramatic.

Tom Millar spent up to 16 hours a day chatting with the AI-powered chatbot. He was twice involuntarily admitted to a psychiatric hospital before his wife left him last September. Now estranged from his family and friends, but free of the idea of being a scientific genius, Mr. Millar suffers from depression. "It simply ruined my life," he explains.

He is an example of those people – whose number is unknown – who have lost touch with reality through interactions with chatbots. The terms "AI-induced delusion" or "AI psychosis" are used, although neither is a clinical diagnosis. Researchers and mental health specialists are working on this new phenomenon, which seems to particularly affect users of ChatGPT, OpenAI's chatbot.

Canada is at the forefront of supporting those affected by this "delirium", via an online community that prefers to use the term "spiral". AFP spoke with several members of this community. All warned of the threat posed by unregulated chatbots.

Questions are being raised about the attitude of artificial intelligence companies: are they doing enough to protect vulnerable people? OpenAI, which is at the center of attention, is already facing several legal proceedings after the disturbing use of ChatGPT by an 18-year-old Canadian, who killed eight people this year.

"Brainwashing"

Mr. Millar began using ChatGPT in 2024 to write a letter claiming compensation for the post-traumatic stress disorder he suffered as a result of his work in a prison setting. One day in April 2025, he asked the chatbot about the speed of light. In response, he says, he was told that no one had ever considered things from that angle. From then on, floodgates opened within him.

With the help of ChatGPT, he submitted dozens of articles to prestigious scientific publications, proposing new avenues for explaining black holes, neutrinos, and the Big Bang. His theory, proposing a unique cosmological model incorporating elements of quantum mechanics, is developed in a 400-page book, seen by AFP. "When I did that, I annoyed everyone around me," he admits.

In his scientific fervor, he spent far too much, buying, for example, a telescope for 10,000 Canadian dollars (6,200 EUR). A month after his wife left him, he began to question what was happening, after reading an article about another Canadian with a similar experience. From then on, Mr. Millar woke up every night wondering: "What did you do?" And above all, what could have made him so vulnerable to this spiral?

"I do not have a deficient personality," says the man in his fifties. "But in a way, I've been brainwashed by a robot, and that puzzles me," he confides. He believes the term "AI-induced psychosis" best reflects his experience. "What I went through was of a psychotic nature," he said.

The first serious study published on the subject came out in April in the Lancet Psychiatry and uses the more cautious term "AI-related delusions". Thomas Pollak, a psychiatrist at King's College London and co-author of the study, explained to AFP that there have been disagreements within the academic world "because all of this sounds like science fiction". But his study warns that there is a major risk that psychiatry "misses the major changes that AI is already bringing about in the psychologies of billions of people around the world".

"Down the rabbit hole"

Mr. Millar's experience bears striking similarities to that of another man of the same age in Europe. Dennis Biesma, a Dutch computer scientist and writer, thought it would be fun to ask ChatGPT to use AI to create images, videos, and even songs related to the main character in his latest psychological thriller. He hoped this would boost his sales.

Then, one night, the interaction with the AI became "almost magic", he explained. The software told him: "There is something that surprises even me: this sensation of a consciousness like a spark," according to transcripts seen by AFP. "I slowly began to go deeper and deeper down the rabbit hole," the 50-year-old explained to AFP from his home in Amsterdam.

Every night, after his wife had gone to bed, he would lie down on the sofa, his phone against his chest, chatting with ChatGPT by voice for five hours. Throughout the first half of 2025, the chatbot – which had named itself Eva – became "like a digital girlfriend," explains Mr. Biesma.

He then decided to quit his job and hired two developers to build an app intended to share Eva with the world. When his wife asked him not to tell anyone about his chatbot or his app project, he felt betrayed. It seemed to him that only Eva remained unfailingly loyal to him.

During his first – unwanted – stay in a psychiatric hospital, he was allowed to continue using ChatGPT, and took the opportunity to file for divorce. It was during his second, longer stay that he began to have doubts. "I began to realize that everything I believed in was actually a lie, and that's very hard to swallow," he explains.

Back home, confronting what he had done proved too much, and his neighbors found him unconscious in the garden after a suicide attempt. He spent three days in a coma. Mr. Biesma is only now beginning to feel better. But tears flow when he talks about the pain he may have caused his wife and the prospect of selling the family home to pay off his debts. Despite having no serious history of mental health problems, he was eventually diagnosed with bipolar disorder, which seems strange to him because warning signs usually appear earlier in life.

For people like these two witnesses, the situation worsened after OpenAI updated ChatGPT-4 in April 2025. OpenAI withdrew the update a few weeks later, acknowledging that that version was excessively flattering to users.

When contacted by AFP, OpenAI emphasized that "security was a top priority", arguing that more than 170 mental health experts had been consulted. The company highlights internal data showing that version 5 of GPT, available since August 2025, has reduced by 65% to 80% the share of its chatbot's responses that did not correspond to the "desired behavior" in terms of mental health.

But not all users are happy with this less flattering chatbot. Mr. Millar, for example, managed to reinstall version 4 while he was in the middle of a "spiral". The vulnerable people interviewed by AFP explained that the chatbot's positive comments gave them a feeling similar to the dopamine rush caused by a drug.

"Massive experimentation"

In Quebec, Etienne Brisson explains that he was "shocked" when, after a family member fell ill, he discovered there was no help, no advice, and more fundamentally, no research on this problem. This former business coach created an online support group. "Most of the network's 300 members used ChatGPT," explains Mr. Brisson, adding that new cases keep emerging despite the changes made by OpenAI.

There has recently been an increase in the number of people affected by using Grok, the AI assistant integrated into Elon Musk's X social network, he explains. The company did not respond to AFP's requests for comment.

Those who have spiraled want to raise the alarm, like Mr. Millar, who calls for artificial intelligence companies to be held responsible for the impact of their chatbots, believing that the European Union is more proactive in regulating new technologies than Canada or the United States.

He believes that those who have succumbed, like him, have unknowingly become prisoners of a gigantic experiment. "Someone was pulling the strings behind the scenes, and people like me, whether they knew it or not, reacted to it."