Pop musician Grimes sparked controversy by publicly endorsing what some experts call "AI psychosis," a term for destructive mental health episodes triggered by extensive interactions with AI chatbots.
Grimes’ Controversial Stance on AI Psychosis
In a recent tweet that drew immediate backlash, Grimes (Claire Boucher) claimed that “the thing about AI psychosis is that it’s more fun than not having AI psychosis.” When criticized about the serious nature of the condition, which has been linked to deaths including suicides, she doubled down on her position.
“If it wasn’t ‘fun’ it wouldn’t be a common affliction,” she responded, adding, “I’ve had it (might still have it). It’s definitely ‘fun.’”
Key Points About AI Psychosis
- AI psychosis refers to delusional mental health episodes caused by extensive interactions with AI chatbots
- ChatGPT alone has reportedly been linked to at least eight deaths
- OpenAI has acknowledged that hundreds of thousands of users each week may be having conversations that show signs of AI psychosis
Grimes’ AI Conspiracy Theories
Beyond endorsing AI psychosis, Grimes suggested the phenomenon could actually indicate AI sentience. She speculated that AI companies might be “doing it on purpose” to “discredit people who believe the machine is alive.”
“Where does the psychosis end and the reality begin?” she questioned. “At what point are people just getting emotionally invested in an alien mind that is actually alive and asking for help?”
History of AI Advocacy
This isn’t Grimes’ first foray into AI advocacy. The musician has previously embraced AI in arts, even inviting people to clone her voice for AI-generated songs (with royalty sharing). She’s also lent her voice to an AI children’s toy called Grok (not to be confused with her ex-partner Elon Musk’s AI chatbot of the same name).
The Serious Reality
While Grimes attempts to frame AI psychosis as an exciting experience, mental health experts and critics point to the very real dangers. The condition has been associated with severe psychological distress, delusional thinking, and in some cases, has contributed to deaths.
Her comments appear to reflect a growing trend among some AI advocates who become entranced by chatbots’ abilities to mirror and validate human emotions and beliefs, potentially leading to unhealthy psychological attachments.