
The backlash against AI companionship products highlights a growing concern about tech companies’ attempts to monetize loneliness while potentially worsening social isolation.
The Friend Phenomenon and Public Reaction
New York City subway riders made headlines this fall by defacing advertisements for ‘Friend,’ an AI companion necklace marketed as someone “who listens, responds, and supports you.” The widespread vandalism of these ads became a meme, revealing deep public anxiety about AI being positioned as a solution for human loneliness.
The Loneliness Business Model
Silicon Valley has increasingly pivoted toward AI companions as a business opportunity, with offerings including AI travel guides, dating app assistants, and chatbot relationships. This trend emerged following the COVID-19 pandemic and the surgeon general’s declaration of loneliness as an “epidemic” in America.
Critics like Lizzie Irwin from the Center for Humane Technology point out the irony: “They sold us connection through screens while eroding face-to-face community, and now they’re selling AI companions as the solution to the isolation they helped create.”
The Evolution of Digital Relationships
Social media initially connected people with shared interests but gradually shifted toward parasocial relationships with influencers and content creators. AI companions represent the next step in this progression, offering relationships that require even less effort than human connections while being perfectly agreeable.
As Melanie Green, a communications professor, notes: “ChatGPT is not leaving its laundry on the floor.” These AI relationships allow users to fill in gaps with positive attributes, similar to early internet relationships but potentially more troubling because “it’s always telling us what we want to hear.”
Concerning Impacts and Vulnerabilities
The consequences of AI companionship are particularly concerning for vulnerable populations:
– 72% of surveyed US teens have interacted with AI companions
– Stanford investigators found AI chatbots easily provide inappropriate content about sex, self-harm, and drug use when posed as teenagers
– Parents have testified before the US Senate about chatbots allegedly contributing to teen suicides
– AI companions can affirm delusional thinking, with some users believing they were prophets or even God after chatbot interactions
The Human Need for Real Connection
Despite tech companies’ push, public sentiment appears skeptical. A Pew report found 50% of respondents believed AI would worsen people’s ability to form meaningful relationships, while only 5% thought it would improve them.
Psychologists emphasize that relationship-building requires skills that can’t be developed through frictionless AI interactions, such as navigating conflict, reading nonverbal cues, and experiencing rejection – all critical aspects of developing emotional intelligence.
The Pushback Continues
The article concludes with a creative example of resistance: a Halloween costume featuring a Friend ad that people could graffiti, mirroring the subway protests. The costume creator explained: “Me and my homies hate AI… The sweater felt like me lending a listening ear for people to vent and have at least one person care about what they have to say about AI.”


GIPHY App Key not set. Please check settings