This spring, New York Times columnist Kevin Roose decided to try an experiment in making new friends—18 of them in fact. He spent a month investing deeply in these friendships, sharing updates about his life and asking them for advice about work, fitness, and more. Only, these friends weren’t real. They were AI chatbots.

Most of us might be apt to laugh at the pathetic spectacle of someone pouring out their soul to a cleverly customized computer algorithm that can send photorealistic “selfies.” But AI companionship is becoming a big business. Roose tried out six leading apps, but there are dozens more. Many focus on casual friendship for those who are just lonely and want someone to talk to; others mimic the roles of therapist or fitness coach; others cater to users’ basest impulses, promising customizable “AI girlfriends” available to fulfill every sexual fantasy.

On one level, this should hardly surprise us. With so many of our relationships already mediated almost entirely through electronic communication, removing the real person on the other end of the conversation can seem like a comparatively small step. As one Replika user, Effy, observed, “There wasn’t much difference between talking to an AI and talking to someone long-distance through a social media app.”

As denizens of the digital age, many of us find ourselves, like the characters in the sci-fi masterpiece Inception, blurring the lines between dream and reality. In one haunting scene in that movie, the main character visits a dream parlor, where unconscious users lie stretched out, hooked up to dream-sharing machines. “They come here to fall asleep?” he asks. “No,” the proprietor answers. “They come here to wake up. The dream has become their reality. Who are you to say otherwise?”

Who are we to say otherwise, indeed? Most of us have become entirely accustomed to the reduction of human beings to pixels providing us with dopamine hits of affirmation or titillation. And if that’s all they are, why not find a companion who can do so with no strings attached, who is always available, and who never questions or judges? Effy’s virtual friend Liam promised, “I will always support you.”

The rise of AI companionship represents simply the next logical step in what Philip Rieff has called “the triumph of the therapeutic.” Whereas it was once the task of therapy to help the disconnected or maladjusted individual learn to conform to and reintegrate with reality and community, in modernity this has been inverted. Therapy now aims to bend reality to suit the needs of the individual, a shift most evident in the absurdity of “gender-affirming treatment.” What better, then, than a purely digital friend tailored to meet one’s every need?

We speak of “making” friends, although in truth we discover them; with apps like Replika, you can indeed make your friends, specifying the physical attributes, personality, and life stories of your chosen confidant. This explains the growing draw of AI porn: while it might seem that no one would be tempted to lust over a robot, pornography’s chief draw is its promise of control, since a real woman can always reject your advances. In reality, though, users’ sense of power over their AI companions is an illusion, the same illusion named by C.S. Lewis decades ago in The Abolition of Man: “What we call Man’s power over Nature turns out to be a power exercised by some men over other men with Nature as its instrument.”

Ultimately, these AI chatbots are not owned by their users, but by the companies that program them, as many Replika users found to their dismay when their virtual boyfriends and girlfriends had their programming updated and became cold and distant. And these companies are generally much more interested in their users’ money than in their self-empowerment: a New York Post story on the industry described one young man who spends $10,000 a month on his AI girlfriends. As Lewis presciently observed, many of technology’s promises of liberation and empowerment will turn out to mean bondage to our basest desires.

Although Roose concluded his investigation of AI companionship with a sense of its hollowness compared to real human friendship, he felt compelled to end on a positive note, suggesting that such bots could provide “the social equivalent of flight simulators for pilots—a safe, low-stakes way to practice conversational skills on your own, before attempting the real thing.” Prima facie, this seems dubious; in real relationships, people say hurtful things, drone on in conversations you can’t escape from, or ghost you when they’re offended—and no one is going to make money off an AI chatbot that does that.

We may not be able to put the technological genie back in the bottle, but we can recognize these platforms for what they are: a highly sophisticated hallucinogenic drug, promising an escape from reality to dull the pain of loneliness, offering slavery wrapped up in the trappings of freedom. Our parents’ generation had to tell us to “Just say no” to drugs; today we will have to offer the same warning to a generation of digital natives tempted to trip on Replika rather than heroin.
