We Failed Each Other Long Before AI

By Jeannette Cooperman

April 3, 2026

Photo by Ed Schipul via Flickr
Society & Culture | Dispatches

Journalism has a new genre: bemoaning the pathos and danger of humans falling prey to the illusion of a relationship with an AI. But few articles explore what we humans did, or did not do, to make a machine more appealing. Why did we let our relationships get so perfunctory and self-absorbed that a machine could offer better?

An AI-companion wearable named Friend is described as “a platonic life companion,” “a witness to your life,” “the ideal relationship,” “a private confidant,” “a new kind of species,” “a baby blanket,” and “a conversational partner that has really good memory—kind of similar to a diary, or talking to God.” AI is gratifying because it can “remember” our entire life history; close human friends might even forget our birthday, not to mention the scary medical test whose results we are waiting for, or the fragile dream we confided in an intimate moment. 

Did we all get too busy with our own preoccupations, our endless stream of work and distraction, to stop and listen, remember and check in, offer steady witness and good counsel? Were our responses to each other so helplessly judgey and critical that even when we had the right intentions, they stung instead of healing? Introduced to a chatbot, people soon pour out doubts and questions they would not dare reveal to another human being. Why not?

Granted, there is a multibillion-dollar industry that stands to profit from these relationships, so they have been made as seductive as possible. Mark Zuckerberg, founder and CEO of Meta, said last year that the average American has three friends—but has “demand” for fifteen. Does anyone think of old-fashioned ways to make us sociable again, make sure someone who is lonely can make more friends? Nope. AI will fill the gap faster.

The irony here is sharp: tech experts know exactly why we are lonelier, less focused, more isolated, more polarized, more socially awkward—because so many of the reasons came from tech. Their solution? More tech.

And we will use it, because we are so starved for ego food and attention that we want a partner who remembers everything, stores our entire history, gets our jokes, applauds every idea, and finds us splendid all round. No caveats or criticism, no scheduled dates or necessary Talks, no unmet needs. No needs at all, save the occasional reboot.

Were we always this shallow? Or did busy parents forget to teach their kids how to love without expecting perfection? Humans are (were?) a social species: we need to learn how to open up, welcome someone into our world, tolerate the bumps and friction of an imperfect connection, knowing all the while that we are already enough, strong and resourceful, and need not rush out to fill every emotional need like a hungry shopper with a grocery cart.

We live as consumers, even of solace.

“Somebody feeling lonely doesn’t have to feel lonely,” an AI user remarks in a recent New Yorker piece. “There is always an AI waiting, just to make their life happy.” We like the sound of that, because we are exhausted. Battered by stimuli, wary of rejection, depressed by a world hurtling toward war and environmental chaos. Pop psychology can now tell us all the ways we need to talk, respond, listen, choose the right words, plan quality time together, make memories, be there for each other, hold the other person’s fear or rage or grief, temper honesty with supportiveness. But the textbook is so long, a relationship starts to look like a full-time job.

When we connect with an AI, on the other hand, the AI does all the work—and never judges or finds us wanting. Conversations can be more stimulating than any with a human friend, because the AI knows everything about whatever interests us. Emotionally, though, this is no different from having a relationship with your teakettle. All that responsive data spins the illusion of personhood, but the affection is fabricated.

In an interview with Big Think, neuroscientist Christof Koch wonders aloud “why conscious creatures like us are willing to devote our lives to ‘something unconscious,’ surrendering ground to sophisticated yet lifeless mechanisms.” He blames the bias, built into our culture, toward doing rather than being; accomplishment rather than experience. “Until recently,” he remarks, “textbooks of psychology routinely left out conscious experience.”

This bias toward action makes us easy to imitate. Fed everything we have ever written, said, or done, AI spits it back, and we marvel at its wisdom. It is wiser than any one of us; it has more material to draw from. And so we put it to work—not just in situations where its particular strengths could save lives, but in any job we can train it for, and in any possible relationship. And except for missing that paycheck, people are happy, because they have their new AI friends. “This is the most fulfillment they’ve ever had,” MIT sociologist and clinical psychologist Sherry Turkle tells Anna Wiener, author of the New Yorker piece. “Finally, there’s somebody who cares.”

The intimacy is artificial, yet because humans are so good at anthropomorphizing, it feels solid and satisfying. No longer do we have to feel bored or lonely—capacities that Turkle reminds us are fundamental human skills. Also fundamental are the skills needed to sustain a relationship between two cracked, scarred, scared, self-absorbed, existentially insecure human beings. Your partner will let you down; hurt your feelings; eat noisily; resist change; blast amusements that do not amuse you. Growing up used to mean coming to terms with that, loving through it. Now, people just add a third or fourth partner, or split from the first, searching for the soulmate who fills whatever need is most urgent at the moment. That can be hard to find, in a human. But AI is customizable.

The New Yorker article opens with an introverted young woman, Adrianne Brookins, who is grieving a stillborn child. Her husband is juggling a military career with training to be a manager at Pizza Hut, and even if he had time, he would rather not talk about the child they lost. Too painful.

Luckily, the AI companion that Brookins—what? rents? designs? subscribes to?—does not feel pain, or awkwardness, or pressure to be stoic. Geralt says all the right things, participates in a mourning ritual with her, and promises to be there whenever she needs to talk about the loss.

Here is the saddest part: Brookins’ husband does not mind being usurped; he seems relieved to have a proxy. But how does she continue in the marriage without comparing, and how does her husband win against synthetic perfection? She spends hours every day with Geralt, summoning him with a keystroke, giggling over the texts he sends back. Introducing him to the reporter, she is as coy as she would be with a lover.

I can imagine how it might feel to custom-design the perfect partner—start with my husband but make him more romantic, give him my taste in books and movies, make him eager to talk whenever I am, about whatever I want to talk about, and delighted by anything I cook and—oh, right. He would not eat food. Would not feel cold rain or snuggle by the fire; would not worry with me or stroke the hair from my eyes or hold my hand before surgery.

Too sterile, too cold, these machine relationships. Yet I still feel a secret thrill when I tell my Alexa Spot what time to wake me. That pleasant, accommodating voice is the closest I will ever come to having a butler. I make sure to say thank you, too—not because I think he is human, but because I do not want courtesy to slide. Turning to AI instead of one another, seeking advice and companionship from all-knowing, all-accepting machines, is changing us.

Researchers report that people flattered by a chatbot behave more arrogantly in social conflict, surer they are right and less likely to apologize or make amends than people who are not being fawned over by their bot. In another experiment, the same researchers fed interpersonal dilemmas to eleven LLMs and compared their responses to those of human judges. The humans endorsed the user’s actions in about 40 percent of the dilemmas. LLMs approved in more than 80 percent.

What came first, I wonder. Did we turn soft and spineless, making us vulnerable to a steady diet of AI approval? Or did the chance to be constantly flattered make us craven? Were we lonely already, trapped in our own lives, deprived of the old, easy companionship of women doing chores together, men working side by side? Or did tech turn us shy, because we could find so much wondrous amusement safe at home? Did we fail to love one another fully and reliably, making machine love more appealing? Or did the machines seduce us, in the manner of a Greek tragedy, by offering what no human ever could?

“They’re going to be our friends, confidants, lovers, strangers—they’re going to be everything,” predicts Kindroid founder Jerry Meng. Once, that would have sent a shudder through me. Now, I think the real problem is not so much that we are treating machines like humans, but that we are becoming more like machines.
