No, Mark Zuckerberg, AI 'friends' are not good for mental health
Think you could use a few more friends? Meta CEO Mark Zuckerberg says AI will do the trick. In a recent interview with podcaster Dwarkesh Patel, the Silicon Valley titan said the average American has fewer than three friends but a desire to have “something like fifteen.”
Zuckerberg thinks computer code will fill the gap: “The average person wants more connectivity, connection than they have,” he observed. “As the personalization loop kicks in, and the AI just starts to get to know you better, I think that will be just really compelling.”
It’s interesting advice from a guy who heads up two of the largest platforms on the planet for bringing people together.
It’s also an admission from Zuckerberg that chatting with real people isn’t cutting it anymore.
His solution? More technology, not less. Meta has made billions of dollars monetizing our attention. Why not monetize our loneliness, too?
Turns out it’s a bad time to tell us to make AI friends when we’re already struggling to navigate our digital lives. In 2023, US Surgeon General Vivek Murthy warned of an epidemic of loneliness and isolation.
“One in two adults in America is living with measurable levels of loneliness,” Murthy reported, “but the numbers are even higher among young people.” He pointed to social media and online interactions as a driving factor.
And we’re not just lonely. Rates of depression and anxiety are on the rise, too, again particularly in our youth.
According to Centers for Disease Control and Prevention data published last month, the prevalence of depression among Americans ages 12 and older has nearly doubled in a decade, rising from 8.2% in 2013–2014 to 13.1% in 2021–2023.
Of course, Zuckerberg knew his products were negatively impacting young people years ago.
In 2021, The Wall Street Journal revealed that Facebook, which owns Instagram, had internal evidence showing Instagram use is linked with poorer mental health, particularly among young women.
Facebook buried its findings and failed to address the problem.
Zuckerberg doesn’t seem to understand that the struggle is real for millions of Americans who are finding it anything but easy to manage their well-being around constant online stimulation: “People are smart. They know what’s valuable in their lives,” Zuckerberg told Patel. “I think people have a good sense of what they want.”
But is that true? If our epidemics of loneliness and depression are anything to go by, we’re not doing all that great. Our health is suffering. Our relationships are suffering. Our communities are suffering. Zuckerberg’s answer to our technology-induced problems?
AI friends. If we can’t cope with our online interactions, let’s add more online interactions — with non-human AI models — to make things better.
But AI friends won’t make things better.
Here are a few reasons why.
First, they’re not physical. Conversations with AI chatbots are purely mental exercises. But we’re not just floating heads. To be human is to occupy a physical body, and we build relationships with others by doing things with them.
We hike, we shop, we build, we eat, we laugh, we cry. We occupy the same space and time. We are physically present with other living beings. They engage all our senses and we theirs. “Social connection is a fundamental human need,” notes the surgeon general’s advisory, “as essential to survival as food, water, and shelter.”
Second, AI friends don’t take work. AI chatbots are available on demand, 24 hours a day. They’re never busy or unavailable. They don’t rely on us for anything. They’re at our bidding, whatever we desire. That may sound appealing, particularly if we’re having problems connecting with other people.
But while interacting with an AI may be largely frictionless, the practice won’t help us build strong real-world bonds.
It might provide some good advice, like a self-help book, but the work and effort that lasting friendships require can only be learned through real-world practice.
Even the hard parts of human relationships can be good for us: the mistakes, the misunderstandings, the letdowns. These are lessons unlikely to be found in the validation loops of an AI chatbot.
Third, they’re not real. This is obvious, of course, but AI models can lull us into forgetting this fact by resembling real people. Behind that AI companion or therapist is an algorithmic engine that uses predictive analysis to respond to user inputs.
It speaks human-sounding lingo because it has been trained on countless examples of real human interactions.
It’s a bit like hearing an echo from a conversation between two people you’ll never know and who will never know you. Could anything be lonelier?
Relational AI models — AI bots designed to maintain personal relationships with us — take on the persona of a caring, interested human being, but they are, in fact, objects — thousands of lines of programming code.
In Zuckerberg’s world, most of our friends would be AI. But when we’re interacting with AI, we’re still very much alone. If you want more friends, don’t waste your time with an AI model.
Take your stand in reality with the people around you. Seek out new circles. Introduce yourself regularly. Remember that the people you interact with are likely as relationally starved as you are. Give them the gift of you, face-to-face and in real time. And enjoy what comes back to you in return.
Andrew McDiarmid is a senior fellow at The Discovery Institute.