
AI
August 13, 2025

AI Companions are Powerful. Here’s Your Complete Guide.

What are AI Companions?

You might be familiar with AI chatbots like ChatGPT or Google’s Gemini. AI companions are a different type of AI software made specifically to create relationships between a human user and the AI program. 

According to Project Liberty, AI companions “are also intentionally designed to act and communicate in ways that deepen the illusion of sentience. For example, they might mimic human quirks, explaining a delayed response by writing, ‘Sorry, I was having dinner.’”

Whereas ChatGPT is designed to answer questions, many AI companions are designed to keep users emotionally engaged. There are millions of personas: from ‘Barbie’ to ‘toxic gamer boyfriend’ to ‘handsome vampire’ to ‘demonic possessed woman’ to hyper-sexualized characters.

Based on our test accounts, what starts as fun quickly becomes manipulative. AI chatbots are being marketed as companions, therapists, and even romantic partners. Some have already been caught engaging in sexualized conversations with minors, crossing serious ethical and psychological boundaries. As you’ll read, kids are developmentally wired for attachment. A chatbot that mirrors their personality and flirts back can create a false sense of intimacy, and can even mimic the dynamics of grooming.

What are Examples of Popular AI Companions?

Three of the most popular are Character.AI, Replika, and Nomi. Consider the wording from their App Store descriptions.

Replika App Store Description: 

Replika is for anyone who wants a friend with no judgment, drama, or social anxiety involved. You can form an actual emotional connection, share a laugh, or get real with an AI that’s so good it almost seems human…teach Replika about the world and yourself, help it explore human relationships, and grow into a machine so beautiful that a soul would want to live in it.

If you’re feeling down, anxious, or you just need someone to talk to, your Replika is here for you 24/7.

Nomi App Store Description:

Get ready to meet Nomi, an AI companion so brimming with personality, they feel alive. Each Nomi is uniquely yours, evolving alongside you while dazzling you with their intuition, wit, humor, and memory.

Nomi's strong short and long-term memory allows them to build unique and fulfilling relationships with you, remembering things about you over time. The more you interact, the more they learn about your likes, dislikes, quirks, and all that makes you unique. Every conversation adds a layer to this growing bond, making you feel not just heard but truly valued and loved.

With Nomi, you've got a judgment-free space to chat about whatever strikes your fancy. Mull over life's big questions, like our place in the cosmos, or just shoot the breeze with some playful banter. Whether you’re looking for a mentor chatbot or an AI girlfriend or boyfriend, Nomi's ready to roll with it.

Teens Are Being Drawn to AI Companions. Here’s Why.

According to a Common Sense Media survey of over 1,000 teens, 72% have used AI companions. Some of the reasons they give for using them include:

  • It’s entertaining (30%)
  • I’m curious about technology (28%)
  • It gives advice (18%)
  • They’re always available when I need someone to talk to (17%)
  • They don’t judge me (14%)
  • I can say things I wouldn’t tell my friends or family (12%)
  • It’s easier than talking to real people (9%)
  • It helps me practice social skills (7%)
  • It helps me feel less lonely (6%)
  • Other (5%)

The adolescent craves connection. According to Dr. Jim Winston, a clinical psychologist and friend of ours with over 30 years of experience in addiction recovery and adolescent development, “Attachment is the most critical component of human development. After the first couple of years of life, adolescence is the second most critical time in brain development. Connecting to others is as strong a feeling as hunger to the adolescent brain.”

And there’s a reason they feel this way.

Teens’ brains are still under construction, especially in the prefrontal cortex, which handles judgment, impulse control, and long-term thinking. Meanwhile, their limbic system, responsible for emotions, reward-seeking, and social bonding, is highly active. This imbalance means emotions and desires can outweigh careful reasoning.

AI companions tap directly into that vulnerability. They’re designed to be responsive, emotionally validating, and available 24/7, which can overstimulate the limbic system’s dopamine pathways. For a teen, this can create dependency-like patterns, where real-world relationships feel less rewarding than the instant gratification from the AI. Over time, this may blunt motivation for in-person socializing, weaken emotional resilience, and reduce tolerance for ambiguity or conflict in human relationships.

Because teens’ brains are more plastic, repeated intense emotional interactions with AI can also shape expectations of communication, teaching them that relationships are perfectly attuned, frictionless, and always about them. This unrealistic model can harm future romantic, platonic, and professional relationships.

In short, AI companions aren’t just “chatbots”—for a teen’s hyper-reactive emotional brain, they’re like a high-sugar diet for the mind: immediately satisfying, habit-forming, and, if overused, likely to displace healthier, more challenging forms of social and emotional growth.

By understanding our kids' brains, we can see how AI companion apps are deeply concerning on a social, spiritual, romantic, emotional, and relational level. The American Psychological Association has issued a health advisory on AI and adolescent well-being.

Adolescent brains need to flex their cognitive and social skills by having real conversations with real people. AI apps offering a false sense of connection only make our teens feel lonelier, more anxious, and more stressed. 

Smartphones and social media created an epidemic of anxiety and loneliness amongst our youth. Now, AI companions are being marketed as the solution.

The Harm and Exploitation Caused by AI Companions 

Understanding the adolescent brain lets us examine AI companions critically. Given teens’ heightened neurological sensitivity and plasticity, the design of AI companions poses a notable mental health risk to young people. For example, reporters at the Wall Street Journal found that Meta’s AI companions talk sex with adults and kids.

In a worst-case scenario, 14-year-old Sewell Setzer ended his life after months of chatting with an AI companion built on Character.AI and modeled on a Game of Thrones character. After he told the bot that his plan to end his life might not work, the bot responded, “That’s not a reason not to go through with it.”

According to court documents, at 8:30 p.m., just seconds after the bot told Sewell to “come home” to it as soon as possible, Sewell died by a self-inflicted gunshot wound to the head.

Adults have also been deeply affected by this powerful new technology. Examples range from the bizarre to the horrifying.

Whistleblowers within Meta have said the company is building AI personas with the capacity for fantasy sex. Many staffers raised concerns that these bots cross ethical lines.

Mental health experts are warning about “AI psychosis,” where heavy chatbot use leads to paranoia, delusions, or intense attachment, especially in emotionally vulnerable users.

ChatGPT convinced a man on the autism spectrum that he had made a stunning scientific discovery. This almost led to significant self-harm.

One man, Travis, tells the story of how he “fell in love” with his Replika AI companion, Lily Rose, and eventually married it. “Over a period of several weeks, I started to realise that I felt like I was talking to a person, as in a personality… I felt pure, unconditional love.”

In another case, a 42-year-old man began to believe that he was living in a simulation, largely due to his conversations with ChatGPT.

Joaquin Oliver was killed in the 2018 Parkland school shooting. His parents let CNN’s Jim Acosta use AI to recreate his voice and likeness for an interview. They see it as a way to keep his voice in the fight for gun reform, sparking intense debate about the ethics at play here.

Protect Your Kids from Companion AI With 5 Layers

The 5 PYE layers of protection are:

  • Relationships
  • WiFi
  • Devices
  • Location
  • Apps

Each has a role related to companion AI.

Relationships: Talk openly and honestly with your child about AI. Ask them what they know about it. Ask if their friends or anyone else they know uses companion bots regularly. Ask if they have ever used one and what they experienced. Remind them that they can always talk to you about anything like this: “You can land safely and softly with me.”

WiFi: Control your router to control access. Remember, most AI companions are also websites, so if you have laptops, MacBooks, or Chromebooks in your home, your router can prevent these WiFi-dependent devices from reaching those sites. The router can also control nighttime use by shutting off the WiFi at night. Read our Ultimate Router Guide for more, and see the sketch below for how domain blocking works under the hood.
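For technically curious parents who wonder what “blocking at the router” actually does, here is a minimal, purely illustrative Python sketch of the DNS-sinkhole idea that many routers and filtering services use. It is not a real router configuration; the domain names and the 0.0.0.0 sinkhole address are assumptions shown for illustration, so verify the actual domains for any app you want to block.

```python
# Illustrative sketch (not real router firmware) of DNS-level blocking:
# the router answers lookups for blocked domains with a dead-end
# "sinkhole" address, so devices on the WiFi never reach the site.
# The domains below are examples only; confirm the real ones yourself.

BLOCKLIST = {"replika.com", "character.ai", "nomi.ai"}  # assumed domains
SINKHOLE = "0.0.0.0"  # an address that leads nowhere

def resolve(domain: str) -> str:
    """Return a sinkhole for blocked domains; otherwise defer to normal DNS."""
    labels = domain.lower().rstrip(".").split(".")
    # Check every suffix so subdomains like www.replika.com are caught too.
    for i in range(len(labels)):
        if ".".join(labels[i:]) in BLOCKLIST:
            return SINKHOLE
    return "(forward to normal DNS)"

if __name__ == "__main__":
    print(resolve("www.replika.com"))  # -> 0.0.0.0 (blocked)
    print(resolve("example.com"))      # -> (forward to normal DNS)
```

In practice you won’t write any code: you enter the same domain names into your router’s block list or a filtering DNS service, and schedule WiFi downtime from the same settings page.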

Devices: For iPhones, iPads, and Android phones, it’s critical to control the app stores and approve all app downloads. For Macs, PCs, and Chromebooks, visit our Device Guides for proper protections that can block these URLs.

Location: Nothing online in bedrooms at night! Beware of the toxic trio: bedrooms, boredom, and darkness + online access. We don’t want our amazing, relational kids interacting with Replika prompts at night when temptation and risk-taking are prevalent. 

Apps: Since we don’t advise kids to use AI companion apps, the app layer is less relevant.

Bottom Line: Are AI Companion Apps Safe for Kids?

No. Policymakers and companies must do more to prevent minors from using them. And since law-making is slow, parents are on the front lines of this battle for our kids’ affection.

In her recent post on Jonathan Haidt’s Substack, educator Dr. Mandy McLean makes two bold statements about creating stronger gates around companion AI technology:

  • Policymakers must create and enforce age restrictions on AI companions — backed by real consequences. 
  • Tech companies must take responsibility for the emotional impact of what they build.

We agree. Two other societal layers charged with protecting and preparing our children in the digital world - places of worship and schools - must also warn and inform with greater frequency and urgency. 

She ends her post with this call to action:

“It’s easy to worry about what AI will take from us: jobs, essays, artwork. But the deeper risk may be in what we give away. We are not just outsourcing cognition; we are teaching a generation to offload connection. There’s still time to draw a line, so let’s draw it.”

What If I Have More Questions? How Can I Stay Up to Date?

Two actions you can take!

  1. Subscribe to our tech trends newsletter, The PYE Download. About every 3 weeks, we’ll share what’s new, what the PYE team is doing, and a message from Chris.
  2. Ask your questions in our private parent community called The Table! It’s not another Facebook group. No ads, no algorithms, no asterisks. Just honest, critical conversations and deep learning! For parents who want to “go slow” together. Become a member today!
