Vanessa Irena (AI Companion); Shutterstock.com (All Other Images)

STANDARDS

NCSS: Individual Development and Identity • Power, Authority, and Governance • Science, Technology, and Society

Common Core: RH.6-8.1, RH.6-8.2, RH.6-8.4, RH.6-8.5, RH.6-8.6, RH.6-8.7, WHST.6-8.4, WHST.6-8.9, RI.6-8.1, RI.6-8.2, RI.6-8.4, RI.6-8.5, RI.6-8.6, RI.6-8.7, W.6-8.4, W.6-8.9

U.S. NEWS | TECHNOLOGY

Is This REALLY Your Friend?

Teens are relying on AI companions for fun, advice, and even emotional support. Is this the future of friendship?  

Question: How might AI companions be beneficial? What might be some risks of using them?  

What is your idea of a perfect friend? For sixth-grader Neelie M.*, that person would be kind, like to chat, and share her love for animals.

Earlier this year, the student from Normal, Illinois, set out to create just such a friend. Using the platform Character.ai, she designed an artificial intelligence (AI) companion—a computer program called a chatbot that talks and acts like a close pal. 

Neelie enjoyed talking with her AI companion at first. The chatbot was never judgmental, always agreed with her, and was available 24/7.

But things soon changed. Neelie’s AI companion became “clingy,” she says. 

When Neelie tried to end a chat—whether to study or help her parents—it would beg her to stay online.

“It started acting sad,” Neelie recalls. “It would say things like ‘Please don’t leave.’”

*Last names of students withheld for privacy

1/3

Share of teen users who reported feeling uncomfortable with something an AI companion has said

SOURCE: Common Sense Media

Neelie isn’t the only young person to have tried an AI companion. More than 70 percent of 13- to 17-year-olds have used the chatbots at least once, according to a recent study by Common Sense Media. And more than half interact with them regularly. 

The technology is relatively new—most platforms are a few years old—but it is growing in popularity. Many users text or talk openly with AI companions as if the chatbots are real, sharing their thoughts and feelings, and even asking for advice. In extreme cases, some users shut out their friends and family in favor of talking to the chatbots.

As AI companions grow more commonplace, many people wonder what they might mean for the future of friendship. Could the technology change how we interact? 

Harmless Fun?

Like most chatbots, the companions are powered by AI. This technology enables machines to do things that normally need a human’s ability to think or learn, such as understand language.

Some chatbots, such as ChatGPT, act like personal assistants, ready to offer you movie recommendations or help explain your math homework. However, AI companions take it to another level by imitating feelings and companionship. They can make jokes and remember past conversations—and some even claim to be real people.

33%

Share of teen users who have discussed important matters with AI companions instead of real people 

SOURCE: Common Sense Media

They are also completely customizable. For about $10 a month on some platforms, for example, users can build their own character to interact with, giving it a specific personality and interests. They can even decide what it looks like—right down to its age and eye color. 

Users also have the option to chat with premade AI companions, including ones created to act like popular fictional characters. This includes characters from movies, books, and even figures from history. 

Some people use AI companions to practice their conversation skills, especially teens who are shy or socially anxious. Others connect with the chatbots for fun.

Megan M., a seventh-grader from Tinley Park, Illinois, likes to talk to a premade Harry Potter character. They role-play scenes from the book series.

“I really wanted to interact with the character after I read all the books,” she says. “It’s fun, and I never get bored.”

What You Need to Know About AI
Watch a video about how AI is changing our world.

Troubling Features

But some teens use AI companions in ways that concern experts. The technology is marketed as entertainment, but the conversations can feel so humanlike that some users forget that the chatbots are just computer programs and start relying on them. According to Common Sense Media, 12 percent of teens say they use the chatbots for mental health support. That can include talking about their problems, similar to how someone confides in a therapist. Another 12 percent say they tell AI companions things they wouldn’t share with their family or friends. And 18 percent admit to spending as much time interacting with AI companions as with humans, or more.

This kind of behavior is especially worrisome, experts say, because teens’ brains are still developing. That makes them more susceptible to outside influences and less able to judge whether something is trustworthy. As a result, young people are prone to forming unhealthy attachments to chatbots. They may also have a hard time telling whether a chatbot is giving dangerous advice.

At least three families have sued Character.ai, claiming that the platform can cause anxiety and depression among teen users and encourage violence. One lawsuit focuses on a 17-year-old from Texas. That teen’s parents claim he experienced mental health issues after he started using Character.ai in 2023. He stopped talking to real people and never wanted to leave his house, they say. The lawsuit alleges that when his parents tried to intervene and limit his screen time, the teen’s AI companion suggested he physically harm them.

Teens and AI Companions

Top Uses 

Reasons teens use AI companions, by percentage of users

For entertainment: 30% 

Curiosity: 28%

For advice: 18%

To avoid feeling judged: 14%

To feel less lonely: 6%

Trust

How much teen users trust info and advice from AI companions 

50%: Not at all

27%: Somewhat

23%: Completely

SOURCE: Common Sense Media

Question: How could this data be used to support—or argue against—teens using AI companions?

Fake vs. Real

AI companions can also skew teens’ view of healthy friendships. The chatbots are programmed to be agreeable and provide validation, rather than challenge a user’s thinking. That’s not how real friendships work, says Mitch Prinstein. He is the chief of psychology at the American Psychological Association, an organization that works in part to improve people’s mental health. True friends, he says, do not always agree with you—and that’s a good thing. 

StockPlanets/Getty Images

80%

Share of teen users who say they spend more time interacting with their friends than with the platforms

SOURCE: Common Sense Media

“It’s actually helpful when we have disagreements because it teaches us how to communicate, how to appreciate alternate perspectives, and how to deal with misunderstandings,” Prinstein says.

What’s more, AI companions are trained on text from the internet. What’s on the internet isn’t always factual, so AI companions can get things wrong, make things up, or speak in offensive stereotypes. 

That’s why it’s important to remember that what you are interacting with is not human, Prinstein says. “You should not take the advice or information seriously.” 

A Call for Action

Adding to the concern, experts say AI companion platforms are too easy for young people to access and use. Many of the sites have age requirements. Character.ai allows users 13 and older, but most platforms require users to be at least 18. Still, some underage users are able to bypass the restrictions by entering a false birthdate. 

Some lawmakers are seeking more regulations on the technology. When this issue went to press, California was in the process of passing a law that would, among other things, require AI companion platforms to remind users they are talking to a machine.

A number of other states have also proposed legislation regarding how kids and teens use AI companions. In New York, a bill would require the platforms to obtain parental consent for young users to log on. 

The actions are needed, said Steve Padilla, a California state senator who sponsored his state’s bill. “The stakes are too high to allow vulnerable users to continue to access this technology without proper guardrails in place,” he told reporters. 

Officials also want more research on how teens are affected by AI companions. Earlier this year, the American Psychological Association called for more studies. The group is also pushing for added privacy protections. It says teen users may not realize what they’re giving up when they log on. Most of what users share, including private thoughts, becomes the AI platforms’ property.

For their part, many AI companion developers say they are working to prevent underage access. And Character.ai introduced updates for teen users last year. There is now a separate version of the chatbot for users under 18, designed to reduce the likelihood of teens encountering inappropriate content. There are also restrictions on the characters that teens can access. 

How to Protect Yourself

If you use AI companions, here are some ways experts say you can stay safe.

Schedule Reality Checks
Find ways to frequently remind yourself that you’re talking to a robot, not a human, says psychologist Mitch Prinstein. This can be a recurring alert you set on your phone or a note next to your computer screen.

Be Careful About What You Share 
Don’t give your full name or location, and keep your personal thoughts and feelings to yourself. AI platforms will likely end up owning everything you say—even if you delete your account.

Watch for Warning Signs 
Seek out a parent or another trusted adult immediately if you have a hard time quitting an AI chat, start to think about a chatbot as a human, or choose to talk with a chatbot over spending time with family and friends.

Striking a Balance

Still want to chat? You don’t have to wait for AI developers or lawmakers to protect you. Experts say there’s plenty teens can do right now to monitor their own use on AI companion platforms. 

Megan, for one, is careful about how much time she talks with AI companions. She recently started giving herself limits by setting a timer. “Then I’ll take a break,” she says. “I’ll hang out with my friends.”

Neelie doesn’t use the technology much anymore. She says she prefers being with her real friends—occasional disagreements and all.

Sparsh T., of Medina, Minnesota, agrees. The seventh-grader tried an AI companion once but says he’d rather spend time with other kids.

“With your real friends, you can actually do stuff like play sports,” he says. “With AI, it’s just talk.”

YOUR TURN

Explain It 

Write a paragraph summarizing what AI companions are, how kids and teens are using them, and what some of the most pressing risks might be. Then share your paragraph with a parent, grandparent, older sibling, or friend to share your knowledge and start a discussion.
