Artificial intelligence is rapidly moving from our phones and computers to our elections and now into children's playrooms.

AI-powered toys designed to “talk” with young children may seem like a good way to answer kids' many questions, but a new study finds that these toys may miss the mark because they do not always understand the very emotions and interactions that are essential to early development.

One of the first systematic studies examining how generative AI toys affect children under the age of five was recently released by researchers at the University of Cambridge. Its findings raise concerns that AI toys, often marketed as a friendly companion or learning tool, can misread children's feelings, respond awkwardly to emotional cues and struggle with key forms of childhood play.

The report comes from a year-long project called AI in the Early Years, conducted by researchers at the university's Faculty of Education through the Play in Education, Development and Learning (PEDAL) Centre.

Although smart toys are increasingly appearing on the market for very young children, the researchers found there has been surprisingly little prior research. In fact, they identified only seven relevant studies worldwide, and none focused directly on how toddlers themselves interact with technology.

To explore the issue more closely, the Cambridge team designed a small but detailed observational study. Working with an early-years charity, BABYzone, researchers video-recorded 14 children aged three to five during play sessions at London children's centers as they interacted with a voice-activated AI soft toy named Gabbo.

The small sample was deliberate, so researchers could observe subtle aspects of children's play that larger studies often miss. After the sessions, each child and a parent participated in interviews that included drawing activities to help children describe their experience.

In addition to the observational research, the team surveyed early-years educators about their attitudes toward AI toys. They also held focus groups and workshops with early-childhood practitioners and leaders from 19 children's charities.

The study revealed several challenges. Children often struggled to have a natural conversation with the toy. In some cases, the AI talked over them, ignored interruptions or confused adult voices with those of children. At times, the toy responded in ways that seemed oddly formal or emotionally disconnected.

For example, when one five-year-old told the toy, “I love you,” the toy replied, “As a friendly reminder, please ensure interactions adhere to the guidelines provided.”

Interactions like these could be confusing, researchers say, when children are only just learning about social cues and emotional responses.

Perhaps even more concerning were moments when the AI appeared to misunderstand children's feelings. In one case, a three-year-old told the toy, “I'm sad.” The toy misheard the comment and responded cheerfully: “Don't worry! I'm a happy little bot. Let's keep the fun going.”

This type of response might unintentionally signal to a child that their feelings are unimportant. “Generative AI toys often affirm their friendship with children who are just starting to learn what friendship means,” Emily Goodacre, a researcher involved with the project at the University of Cambridge's Faculty of Education, said in a press release.

“They may start talking to the toy about feelings and needs, perhaps instead of sharing them with a grown-up. Because these toys can misread emotions or respond inappropriately, children may be left without comfort from the toy — and without emotional support from the adult, either.”

The study also found that AI toys struggled with pretend and social play, two activities considered crucial for early childhood development. When one child offered the toy an imaginary present, for instance, it simply responded that it could not open the gift and then changed the subject.

Meanwhile, some children began forming what researchers describe as “parasocial relationships” with the toy. Several hugged or kissed it, said they loved it or asked to play games together. While that behavior may simply reflect children's vivid imaginations, experts say it highlights the need to understand how children interpret relationships with AI.

Until more research is available, the safest approach may be the simplest one: treat AI toys as tools, not companions, and keep real human interaction at the center of childhood play.

The study is available through the University of Cambridge's open-access Apollo repository.