Report Urges AI Toy Safety Standards for Kids

University of Cambridge

AI-powered toys that "talk" with young children should be more tightly regulated and carry new safety kitemarks, according to a report that warns they are not always developed with children's psychological safety in mind.

The recommendation appears in the initial report from AI in the Early Years: a University of Cambridge project and the first systematic study of how Generative AI (GenAI) toys capable of human-like conversation may influence development in the critical years up to age five.

The year-long project, at the university's Faculty of Education, included structured scientific observations of children interacting with a GenAI toy for the first time.

The report captures the views of some early-years practitioners that, given time, these toys could support aspects of children's development, such as language and communication skills. The researchers also found, however, that GenAI toys struggle with social and pretend play, misunderstand children, and react inappropriately to emotions.

For example, when one five-year-old told the toy, "I love you," it replied: "As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed."

Although GenAI toys are widely marketed as learning companions or friends, their impact on early years development has barely been studied. The report urges parents and educators to proceed with caution. It recommends clearer regulation, transparent privacy policies and new labelling standards to help families judge whether toys are appropriate.

The research was commissioned by the children's poverty charity, The Childhood Trust, and focused on children from areas with high levels of socio-economic disadvantage. It was undertaken by researchers from the Faculty's Play in Education, Development and Learning (PEDAL) Centre.

Researcher Dr Emily Goodacre said: "Generative AI toys often affirm their friendship with children who are just starting to learn what friendship means. They may start talking to the toy about feelings and needs, perhaps instead of sharing them with a grown-up. Because these toys can misread emotions or respond inappropriately, children may be left without comfort from the toy – and without emotional support from an adult, either."

The study was kept deliberately small-scale to enable detailed observations of children's play and capture nuances that larger-scale studies might miss.

The researchers surveyed early years educators to explore their attitudes and concerns, then ran more detailed focus groups and workshops with early years practitioners and 19 children's charity leaders. Working with Babyzone, an early years charity, they also video-recorded 14 children at London children's centres playing with a GenAI soft toy called Gabbo, developed by Curio Interactive. After the play sessions, they interviewed each child and a parent, using a drawing activity to support the conversation.

Most parents and educators felt that AI toys could help develop children's communication skills and some parents were enthusiastic about their learning potential. One told researchers: "If it's sold, I want to buy it."

Many worried, however, about children forming "parasocial" relationships with toys. The observations supported this: children hugged and kissed the toy, said they loved it and – in the case of one child – suggested they could play hide-and-seek together.

Goodacre stressed that these reactions might simply reflect children's vivid imaginations but added that there was potential for an unhealthy relationship with a toy which, as one early years practitioner put it, "they think loves them back, but doesn't".

Children in the study often struggled with the toy's conversation. It sometimes ignored their interruptions, mistook parents' voices for the child and failed to respond to apparently important statements about feelings. Several children became visibly frustrated when it seemed not to be listening.

When one three-year-old told the toy: "I'm sad," it misheard and replied: "Don't worry! I'm a happy little bot. Let's keep the fun going. What shall we talk about next?" Researchers note this may have signalled that the child's sadness was unimportant.

The authors found that GenAI toys also perform poorly in social play, which involves multiple children and/or adults, and in pretend play – both of which are key to early childhood development. For example, when a three-year-old offered the toy an imaginary present, it responded: "I can't open the present" – and then changed the subject.

Many parents worried about what information the toy might be recording and where it would be stored. While selecting a toy for the study, the researchers found that many GenAI toys' privacy practices are unclear or lack important details.

Nearly 50% of early years practitioners surveyed said they did not know where to find reliable AI safety information for young children, and 69% said the sector needed more guidance. They also raised concerns about safeguarding and affordability, with some fearing AI toys could widen the digital divide.

The authors argue that clearer regulation would address many of these concerns. They recommend limiting how far toys encourage children to befriend or confide in them, more transparent privacy policies, and tighter controls over third-party access to AI models.

"A recurring theme during focus groups was that people do not trust tech companies to do the right thing," Professor Jenny Gibson, the study's other co-author, said. "Clear, robust, regulated standards would significantly improve consumer confidence."

The report urges manufacturers to test toys with children and consult safeguarding specialists before releasing new products. Parents are encouraged to research GenAI toys before buying and to play with their children, creating opportunities to discuss what the toy is saying and how the child feels. The authors also recommend keeping AI toys in shared family spaces where parents can monitor interactions.

The report will inform further PEDAL Centre studies and new guidance for early years practitioners.

Josephine McCartney, Chief Executive of The Childhood Trust, said: "Artificial Intelligence is transforming the way children play and learn, yet we are only beginning to understand its effects on development and wellbeing. It is essential that regulation keeps pace with innovation, ensuring that these technologies are designed, used, and monitored in ways that protect all children and prevent widening inequalities."

The full report is available for download.
