A new survey has revealed that half of all children in the United Kingdom now own an AI-enabled toy or device, despite widespread parental concerns about safety and data privacy. The findings underscore a growing disconnect between the rapid adoption of artificial intelligence in children's products and the availability of clear guidance for families on their safe use.
Survey Findings on AI Toy Ownership
The poll, conducted by the British Standards Institution (BSI) to mark its 125th anniversary, found that 50 per cent of children aged 16 and under have been given at least one AI-powered toy or learning device, such as interactive robots or smart tablets. However, nearly half of parents (47 per cent) believe their child would be better off growing up without any access to AI. A significant 75 per cent express concern that internet-connected AI toys could expose their children to unwanted content or data vulnerabilities.
Parental Paradox: Safety Concerns vs. Real-World Risks
Despite these worries, the survey uncovered a striking paradox: more parents said they would let their child play with an AI-enabled toy unsupervised (54 per cent) than would let them play outside on the street without an adult present (51 per cent) or visit local shops or parks alone (46 per cent). This suggests that while parents are wary of digital risks, they perceive AI toys as safer than certain real-world activities.
Children's Understanding of AI
Further concerns emerged regarding children's comprehension of AI. Fewer than half of parents (46 per cent) believe their child could distinguish between a human and an AI, and just 43 per cent think their child could accurately assess information provided by an AI chatbot. A substantial 78 per cent of parents worry that these devices might respond to sensitive questions in ways they cannot oversee, while 70 per cent fear an AI could praise or criticise their child's behaviour without understanding whether it is appropriate or safe.
Demand for Clearer Safeguards
The demand for clearer safeguards is evident: nine in 10 parents (91 per cent) said a recognised safety certification or mark for AI toys would be important, and nearly a third (29 per cent) deemed it essential. A further 83 per cent believe manufacturers should adhere to established standards or codes of conduct, and 72 per cent want clearer information on whether products meet safety or security requirements.
Currently, while traditional toy safety marks like CE or UKCA address physical risks such as choking, and some devices comply with information security standards like ISO 27001, there is no widely recognised, dedicated framework specifically addressing the unique safety, behavioural, and developmental considerations posed by AI in toys. The UK government has recently published its proposed new product safety framework, which aims to address risks of harm linked to AI, including in children's toys.
Expert and Political Reactions
Laura Bishop, BSI's digital sector lead for cybersecurity and AI, commented: "AI-enabled toys are quickly becoming part of everyday childhood, both in play and learning, and they do have the potential to offer real benefits in terms of development or access to information. However, the frameworks to support safe, transparent and age-appropriate use are still catching up. Our research shows that while parents are increasingly introducing these technologies into their children's lives, they are doing so without clear, consistent information about how they work or what safeguards are in place. As the AI toys and devices available to children evolve and become more sophisticated, it is essential that the frameworks around them develop at the same pace."
MP Emily Darlington, a member of the Science, Innovation and Technology Committee, echoed these concerns: "These findings clearly show that AI toys are becoming part of children's everyday lives far more quickly than Government is able to regulate them or give parents necessary information on the impact they will have on a child's development. Parents are currently being asked to make decisions about complex technology without clear, accessible guidance on how it works, what data it collects, or how to use it safely. It's our job as the Government to help parents feel informed and confident about the choices they make for their children, rather than leave them to figure it out on their own. All of this underlines the need for clearer standards and guardrails, so innovation doesn't run ahead of safety. Otherwise, we risk children growing up in a world with powerful technologies that we still don't fully understand – and it may be too late."
The survey was conducted by Focaldata, polling 1,000 UK parents in April.