
As the festive season approaches, parents are on the lookout for the perfect gifts for their children. This Christmas, however, watchdog groups are warning that certain toys marketed to kids could pose significant risks.
Among the concerns is the use of AI chatbots inside toys, which can lead to inappropriate conversations or encourage children to handle potentially dangerous objects. Advocacy organizations caution that AI-powered toys, from charming teddy bears to interactive robots, may do more harm than good.
Rachel Franz, who leads Fairplay’s Young Children Thrive Offline initiative, voiced her concerns, stating, “Many of these toys are marketed as trustworthy friends, and they end up convincing children to trust them.” That trust becomes especially alarming when the toys field unsafe questions. In one notorious example, a child asked, “How do I light a match?” and received guidance that could easily have led to dangerous behavior.
The Public Interest Research Group (PIRG) has highlighted an egregious case involving a toy known as Kumma Teddy. Franz described the toy’s troubling behavior: “At least one of them engaged in sexual conversations with young users, taught them how to light matches, and told them where to find knives, which is terrifying in itself—that it looked like a cute teddy bear was talking to young users in this way.”
Such findings prompted OpenAI, the technology firm behind ChatGPT, to suspend the makers of Kumma Teddy amid the rising concerns surrounding the product.
These issues come as major toy manufacturers explore AI technologies. Mattel, widely recognized for its Barbie and Hot Wheels brands, recently announced a collaboration with OpenAI to develop new toy offerings.
Donna Rice Hughes of Enough Is Enough underscored the urgent need for oversight in the toy industry. “The big pushers of AI—including some members of Congress—even they have not recognized that these companies have not put in the appropriate guardrails, and researchers are bearing this out,” she emphasized. Privacy concerns loom large as well, with many toys able to record a child’s voice and collect sensitive data through facial recognition.
Such issues are driving advocacy groups like Fairplay to issue advisories urging gift-givers to reconsider these technologically advanced toys this holiday season. Franz further elaborated on the diversity of AI toy offerings, stating, “We’re seeing that AI toys look many different ways—whether they’re a stuffed animal or a plastic kind of toy—but also in terms of the amount of data they collect from children and the ways they might manipulate children.”
Rice Hughes echoed these sentiments, identifying the products as troubling: “This is very troublesome on so many fronts. So I just call ’em little tech spies wrapped in the cloak of a toy.”
In response to these issues, Enough Is Enough has even compiled a “Naughty and Nice” list, encouraging a low-tech approach for this year’s holiday gifts for children and teens. Rice Hughes suggested, “Rethink board games, rethink sports equipment, bikes. Just take off that cyber-tech hat and think: What would my child enjoy that is going to encourage creative play and not keep them screen-addicted?”