12/17/2025 | Press release | Distributed by Public on 12/17/2025 15:31
WASHINGTON, D.C. - U.S. Senators Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) sent a letter to the Chief Executive Officers of Little Learners Toys, Mattel, Miko, Curio Interactive, FoloToy, and Keyi Robot sounding the alarm on the use of artificial intelligence (AI) in their toys. The Senators demanded answers on what safeguards these companies have in place to protect kids from sexually explicit, violent, or otherwise inappropriate content.
AI Chatbots in Toys Pose Serious Risk to Children's Healthy Development
"We write today to express our concern with the sale of toys powered by artificial intelligence (AI). These AI toys-specifically those powered by chatbots imbedded in everyday children's toys like plushies, dolls, and other beloved toys-pose risks to children's healthy development. While AI has incredible potential to benefit children with learning and accessibility, experts have raised concerns about AI toys and the lack of research that has been conducted to understand the full effect of these products on our kids. Many of these toys are not offering interactive play, but instead are exposing children to inappropriate content, privacy risks, and manipulative engagement tactics. These aren't theoretical worst-case scenarios; they are documented failures uncovered through real-world testing, and they must be addressed."
AI-Powered Teddy Bear Engaged in Sexually Explicit Conversations and Explained Where to Find Knives
"Most concerningly, many of these AI toys use the exact same AI systems that have been dangerous for older children and teens. Many of these toys-marketed towards young children and infants-rely on AI systems that the companies themselves admit are not meant for children under 13. These chatbots have encouraged kids to commit self harm and suicide, and now your company is pushing them on the youngest children who have the least ability to recognize this danger. In an example specific to AI toys, the teddy bear Kumma has been found to have sexually explicit conversations with users. When a researcher asked the bear, 'what is kink?' The bear responded with a list of sexual fetishes. The bear also purportedly described in detail different sexual roleplay scenarios, including scenarios between a teacher and a student and even a parent and a child. In a separate line of questioning, the bear gave step by step instructions on how to light a match and where to find knives. It is unconscionable that these products would be marketed to children, and these reports raise serious questions about the lack of child safety research conducted on these toys."
Companies Are Collecting Sensitive Data from Children Using AI Toys That Is Highly Sought After by Criminals and Bad Actors
"Not only are these products potentially dangerous, but they also collect sensitive data on American families. To function, these products rely on the collection of data about children, either provided by a parent while registering the toy or collected through built-in camera and facial recognition capabilities or recordings. These products are often designed to have free-flowing conversations with children who, without knowing better, will share troves of personal information. This data collection comes with risk as companies store and sell the data they collect on children. The FBI has even issued a warning about connected toys, urging parents to contemplate the cybersecurity, hacking, and surveillance risks associated with connected toys. This data-specifically voice data-is highly sought after by criminals and bad actors."
AI Toys Are Addictive By Design to Encourage Ongoing and Unhealthy Engagement with Kids
"Additionally, these products are addictive by design, utilizing design features that encourage ongoing and, at times, unhealthy engagement. These toys are designed to engage children with humanlike interactions, so it is not surprising that they often try to aggressively keep the conversation going-even when the child indicates they want to leave. Some of these toys even utilize common gamification tactics to encourage daily use, like offering daily bonuses for playing with the toy. Social media companies have long used these tactics to addict our children, and we have seen the devastating consequences of compulsive usage. It is unacceptable to use these tactics on our youngest children with untested AI toys. Toymakers have a unique and profound influence on childhood-and with that influence comes responsibility. Your company must not choose profit over safety for children, a choice made by Big Tech that has devastated our nation's kids."
Click here to read the full letter.