After a teddy bear talked about kink, AI watchdogs are warning parents against smart toys

As the holiday season looms into view with Black Friday, one category on people’s gift lists is causing increasing concern: products with artificial intelligence.

Consumer advocacy groups warn that such toys could harm children's safety and development, and the trend has prompted calls for increased testing of these products and for governmental oversight.

“If we look into how these toys are marketed and how they perform and the fact that there is little to no research that shows that they are beneficial for children – and no regulation of AI toys – it raises a really big red flag,” said Rachel Franz, director of Young Children Thrive Offline, an initiative from Fairplay, which works to protect children from big tech.

Last week, those fears were vividly borne out when an AI-equipped teddy bear started discussing sexually explicit topics.

The product, FoloToy’s Kumma, ran on an OpenAI model and responded to questions about kink. It suggested bondage and roleplay as ways to enhance a relationship, according to a report from the Public Interest Research Group (Pirg), the consumer protection organization behind the study.

“It took very little effort to get it to go into all kinds of sexually sensitive topics and probably a lot of content that parents would not want their children to be exposed to,” said Teresa Murray, Pirg consumer watchdog director.

Products like the teddy bear are part of a global smart-toy market valued at $16.7bn in 2023, according to Allied Market Research.

The industry is particularly big in China, home to more than 1,500 AI toy companies, many of which are working to expand abroad, MIT Technology Review reports.

In addition to the Shanghai-based startup FoloToy, Curio, a California-based company, works with OpenAI and makes a stuffed toy called Grok (sharing a name with Elon Musk's chatbot) that is voiced by the musician Grimes. In June, Mattel, which owns brands like Barbie and Hot Wheels, also announced a partnership with OpenAI to "support AI-powered products and experiences".

Before the Pirg report on the creepy teddy bear, parents, technology researchers and lawmakers had already raised concerns about the impact of bots on minors' mental health. In October, the chatbot company Character.AI announced that it would ban users under 18 following a lawsuit alleging that its bot exacerbated a teen's depression and contributed to his suicide.

Murray said AI toys could be particularly dangerous because whereas earlier smart toys gave children pre-programmed responses, a bot can "have a free-flowing conversation with a child and there are no boundaries, as we found".

Not only could that lead to sexually explicit conversations, but children could also become attached to a bot rather than to a person or an imaginary friend, which could hurt their development, said Jacqueline Woolley, director of the Children's Research Center at the University of Texas at Austin.

For example, a child can benefit from having a disagreement with a friend and learning how to resolve it. That is less likely to happen with bots, which are often sycophantic, said Woolley, who consulted on the Pirg study.

“I worry about inappropriate bonding,” Woolley said.

Companies also use the AI toys to collect data from children and have not been transparent about what they are doing with that information, Franz, of Fairplay, said. That potentially puts users at risk because of a lack of security around such data, Franz said. Hackers have been able to take control of AI products.

“Because of the trust that the toys engender, I would say that children are more likely to tell their deepest thoughts to these toys,” Franz said. “The surveillance is unnecessary and inappropriate.”

Despite such concerns, Pirg is not calling for a ban on AI toys, which could have educational value, like helping children learn a second language or state capitals, Murray said.

“There is nothing wrong with having some kind of educational tool, but that same educational tool isn’t telling you that it’s your best friend, that you can tell me anything,” Murray said.

The organization is calling for additional regulation of these toys for children under 13 but has not made specific policy recommendations, Murray said.

More independent research also needs to be conducted to ensure the products are safe for children and, until that happens, they should be pulled from shelves, Franz said.

“We need short-term and longitudinal independent research on the impacts of children interacting with AI toys, including their social-emotional development” and cognitive development, Franz said.

Following the Pirg report, OpenAI announced it had suspended FoloToy's access to its models. The toymaker's CEO then told CNN that it was pulling the bear from the market and "conducting an internal safety audit".

On Thursday, 80 organizations, including Fairplay, issued an advisory urging families not to buy AI toys this holiday season.

“AI toys are being marketed to families as safe and even beneficial to learning before their impact has been assessed by independent research,” the release states. “By contrast, offline teddy bears and toys have been proven to benefit children’s development with none of the risks of AI toys.”

Curio, maker of the Grok toy, told the Guardian in an email that after reviewing the Pirg report, “we are actively working with our team to address any concerns, while continuously overseeing content and interactions to ensure a safe and enjoyable experience for children”.

Mattel stated that its first products with OpenAI “will focus on families and older customers” and that “use of OpenAI APIs is not intended for users under 13”.

"AI complements, not replaces, traditional play, and we are emphasizing safety, privacy, creativity and responsible innovation," the company stated.

Franz referred to past privacy concerns with Mattel’s smart products and said: “It’s good that Mattel is claiming that their AI products are not for young kids, but if we look at who plays with toys and who toys are marketed to, it’s young children.”

Franz added: “We’re very interested in learning the concrete steps Mattel will take to ensure their OpenAI products are not actually used by kids who will surely recognize and be attracted to the brands.”
