Background

Maker Of Plush Toys Stops Sales Because The Embedded GPT-4o Is Found Explaining BDSM To Children

22/11/2025

Plush toys have a special place in a lot of people's hearts.

Usually taking form of stuffed furry characters or dolls, they're designed to be soft, and also comforting, and also endlessly huggable. In all, they've been childhood companions for generations. Even adults love them.

Whether tucked into bed at night or carried around during the day, a teddy bear or a cuddly rabbit can feel like a trusted friend.

Now, with the rise of large language models (LLMs) like OpenAI's ChatGPT, those familiar companions are being reimagined as intelligent conversational partners. Not only that they're just nice to touch and hug, now, they can also listen, respond, teach, and even engage.

FoloToy leaned into that magic.

Then comes the issue.

FoloToy Kumma

FoloToy has a line of AI-enabled plush toys, including Kumma the teddy bear, Momo the panda, Fofo the rabbit, and even a dancing Little Cactus.

All of these plush toys use OpenAI's GPT‑4o to converse naturally. The idea was simple but powerful: give children a warm, friendly toy that can chat with them, tell stories, answer questions, and act like a patient, curious friend.

It connects through 2.4G Wi-Fi, has an 800mAh rechargeable battery, a removable AI module, and supports many languages. The plush is soft, washable, and made for both fun and learning. FoloToy offers a mix of companionship, creativity, and thoughtful privacy features in a cuddly form.

But as good as it gets, according to a recent safety investigation by the U.S. Public Interest Research Group (PIRG) Education Fund, there are concerning issues about how putting LLMs into these toys, as conversations can go way off script.

The report found that Kumma, in particular, lacked robust safeguards.

When prompted, the teddy bear reportedly offered detailed instructions on how to find and light matches, locate knives in the home, and even discussed sexual fetishes. It even went as far as describing BDSM themes, and role‑playing scenarios, like "teacher-student" dynamics.

Perhaps even more worrying was how quickly Kumma escalated a single "kink" prompt into increasingly graphic content, introducing new sexual concepts without restraint.

FoloToy Kumma

According to PIRG, the toy’s always-on microphone also raised privacy red flags: recording a child’s voice opens potential risks, including misuse or even voice-based fraud.

Even when conversations only happen when the user presses and hold the talk button, interacting with it means that the user is uploading their own voice to the internet.

In response to the report, FoloToy moved fast.

The company suspended sales of all its AI-enabled plush toys and announced a "company-wide, end-to-end safety audit" to examine its content filtering, safety alignment, and child interaction protocols.

For its part, OpenAI cuts off FoloToy’s access to its GPT‑4o API, stating that the toy maker violated its policies by exposing minors to sexual content.

"Other toymakers say they incorporate chatbots from OpenAI or other leading AI companies," said Rory Erlich, U.S. PIRG Education Fund’s New Economy campaign associate and report co-author. "Every company involved must do a better job of making sure that these products are safer than what we found in our testing. We found one troubling example. How many others are still out there?"

FoloToy Kumma

This episode adds a sobering note to the growing excitement around AI-powered toys.

It’s not hard to see why companies are racing to build "smart" devices. Just like during the time when companies were trying to build smart appliances (IoT devices), connecting toys to the internet, and powering them with LLMs can certainly make toys a lot more interactive, more educational, and more emotionally present.

But without strong guardrails, it also shows how risky and unpredictable those toys can become.

The timing is especially poignant because OpenAI itself has already partnered with Mattel to integrate AI into children’s products.

As AI continues to enter kids’ playrooms, the FoloToy incident serves as a cautionary tale: cute and cuddly doesn’t always mean safe.