Jack Clark, a cofounder of Anthropic, says “a lot” of parents will want an AI teddy bear to help entertain their kids — himself included.
“I think most parents, if they could acquire a well-meaning friend that could provide occasional entertainment to their child when their child is being very trying, they would probably do it,” he said on an episode of the Conversations with Tyler podcast that posted last week.
AI tools for kids’ entertainment are already here — including a Grimes-backed stuffed rocket ship that kids can chat with and ask questions of, and a storytelling bear that uses artificial intelligence to generate narratives.
While Clark wasn’t explicitly talking about those, he said he’d be supportive of toys with expanded capabilities — “smart AI friends” that could interact with children on the same level as someone in their age group.
“I am annoyed I can’t buy the teddy bear yet,” said Clark, who served as policy director at OpenAI for two years before moving to Anthropic.
Clark said he doesn’t think he’s alone, either — as soon as children show a need to socialize, parents look for ways to get them interacting with their peers. An AI companion, he said, could be an addition rather than a substitute.
“I think that once your lovable child starts to speak and display endless curiosity and a need to be satiated, you first think, ‘How can I get them hanging out with other human children as quickly as possible?'” he said, adding that he’s also placed his child on a preschool waitlist.
He’s especially wished for the help of an AI tool while doing chores, he added.
“I’ve had this thought, ‘Oh, I wish you could talk to your bunny occasionally so that the bunny would provide you some entertainment while I’m putting the dishes away, or making you dinner, or something,'” Clark said. “Often, you just need another person to be there to help you wrangle the child and keep them interested. I think lots of parents would do this.”
Not all tech leaders agree — Sam Altman, CEO of OpenAI and father as of February, says he doesn’t want his son’s best friend to be a bot.
“These AI systems will get to know you over the course of your life so well — that presents a new challenge and level of importance for how we think about privacy in the world of AI,” Altman said while testifying before the Senate last week.
A paper from researchers at Microsoft and Carnegie Mellon University found that AI used “improperly” by knowledge workers could lead to the “deterioration of cognitive faculties” — and students frequently use AI to “help” them with their assignments. But some research shows children can be taught early on to work alongside AI rather than depend on it entirely.
Clark is an advocate for measured exposure — he said removing a hypothetical AI friend from a kid’s life entirely could lead to an unhealthy relationship with the technology later in life. If a child starts to prefer their AI companion over their human friends, he said, it’s up to their parents to reorient them.
“I think that’s the part where you have them spend more time with their friends, but you keep the bunny in their life because the bunny is just going to get smarter and be more around them as they grow up,” he said. “If you take it away, they’ll probably do something really strange with smart AI friends in the future.”
As with any other technology that’s meant to provide entertainment, Clark said, it’s ultimately up to parents to regulate their child’s use.
“We do this today with TV, where if you’re traveling with us, like on a plane with us, or if you’re sick, you get to watch TV — the baby — and otherwise, you don’t, because from various perspectives, it seems like it’s not the most helpful thing,” he said. “You’ll probably need to find a way to gate this. It could be, ‘When mom and dad are doing chores to help you, you get the thing. When they’re not doing chores, the thing goes away.'”
Clark did not immediately respond to a request for comment from Business Insider.