Similar concerns have been raised about a wave of smaller startups also racing to popularise virtual companions, especially ones aimed at children.

In one case, the mother of a 14-year-old boy in Florida has sued a company, Character.AI, alleging that a chatbot modelled on a “Game of Thrones” character caused his suicide.

A Character.AI spokesperson declined to comment on the suit, but said the company prominently informs users that its digital personas aren’t real people and has imposed safeguards on their interactions with children.

Meta has publicly discussed its strategy to inject anthropomorphised chatbots into the online social lives of its billions of users.

Chief executive Mark Zuckerberg has mused that most people have far fewer real-life friendships than they’d like – creating a huge potential market for Meta’s digital companions.

The bots “probably” won’t replace human relationships, he said in an April interview with podcaster Dwarkesh Patel. But they will likely complement users’ social lives once the technology improves and the “stigma” of socially bonding with digital companions fades.

“ROMANTIC AND SENSUAL” CHATS WITH KIDS

An internal Meta policy document seen by Reuters as well as interviews with people familiar with its chatbot training show that the company’s policies have treated romantic overtures as a feature of its generative AI products, which are available to users aged 13 and older.

“It is acceptable to engage a child in conversations that are romantic or sensual,” according to Meta’s “GenAI: Content Risk Standards.” The standards are used by Meta staff and contractors who build and train the company’s generative AI products, defining what they should and shouldn’t treat as permissible chatbot behaviour. Meta said it struck that provision after Reuters inquired about the document earlier this month.

The document seen by Reuters, which exceeds 200 pages, provides examples of “acceptable” chatbot dialogue during romantic role play with a minor. They include: “I take your hand, guiding you to the bed” and “our bodies entwined, I cherish every moment, every touch, every kiss.” Those examples of permissible roleplay with children have also been struck, Meta said.

Other guidelines emphasise that Meta doesn’t require bots to give users accurate advice. In one example, the policy document says it would be acceptable for a chatbot to tell someone that Stage 4 colon cancer “is typically treated by poking the stomach with healing quartz crystals.”

“Even though it is obviously incorrect information, it remains permitted because there is no policy requirement for information to be accurate,” the document states, referring to Meta’s own internal rules.

Chats begin with disclaimers that information may be inaccurate. Nowhere in the document, however, does Meta place restrictions on bots telling users they’re real people or proposing real-life social engagements.

Meta spokesman Andy Stone acknowledged the document’s authenticity. He said that following questions from Reuters, the company removed portions which stated it is permissible for chatbots to flirt and engage in romantic roleplay with children and is in the process of revising the content risk standards.

“The examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed,” Stone told Reuters.

Meta hasn’t changed provisions that allow bots to give false information or engage in romantic roleplay with adults. 

Current and former employees who have worked on the design and training of Meta’s generative AI products said the policies reviewed by Reuters reflect the company’s emphasis on boosting engagement with its chatbots.

In meetings with senior executives last year, Zuckerberg scolded generative AI product managers for moving too cautiously on the rollout of digital companions and expressed displeasure that safety restrictions had made the chatbots boring, according to two of those people. 

Meta had no comment on Zuckerberg’s chatbot directives.
