Judge not ready to rule on whether AI outputs are speech
Google and Character Technologies also moved to dismiss the lawsuit based on First Amendment claims, arguing that C.AI users have a right to listen to chatbot outputs as supposed “speech.”
Conway agreed that Character Technologies can assert the First Amendment rights of its users in this case, but “the Court is not prepared to hold that the Character.AI LLM’s output is speech at this stage.”
C.AI had tried to argue that chatbot outputs should be protected like speech from video game characters, but Conway said the companies did not meaningfully advance that analogy. Garcia’s team had pushed back, noting that video game characters’ dialogue is written by humans, while chatbot outputs are simply the result of an LLM predicting which word should come next.
“Defendants fail to articulate why words strung together by an LLM are speech,” Conway wrote.
As the case advances, Character Technologies will have a chance to beef up the First Amendment claims, perhaps by better explaining how chatbot outputs are similar to other cases involving non-human speakers.
C.AI’s spokesperson provided a statement to Ars, suggesting that Conway seems confused.
“It’s long been true that the law takes time to adapt to new technology, and AI is no different,” C.AI’s spokesperson said. “In today’s order, the court made clear that it was not ready to rule on all of Character.AI’s arguments at this stage and we look forward to continuing to defend the merits of the case.”
C.AI also noted that it now provides a “separate version” of its LLM “for under-18 users,” along with “parental insights, filtered Characters, time spent notification, updated prominent disclaimers, and more.”
“Additionally, we have a number of technical protections aimed at detecting and preventing conversations about self-harm on the platform; in certain cases, that includes surfacing a specific pop-up directing users to the National Suicide and Crisis Lifeline,” C.AI’s spokesperson said.
If you or someone you know is feeling suicidal or in distress, please call the Suicide Prevention Lifeline number, 1-800-273-TALK (8255), which will put you in touch with a local crisis center.