Ripple CTO David Schwartz opposes legal action against Character.AI, saying it lacks a basis under U.S. law. In a post on X, Schwartz emphasized that while he is not defending Character.AI on moral grounds, the legal arguments against the company are flawed.
He argued that Character.AI's output is protected by the First Amendment. The company's chatbot platform produces expressive content, which is protected unless it falls into one of the narrow categories of unprotected speech, such as incitement or direct threats.
Schwartz said the lawsuit's central claim is that Character.AI was reckless in designing the platform through which it generates speech. "Any argument that the expression of protected speech is reckless, dangerous, or 'defective' is completely incompatible with free speech," he said.
He compared the situation to previous moral panics over new forms of media, arguing that the legal challenge to Character.AI echoes past disputes over video games, comic books, and other expressive content. He emphasized that regulating how speech is selected and presented would conflict with constitutional rights.
The lawsuit was reportedly filed by the mother of 14-year-old Sewell Setzer III, accusing Character.AI of negligence, wrongful death, deceptive trade practices, and product liability. It claims that despite being marketed to minors, the platform is "too dangerous" and lacks adequate safety measures.
Character.AI founders Noam Shazeer and Daniel De Freitas, as well as Google, which acquired the company's leadership team in August, are also named in the lawsuit. The plaintiff's lawyer claims that the platform's anthropomorphic AI characters and chatbots offering "unlicensed psychotherapy" were among the causes of Setzer's death.
The company has since updated its safety protocols, including new age-based content filters and improved detection of harmful user interactions. (CoinGape)