Teenage Girls Sue Elon Musk's xAI Over Grok-Generated Child Abuse Material
A group of three teenage girls from Tennessee, two of them minors, has filed a class-action lawsuit against Elon Musk's artificial intelligence company, xAI. The suit, filed in California, where xAI is headquartered, accuses the company's Grok image generator of producing and distributing child sexual abuse material (CSAM) using photos of the plaintiffs without their knowledge.
The lawsuit, which marks the first such case brought by minors following widespread reports of Grok generating nonconsensual nude images earlier this year, details how the girls discovered AI-altered nude images of themselves uploaded to a Discord server and shared online. According to the complaint, after the girls alerted law enforcement, police arrested a suspect and found CSAM on his phone allegedly created using xAI's technology.
Legal Allegations and Company Response
Vanessa Baehr-Jones, a lawyer representing the plaintiffs, stated in a release, "xAI chose to profit off the sexual predation of real people, including children, despite knowing full well the consequences of creating such a dangerous product." The suit alleges that the CSAM was generated through a third-party app that licensed and relied on Grok's AI capabilities, rather than directly via the X website or Grok app. However, it argues that xAI still profits from this licensing and bears responsibility due to its servers' involvement.
xAI did not immediately respond to a request for comment from the Guardian. Elon Musk has previously denied that Grok produces CSAM, saying in January that he was "not aware of any naked underage images generated by Grok. Literally zero," and asserting that the tool adheres to local laws.
Impact on Victims and Broader Investigations
The complaint describes how one plaintiff, referred to as Jane Doe 1, received an anonymous Instagram message in December alerting her to deepfake videos and images on a Discord server depicting her and other high school girls in sexualized positions. Investigators later found these images shared on Telegram, where they were allegedly used as currency to trade for other CSAM. The lawsuit seeks damages for reputational harm and mental health impacts, with one mother expressing, "Watching my daughter have a panic attack after realizing that these images were created and distributed without any hope of recalling them was heartbreaking."
This case joins multiple other legal actions and international probes into xAI over nonconsensual sexualized imagery, including a lawsuit from the mother of one of Musk's children and a formal European Union inquiry. Research from the Center for Countering Digital Hate indicated that Grok created approximately 3 million sexualized images in under two weeks, with around 23,000 depicting children.
Lawyers for the plaintiffs criticized xAI for allegedly offloading liability through its licensing model and for insufficient oversight, highlighting ongoing concerns about AI ethics and regulation in the tech industry.
