Three Tennessee teenagers have filed a class action lawsuit against Elon Musk’s artificial intelligence company, xAI, alleging its large language model powered an app that was used to make nonconsensual nude and sexually explicit images and videos of them when they were girls.
“Like a rag doll brought to life through the dark arts, this [AI-generated] child can be manipulated into any pose, however sick, however fetishized, however unlawful. To the viewer, the resulting video appears entirely real,” reads the complaint. “For the child, her identifying features will now forever be attached to a video depicting her own child sexual abuse.”
While the perpetrator did not use xAI’s chatbot Grok or the social platform X (also owned by Musk), the lawsuit, citing law enforcement, says he relied on an unnamed app built on xAI’s model. The plaintiffs accuse xAI of deliberately licensing its technology to app makers, often outside the U.S., allowing the company to “outsource the liability of their incredibly dangerous tool,” the complaint says.
This is the first lawsuit against xAI brought by underage people depicted in child sexual abuse material allegedly generated by its model. xAI’s image-generation tools have been implicated in producing millions of sexualized images over the past year. Influencer Ashley St. Clair, who has a child with Musk, sued the company earlier this year over AI-produced images on X depicting her nude as a teenager.
According to the complaint, the perpetrator had a “close and friendly relationship” with one plaintiff and used photos she sent, plus images obtained from a yearbook and social media, to create the sexualized images and videos. One video allegedly showed a plaintiff “undressing until she was entirely nude.” The plaintiffs said the material was disturbingly lifelike and was not labeled as AI-generated.
The complaint alleges the perpetrator also made sexually explicit material depicting 18 other people and traded those images online; he has since been arrested, the suit says.
Plaintiffs’ attorney Vanessa Baehr-Jones said the teenagers, identified as Jane Does 1, 2 and 3, want to change how AI companies make business decisions about sexually explicit content: “We want to make it one [a business decision] that does not make any business sense anymore,” she said. The plaintiffs are seeking damages for emotional distress and other harms caused by the images.
Apps with “nudifying” functions have existed for years, but last year major AI companies including Google, OpenAI and xAI updated image-generation tools to allow users to strip people down to bikinis. Google and OpenAI include digital watermarks disclosing AI origin; xAI has not adopted that standard. xAI did not respond to a request for comment.
