- A lawsuit against Elon Musk's xAI alleges the Grok chatbot made sexualized images of three minors.
- The complaint accuses xAI of profiting from the "sexual predation of real people."
- Grok sparked massive backlash earlier this year after it was used to generate sexualized images.
A new lawsuit against Elon Musk's xAI alleges that its flagship chatbot, Grok, was used to create sexualized deepfake images of three minors — content the complaint says amounts to child sexual abuse material.
The proposed class action, filed Monday in a California federal court, accuses the AI startup of profiting from the "sexual predation of real people, including children."
"Nearly all the companies creating, marketing, and selling AI recognized the dangers of such a tool and chose to enact industry-standard guardrails that would prevent the use of their products by one extremely dangerous group: child sex predators. XAI did not," the lawsuit says.
Representatives for Musk and xAI did not immediately respond to a request for comment from Business Insider.
Musk previously said in a January post on X that he was "not aware of any naked underage images generated by Grok. Literally zero."
Grok generates images based on user prompts, he wrote.
"When asked to generate images, it will refuse to produce anything illegal, as the operating principle for Grok is to obey the laws of any given country or state," Musk said. "There may be times when adversarial hacking of Grok prompts does something unexpected. If that happens, we fix the bug immediately."
The lawsuit against xAI says that the Tennessee plaintiffs are "three of the minor victims of xAI's knowing production, possession, and distribution of AI-generated child sexual abuse material" depicting them.
The plaintiffs, identified in the court papers only as Jane Doe 1, Jane Doe 2, and Jane Doe 3, allege that xAI's AI tools were used to make nude images and videos of them. Jane Doe 1, now an adult, was a minor at the time of the alleged incidents, while the two other plaintiffs are still minors, according to the lawsuit.
In December 2025, Jane Doe 1 received a message from an anonymous Instagram account warning her that "pics" of her had been generated by someone she knew and spread across the group-chat platform Discord.
"Through a series of messages, the anonymous user went on to explain that the perpetrator had uploaded a folder of image and video files depicting her and other minor females to Discord," the lawsuit says.
The anonymous user eventually sent Jane Doe 1 several sexualized AI-generated images and videos of her and other minor girls, according to the lawsuit.
"At least five of these files, one video and four images, depicted her actual face and body in settings with which she was familiar, but morphed into sexually explicit poses," the lawsuit says. "The images showed her entire body, including her genitals, without any clothes. The video depicted her undressing until she was entirely nude."
Jane Doe 1 alerted the other minors in the images and their families, and a criminal investigation was opened in Tennessee, according to the lawsuit. It added that local police arrested a suspect in connection with the case in December 2025.
Last month, Jane Doe 2 learned through the investigation that at least two of her images had also been used to produce sexually explicit AI-generated content using xAI's tools, the lawsuit says.
Local law enforcement told the girl and her mother that one image taken of the girl on the beach in a blue bikini had been "morphed to depict her without any clothes," the lawsuit says.
Authorities also informed Jane Doe 3 that the AI-generated images recovered from the suspect's phone included one that had been "morphed to depict her fully nude," according to the lawsuit.
Vanessa Baehr-Jones, an attorney for the plaintiffs, told Business Insider that her clients have endured a "nightmare."
"Their CSAM images and videos depicting them when they were minors are now forever out there on the internet in these dark net worlds of child sex predators," Baehr-Jones said. "The harm from that is acute."
The attorney said she hopes the lawsuit brings "accountability" for xAI and that "most importantly, this should never happen to any other victim."
The lawsuit seeks unspecified damages and accuses xAI of production with the intent to distribute child pornography, distribution of child pornography, and possession of child pornography, among other claims.
"In our legal system, money is the way we make corporations pay for the harms that they have caused," Annika Martin, another lawyer for the plaintiffs, said. "Because corporations are profit-seeking entities, hitting them in the wallet is the only way to influence their decision-making."
Earlier this year, xAI's Grok sparked massive backlash after the AI image generator was used to make nonconsensual sexualized images of real people.
In response, X, the social media site that Musk sold to xAI in March, said Grok would no longer be able to generate AI images of real people in sexualized or revealing clothing.
Ashley St. Clair, who gave birth to one of Elon Musk's sons in 2024, sued xAI in January, alleging that Grok generated sexually explicit deepfakes of her at users' request.
French authorities are also investigating Grok over sexualized deepfakes.