Tennessee teenagers file lawsuit against Elon Musk’s xAI after AI tool transformed school photos into inappropriate images.

Three teenagers in Tennessee filed a lawsuit against Elon Musk’s xAI this week, alleging that the company’s image-generation technology was used to alter real photos of them into sexually explicit images.

The high school students, who wish to remain anonymous, filed their lawsuit in California, where xAI — Musk’s artificial intelligence company — is based. They are seeking class-action status to represent what the lawsuit describes as thousands of similar victims who were minors when sexually explicit images of them were produced.

The lawsuit alleges that Jane Doe 1 was anonymously informed in December that explicit images of her were being circulated on a social media platform.
“At least five of these files, including one video and four photos, showcased her actual likeness and body in familiar settings, though altered into sexually explicit positions,” the lawsuit asserts. It contends that the individual disseminating the images knew Doe and utilized xAI’s image generation tools to transform her real photos into sexually abusive depictions. One image was derived from a homecoming photograph, and another from a high school yearbook.

The distributor also generated explicit images of at least 18 additional girls, two of whom are co-plaintiffs in the lawsuit. In late December, local law enforcement apprehended the suspect and seized his phone. Authorities discovered that he had uploaded the images to multiple platforms in exchange for other sexually explicit materials involving minors.

Other AI companies block their image generators from producing any form of sexually explicit content, even of adults. Musk, however, saw a market opportunity and promoted xAI’s Grok chatbot as capable of producing “spicy” content, according to the lawsuit. The document further claims that there is currently no way to allow the generation of explicit images of adults while fully blocking such images of children. It also alleges that xAI knew Grok would produce sexually explicit images of minors but released it anyway.

The lawsuit claims that the individual who distributed the images of the plaintiffs used an application that either licensed xAI technology or “otherwise purchased access to Grok, acting as a middleman.”

xAI did not respond to an emailed request for comment from The Associated Press. However, a January 14 post on the social media platform X addressing the situation stated: “We remain committed to making X a safe platform for everyone and have a strict policy against any form of child sexual exploitation, non-consensual nudity, and unwanted sexual content.”

“We take steps to eliminate high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity, and take appropriate measures against accounts violating our X Rules. Accounts attempting to utilize Child Sexual Exploitation materials are reported to law enforcement as required.”

Meanwhile, the students involved in the lawsuit express concern that the images created of them will haunt them indefinitely online. They fear harassment due to their real first names and their school being linked to the files. They are anxious that their friends and classmates may have viewed the realistic-looking photos and videos, and they are worried about who else may see them in the future.

Jane Doe 1 reported experiencing anxiety, depression, and stress, stating that “she has difficulty with eating and sleeping and suffers from recurrent nightmares,” as outlined in the lawsuit. Jane Doe 2 “has started to isolate herself and avoid being on her school campus, even dreading attending her own graduation.” Jane Doe 3 is plagued by a constant fear and anxiety that someone will recognize her face from the AI-generated images, according to the legal documentation.
