Kali Hays, Technology reporter

Elon Musk's artificial intelligence company is facing a lawsuit from teenagers who say the company facilitated child pornography by allowing the creation of sexually explicit images of them.
The lawsuit against xAI was filed Monday in a federal court in California by three young women whose images and videos were altered by a Grok user, without their knowledge, to show them nude or in otherwise overtly sexual ways.
Grok is a chatbot developed by xAI and hosted on Musk's social media platform X. xAI did not respond to a request for comment made via its parent company.
The legal action is part of the fallout since last year's controversial release of new Grok features that X called "spicy" mode.
Lawyers for the young women said Grok's ability to alter images and video had been created and released by xAI solely to drive use of the chatbot and X.
They likened the way images of the young women were changed to "a rag doll brought to life through the dark arts".
"xAI—and its founder Elon Musk—saw a business opportunity," the complaint says. "They knew Grok could produce such results, including by using the images and videos of children, and publicly released it anyway."
The young women are seeking unspecified damages, as well as an immediate order barring Grok from creating such images.
"Their lives have been shattered by the devastating loss of privacy, dignity, and personal safety", lawyers for the young women said in their complaint.
Two of the teenagers behind the lawsuit are under the age of 18, but all three are withholding their names from the public to protect their privacy.
One of the young plaintiffs said she learned of the imagery after receiving an anonymous message on Instagram alerting her to images and videos of herself, including her high school yearbook photo, that had been altered to show full nudity and sexually explicit acts.
The material was being shared on a Discord server, a private chat space on that platform, and included similar imagery, also altered using Grok, of at least 18 other young women who were minors, according to the complaint.
The other two women suing xAI also discovered fake sexually explicit imagery of themselves online that was determined to have been created with Grok.
Grok was launched in 2023 by Musk's xAI. The company, along with social media firm X, is now part of Musk's SpaceX company, which took over xAI last month.
Last year, xAI released what it called Grok Imagine or "spicy mode" for Grok, with features that allowed users to prompt it to create fake images that were more sexual in nature.
In less than two weeks, Grok had created millions of sexualized images, including more than 20,000 of children, according to a sampling of the images conducted by the Center for Countering Digital Hate.
Musk initially downplayed Grok's ability to create fake sexualized content, saying in January he was "not aware of any naked underage images generated by Grok. Literally zero," and putting the blame on users of the feature.
"Obviously, Grok does not spontaneously generate images, it does so only according to user requests", Musk wrote on X.
As such abuse of online imagery continued this year, however, UK watchdog Ofcom, the European Commission and California each launched investigations into the feature's ability to create sexualized images of real people, particularly children.
By mid-January, X said that it would implement "technological measures" to stop Grok's ability to undress people in photos.
Eventually, the perpetrator behind the Discord server mentioned in the new lawsuit was arrested. He was not named in the lawsuit but is part of a separate police investigation.
According to the lawsuit, that investigation has discovered that he had hundreds of AI-generated and altered sexual abuse images of minors, which he traded on the messaging platform Telegram and the file-sharing platform Mega.