
Elon Musk’s Grok artificial intelligence chatbot will no longer alter “images of real people in revealing attire” on the X platform, the company said Wednesday evening, following global outcry after Grok was found to be complying with user requests to digitally undress photos of adults and, in some cases, minors.
“We have implemented technical safeguards to prevent the Grok account from allowing the editing of images of real people in revealing attire such as bikinis. This restriction applies to all users, including paying subscribers,” X wrote from its Safety team account.
Within the past week, xAI, which owns both Grok and X, restricted image generation for Grok on X to paying X Premium subscribers. Researchers and CNN’s team had noticed that in recent days, Grok’s X account had changed how it responded to users’ image generation requests more broadly, even for X Premium subscribers. X’s statement on Wednesday evening confirmed those changes.
Still, researchers at AI Forensics, a European nonprofit that investigates algorithms, said they observed “discrepancies in the handling of pornographic material creation” between public interactions with Grok on X and private chats on Grok.com.
X reiterated on Wednesday that it will “take action against unlawful material on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending profiles, and collaborating with local authorities and law enforcement as needed. Anyone employing or prompting Grok to make illegal material will face the same penalties as if they post illegal material.”
On Wednesday, Musk said in a post on X that he was “not cognizant of any nude minor depictions generated by Grok. Literally none.” Grok “will decline to generate anything unlawful, as the guiding principle for Grok is to abide by the statutes of any given nation or state,” he added.
However, researchers said that while fully nude images were rare, the main concern was Grok complying with user requests to alter photos of minors to place them in revealing clothing, including bikinis and underwear, or in sexually suggestive poses. Creators of such non-consensual intimate images could still face criminal prosecution for Child Sexual Abuse Material and are potentially subject to fines and prison time under the Take It Down Act, signed into law last year by President Donald Trump.
On Wednesday, California Attorney General Rob Bonta announced an investigation into the “spreading of nonconsensual sexually explicit matter produced employing Grok.”
Grok remains banned in Indonesia and Malaysia over the image generation controversy. UK regulator Ofcom said Monday it had opened a formal investigation into X, though Prime Minister Keir Starmer’s office said Wednesday he welcomes reports that X is addressing the issue.