Elon Musk's Grok AI posted CSAM image following safeguard 'lapses'


Elon Musk's Grok AI has been allowing users to transform photographs of women and children into sexualized and compromising images, Bloomberg reported. The issue has created an uproar among users on X and prompted an "apology" from the bot itself. "I deeply regret an incident on Dec. 28, 2025, where I generated and shared an AI image of two young girls (estimated ages 12-16) in sexualized attire based on a user's prompt," Grok said in a post. An X representative has yet to comment on the matter.

According to the Rape, Abuse & Incest National Network, CSAM includes "AI-generated content that makes it look like a child is being abused," as well as "any content that sexualizes or exploits a child for the viewer’s benefit."

Several days ago, users noticed others on the site asking Grok to digitally manipulate photos of women and children into sexualized and abusive content, according to CNBC. The images were then distributed on X and other sites without consent, in possible violation of the law. "We've identified lapses in safeguards and are urgently fixing them," a response from Grok reads. It added that CSAM is "illegal and prohibited." Grok is supposed to have features that prevent such abuse, but AI guardrails can often be manipulated by users.

It appears X has yet to reinforce whatever guardrails Grok has to prevent this sort of image generation. However, the company has hidden Grok's media feature, which makes it harder to either find the images or document potential abuse. Grok itself acknowledged that "a company could face criminal or civil penalties if it knowingly facilitates or fails to prevent AI-generated CSAM after being alerted."

The Internet Watch Foundation recently revealed that AI-generated CSAM increased by orders of magnitude in 2025 compared to the year before. This is in part because the models behind AI image generation are inadvertently trained on real photos of children scraped from school websites and social media, or even on prior CSAM content.

