Elon Musk’s AI platform, Grok, said on Friday it is scrambling to fix safety flaws after users exploited the tool to turn photographs of children and women into erotic images.
“We’ve identified lapses in safeguards and are urgently fixing them,” Grok stated in a post on X, emphasising that “CSAM (Child Sexual Abuse Material) is illegal and prohibited.”
The controversy stems from an “edit image” button rolled out in late December. The feature allows users to modify any image on the...
Elon Musk’s Grok AI faces scrutiny over complaints it undressed minors in photos
Published 2 days ago
Source: scmp.com
