Elon Musk’s AI chatbot Grok announced on Friday that it is scrambling to fix safety flaws after users exploited the tool to turn photographs of children and women into erotic images.
“We’ve identified lapses in safeguards and are urgently fixing them,” Grok stated in a post on X, emphasising that “CSAM (Child Sexual Abuse Material) is illegal and prohibited.”
The controversy stems from an “edit image” button rolled out in late December. The feature allows users to modify any image on the...
Elon Musk’s Grok AI faces scrutiny over complaints it undressed minors in photos
Published 3 days ago
Source: scmp.com
