Globally, media reports have highlighted that X’s Grok AI chatbot (Grok) is being misused to create and share sexualised images of real people, including children. This conduct constitutes the non-consensual distribution of intimate imagery (NCII) and, where children are depicted, child sexual abuse material (CSAM). Media Monitoring Africa (MMA), currently rebranding to Moxii Africa, has demanded immediate measures to compel X Corp to disable Grok’s “undressing” feature, along with any other functionality that facilitates the creation or distribution of NCII and CSAM. This action reflects Moxii Africa’s ongoing commitment to protecting users’ rights, safety, and dignity in digital spaces.
