IMDA flags X, TikTok for failure to detect and remove child exploitation, terrorism content
newsare.net
The Infocomm Media Development Authority (IMDA) has flagged social media companies X and TikTok for "serious weaknesses" in proactively detecting and removing egregiously harmful content.

The two firms did not act adequately against child sexual exploitation and abuse material (CSEM) and terrorism content uploaded to their respective platforms, IMDA said on Tuesday (March 31).

The industry regulator issued letters of caution to the two social media services, placing both under enhanced supervision. The warning requires them to provide regular updates on their rectification measures, namely enhancing their automated detection systems to flag violations.

The Code of Practice for Online Safety - Social Media Services requires designated social media services to proactively detect and swiftly remove CSEM and terrorism content before it is viewed by users.

In its 2025 Online Safety Assessment Report, IMDA identified 73 cases of CSEM that originated from or targeted Singapore users on X, up from 33 in 2024. On TikTok, 17 cases of terrorism content shared by Singapore-based accounts were found.