MDDI's Response to PQ on the Government's Assessment of Social Media Platforms' Efforts to Detect Child Sexual Exploitation and Abuse in Livestreams
26 February 2026
Parliament Sitting on 26 February 2026
Question for Oral Answer
*31 Ms He Ting Ru asked the Minister for Digital Development and Information (a) what is IMDA’s assessment of social media platforms’ efforts and capabilities to detect child sexual exploitation and abuse in livestreams, which are known to be more difficult to detect than in uploaded material; and (b) what measures are being taken currently to address this risk.
Answer
The Government has put in place measures to address child sexual exploitation and abuse online. Under the Broadcasting Act and Online Criminal Harms Act, the Government can issue directions to social media services or internet service providers to disable Singapore users’ access to child sexual exploitation and abuse material published online.
Under the Code of Practice for Online Safety – Social Media Services, designated Social Media Services are required to minimise Singapore users’ exposure to child sexual exploitation and abuse material on their services, including livestreams. Designated services must put in place technologies and processes to proactively detect and swiftly remove such material. They must also take steps to protect users from child sexual exploitation and abuse activity, such as online grooming.
The Code additionally requires designated services to submit annual online safety reports to IMDA, detailing their measures to combat harmful content, including child sexual exploitation and abuse material, and to improve user safety. These reports are published on IMDA’s website, alongside IMDA’s assessment of each designated service’s performance.
*Converted to written answer
