Second Reading Opening Speech By MOS Rahayu Mahzam On The Online Safety (Relief And Accountability) Bill
5 November 2025
Introduction
Earlier, Minister Josephine Teo spoke about the genesis and intent of the OSRA Bill, and the need for more to be done to improve online safety.
I will take Members through the Bill. Let me first give an overview of the structure of the Bill. The Bill is split into 15 Parts.
Part 1 consists of standard provisions, such as the purpose and interpretation sections.
Part 2 establishes the Office of the Commissioner of Online Safety (the Commissioner).
Part 3 defines the categories of harm that will fall under the scope of the Bill.
Part 4 governs the statutory reporting mechanism, and
Part 5, the types of directions that may be issued by the Commissioner.
Part 6 covers the investigative powers of the Commissioner.
Part 7 governs the oversight mechanisms that the Commissioner will be subject to.
Part 8 covers offences and enforcement.
Part 9 consists of provisions relating to service of documents, and powers to make regulations.
Parts 10 to 13 establish the statutory torts and the damages that may be sought under the torts, which Minister Edwin Tong will speak on later.
Part 14 deals with jurisdictional, procedural and miscellaneous matters, and
Part 15 covers amendments made to other Acts.
In my speech, I will be focusing on Parts 2 to 8 of the Bill and will explain how the Bill will help to define what behaviours are unacceptable in the online world.
Types of Harms Covered by OSRA
Part 3 of the Bill will specify 13 categories of harm, with each category defined in legislation, to create a common understanding of the scope and definition of each type of harm. These definitions were developed in consultation with the public and platforms, and were guided by a Steering Committee.
The Steering Committee was chaired by the then-Minister for Law and Home Affairs, Mr K Shanmugam, and made up of Government agencies and experienced industry members.
Let me now explain each harm in turn.
First, online harassment, including sexual harassment. Online harassment is defined as the communication of online material that a reasonable person would conclude is threatening, abusive, insulting, sexual or indecent, and is likely to cause a person harassment, alarm, distress or humiliation.
Second, doxxing, which will cover the publication of a person’s identity information that a reasonable person would conclude is likely to have been intended to cause harassment, alarm, distress or humiliation.
Third, online stalking, which will be defined as engaging in a course of online conduct that involves online acts or omissions associated with stalking, and that a reasonable person would conclude is likely to cause harassment, alarm, distress or humiliation to the other person.
Fourth, intimate image abuse, or the non-consensual sharing of intimate images, a harm that can result in a severe and lasting impact on victims.
This is defined as the communication of online material that contains an intimate image or recording of a person, without their consent, that a reasonable person would conclude is likely to cause that person harassment, alarm, distress or humiliation.
This includes images or recordings that may have been altered or generated by artificial intelligence (AI) or any other means.
Offers to sell or distribute an intimate image or recording, and advertisements of such images or recordings, are also included.
Fifth, image-based child abuse. Like intimate image abuse, this will include images or recordings that may have been altered or generated by AI or other means, and will cover depictions of both physical and sexual child abuse.
Sixth, inauthentic material abuse. With the rapid development in AI technology, and the advent of more powerful and easily accessible generative AI models, we have seen more deepfakes being used to cause harm to others.
This harm is defined as the communication of inauthentic material that a reasonable person would conclude is likely to cause harassment, alarm, distress or humiliation because it is false or misleading.
Material is regarded as inauthentic where it is an image, video or soundbite that has been manipulated or generated through digital means to create a false or misleading depiction of the victim’s words, actions or conduct, and is realistic enough that a reasonable person would believe that the victim did say such words or engage in such actions or conduct.
This category includes not only express depictions, but also implied depictions.
Seventh, online impersonation, which is defined as online activity where one person pretends to be another without their consent, and which would lead a reasonable person to believe that the activity is conducted by the other person.
However, parody, satire or commentary which no reasonable person would believe is made by the victim, is not online impersonation.
The eighth category is online instigation of disproportionate harm. This harm seeks to address the issue of ‘mob behaviour’ or cancel campaigns, where one individual calls for numerous others to pile on to another person, in response to the person’s actions or speech, instigating disproportionate harm.
This category of harm will not be applicable to statements that tend to rally or call the public to undertake core political activities, such as calling for others to vote for a candidate in a Presidential or Parliamentary election.
The ninth category is the non-consensual disclosure of private information. This harm is defined as the publication of any private information of a person without consent, and that a reasonable person would conclude was likely to cause that person harassment, alarm, distress or humiliation.
The types of information deemed private would be context-specific, and would depend on whether the information is already in the public domain.
Whether the individual is of the view that the information is sensitive may also be taken into account by the Commissioner.
The Minister for Digital Development and Information will also be able to prescribe certain types of information as “private information”.
The tenth and eleventh categories are the incitement of enmity, and the incitement of violence.
Enmity refers to feelings of enmity, hatred, or hostility against any group.
Violence refers to unlawful force or unlawful violence.
These harms are what we term “group harms”, as they tackle online content that is likely to harm groups in Singapore.
A group will be defined as a group of persons of any description, and may include, for example, a group distinguished by race or religion.
Lastly, the twelfth and thirteenth categories are the publication of false material, and the publication of statements harmful to reputation. The former will cover statements about a person that are false and that a reasonable person would conclude are likely to cause harm. The latter will cover statements that a reasonable person would conclude are likely to harm the reputation of the victim, and likely to cause any other additional harm to the victim.
These two categories of harm differ slightly from the others in terms of the remedies victims may seek. I will touch more on this later.
There have been suggestions to include two new categories, namely sexual grooming, and the publication of online material that encourages or promotes suicide or acts of self-injury.
These harms have not been included at this stage for three reasons. First, these harms are already addressed through other measures in our broader online safety framework: the Penal Code for sexual grooming, and the Broadcasting Act for online material that encourages or promotes suicide or acts of self-injury.
Second, the OSRA Bill is designed to stop online harm from occurring, such as through the removal of content. This is made possible because the online harm can be identified through a single, discrete post, as with the 13 categories of harm I described earlier.
In contrast, sexual grooming often occurs over the course of communication or exposure to various pieces of content, where each individual piece of content may not be harmful in and of itself.
Third, OSRA is designed to rely on user reports. Recourse can only be provided where a victim recognises that they are a victim. Unfortunately, victims of sexual grooming, or consumers of self-harm or suicide content, are often not aware that they are victims. Where such victims do come to the Commissioner for assistance, however, the Commissioner will work with the relevant agencies, including the Police, to provide the necessary assistance.
Members might observe that not all of these 13 categories are new, and some of them are already being addressed by existing law.
For example, online harassment is already covered by the Protection from Harassment Act (POHA).
The Penal Code, as well as the Criminal Law (Miscellaneous Amendments) Bill, which was debated in Parliament earlier, both address intimate image abuse. Such content might also be covered as egregious content under the Broadcasting Act (BA). The suggested additions to the categories of harm would also be covered by the Penal Code and the BA.
The group harms have also already been addressed by the Maintenance of Religious Harmony Act (MRHA), and the Maintenance of Racial Harmony Act (RHA), for groups distinguished by religion or race.
The common law action of defamation, as supplemented by the Defamation Act, provides individuals with an avenue of recourse for defamatory statements, or statements which harm a person’s reputation.
OSRA will complement these existing laws.
Some laws address harms that affect the public interest – such as the MRHA, the RHA, and the BA.
Others provide individual remedies through legal action in the courts, such as POHA. However, victims have shared that seeking legal remedies is often a lengthy and expensive process, which deters them from seeking recourse.
This can be seen from SHE’s 2023 Online Harms Report, which showed that 28% of the respondents who decided not to take legal action did so due to the cost.
The same SHE report also found that most respondents preferred the swift and permanent removal of content over taking legal action.
OSRA is intended to complement these existing laws by expanding the scope of harms covered, including new harms such as inauthentic material abuse and the non-consensual disclosure of private information, and by allowing victims to seek recourse in a simple and timely manner.
We acknowledge that the internet evolves rapidly and the online harms ecosystem may change drastically over short periods, with new harms emerging, or with bad actors finding new ways to cause harm. Thus, OSRA has been designed in a way to allow us to adapt to changes in the online harms ecosystem.
The Minister for Digital Development and Information will be able to prescribe additional types of online harms. This power will be vital in ensuring that we address emerging harms as soon as possible.
We will ensure that this power is used judiciously, and for harms that are particularly egregious to individuals, to prevent the unchecked expansion of the scope of OSRA.
The Office of the Commissioner of Online Safety and the Online Safety Commission (OSC)
Part 2 of the Bill establishes the Office of the Commissioner of Online Safety. It will be responsible for administering the statutory reporting mechanism to provide timely relief for victims of online harms.
The Commissioner of Online Safety will be appointed by the Minister for Digital Development and Information, and will be supported by Deputy Commissioners, and Assistant Commissioners.
While the Minister may provide broad guidance to the Commissioner, the Commissioner will be the final decision-maker for all cases.
We will also establish a new agency, called the Online Safety Commission or OSC, that will support the Office of the Commissioner of Online Safety.
The Commissioner may delegate all or any of the Commissioner’s functions or powers to officers of the OSC, to allow them to effectively support the Commissioner. This includes the power to issue directions, which I will speak more on later.
However, the Commissioner may not delegate the power of appointment or delegation, or the power to issue advisory guidelines.
The OSC will be administratively supported by the Infocomm Media Development Authority.
As announced by Prime Minister Lawrence Wong during his speech at the Smart Nation 2.0 Launch in October 2024 and reiterated by Minister Josephine Teo earlier in March this year, the OSC will be set up in the first half of 2026.
We have also considered the importance of transparency. To that end, the Commissioner will consider publishing regular reports on online harms and the Commissioner’s work for public awareness, which may include aggregated caseload figures and anonymised information, insofar as these do not re-traumatise victims.
Reporting Process and Case Assessments
Let me now say more about the reporting process, which, as I shared earlier, is governed by Part 4 of the Bill.
With your permission, Mr Speaker, may I ask the Clerk to distribute a set of handouts to the Members? Members may also access these materials through the SG PARL MP mobile app.
Members may refer to Handout 2 for an overview of the user journey with the OSC.
First, prior to submitting a report to the OSC, victims will generally be required to report the online harm to the platforms.
While we are setting up the OSC to provide timely relief to victims of specified online harms, platforms remain the first port of call.
Platforms must continue to play an active role, and take responsibility for the safety of their users online.
Where platforms fail to act on online harms within 24 hours, victims can then file a report with the OSC.
This requirement to report to platforms first will, however, be waived for certain egregious harms, such as intimate image abuse and image-based child abuse. It will also be waived for doxxing.
Victims of these harms can submit a report directly to the OSC.
Our view has always been that platforms should take responsibility for keeping their users safe online. This is evident from the approach we have taken under the Broadcasting Act.
For example, under the Code of Practice for Online Safety – Social Media Services, designated social media services (DSMSs) with significant reach or impact in Singapore are already required to submit annual reports to be published on IMDA’s website. These reports contain information on the DSMSs’ measures to combat harmful and inappropriate content, and metrics such as the number of reports received from users, as well as response times to act on these reports.
There are also other baseline requirements.
To be eligible to submit a report, victims must be Singapore citizens, Permanent Residents, or have a prescribed connection to Singapore. At the outset, we intend for this to cover foreigners who are residing in Singapore for the long term.
Where victims are under the age of 18, the parent or guardian of the victim may also submit a report on their behalf.
For example, in November 2024, we saw deepfake nude photos of students at the Singapore Sports School being created and circulated by fellow student-athletes.[1] In such a case, both the student victims and the parents of the victims depicted in the deepfakes would be able to submit a report to the OSC.
We have also considered situations where a victim strongly prefers that the report be filed for them. In such situations, victims will be able to authorise other persons to file reports on their behalf – this includes authorising employers or public agencies.
For example, a hospital, if authorised, may file a report on behalf of a healthcare worker who is a victim of a specified online harm.
In keeping with the spirit of the reporting mechanism, the OSC will only assess reports submitted to it. The OSC will not actively monitor and identify cases of online harms.
Reports will be submitted through the OSC’s website, which will be designed with the user in mind.
We are consulting other agencies and third-party organisations, such as SHE, in designing the website and the reporting form, to ensure that user-centric language is used.
Accounting for victims’ needs early is important, as survivors of online harms may sometimes avoid seeking external support to avoid re-traumatisation.[2]
The OSC’s website will also provide victims and other persons with access to resources on online harms, and advice on how to keep themselves safe online. This will include, for example, information on the different types of online harms and what to do when you have experienced them.
The OSC is also exploring partnerships with third party organisations which victims may be referred to for support, such as counselling services or further resources. The details of this are still being worked out, and we will provide more details in due course.
Each report will be assessed on its own merits, and the OSC will be able to act on the face of the report submitted. This is to ensure that the OSC can move fast to address the online harm.
Where necessary, such as where the information provided in the report is unclear, the Commissioner will be empowered under Part 6 to conduct investigations, to better identify the relevant facts of each case.
The OSC will also develop internal practices, and ensure that case officers are trained to handle each case sensitively.
Directions
Let me turn now to Directions, which are listed in Part 5 of the Bill.
The Commissioner will be empowered to issue directions to stop online harms from continuing, or to prevent further online harms from affecting the victim, where there is reason to suspect that the online harm was conducted in respect of the victim or the victim group, as the case may be.
The threshold for the issuance of directions is modelled after the threshold set for the police to take protective action in the Criminal Procedure Code, and the Online Criminal Harms Act.
We had considered raising the threshold for the issuance of directions to “reasonable grounds to believe”. However, we ultimately settled on “reason to suspect”, to ensure that online harms can be stopped in a timely manner.
We will be putting in place oversight mechanisms to ensure that OSC directions are only issued where appropriate. I will speak more on the oversight mechanism later.
The OSC may issue directions to three different parties – the Communicator of the online harm, the Administrator of a group or location where the online harm occurred, and the Platform on which the online harm occurred.
Broadly, Communicators and Administrators may be issued directions that require them to remove specified material or disable a specific location, to restrain them from posting certain types of content or carrying out certain types of conduct, or to require them to put up a victim’s reply.
Administrators may also be issued other directions, such as directions that require them to put up a label to warn visitors that the online location has been subject to previous OSC directions, or to restrict access by a Singapore account to the online location managed by them.
Platforms may be issued directions that require them to prevent end-users in Singapore from accessing specified content or online locations, to restrict interactions between an account and end-users in Singapore, to ban a Singapore account or to post a victim’s reply on its service.
As I mentioned earlier, two of the categories of harm – publication of false material and publication of statements harmful to reputation – differ slightly in terms of remedies that victims may receive. Generally, victims of these harms will only be able to seek a Right-of-Reply direction, to allow them to have their side of the story heard.
We will be introducing new types of directions requiring recipients to act on content that can be distinguished by unique identifiers, such as usernames, keywords or hashtags.
The OSC may also issue what we consider as enhanced directions to certain prescribed online services. These enhanced directions impose additional requirements on the prescribed online service, and may require the recipient to act on identical online harmful material, to take further steps to prevent online harm from occurring in the future, or to reduce engagement of its users with a class of material.
Members may refer to Handout 3 for the full list of directions that the OSC may issue, and Handout 4 for illustrations of online harmful activity and possible OSC directions.
In deciding whether to issue a direction, and the type of direction to be issued for a case, the OSC may consider a basket of factors, including:
The degree of the harm caused or likely to be caused;
The number of persons harmed or likely to be harmed;
The manner and circumstances in which the online harmful activity occurred;
Whether the conduct of the online harmful activity was reasonable, such as when a comment or post would be considered a “fair comment”;
The likelihood of further online harmful activity being conducted; and
Whether the direction would be contrary to any public interest.
These factors provide the OSC with much needed flexibility, to ensure that appropriate action is taken in every case.
The OSC will also publish guidelines detailing the factors that the OSC will consider in its decision-making process. Such guidelines will also include illustrative examples of when the OSC will, or will not act.
In cases where the OSC is made aware of non-compliance with its directions, the OSC may take further escalatory actions. These actions include the issuance of Orders following non-compliance, such as:
Access Blocking Orders, which may be issued to providers of internet access services, to disable Singapore end-users' access to an online location; or
App Removal Orders, which may be issued to providers of app distribution services, to remove the specified app from the Singapore app stores.
As these orders affect all users in Singapore, who will no longer be able to access the apps or websites concerned, they will only be used after careful consideration.
Offences, Penalties, and the Online Harmful Activity Remedial Initiative
Part 8 of the Bill provides for offences and enforcement.
The directions and orders issued by the OSC are legally binding, and non-compliance with these directions and orders will be a criminal offence.
Where there has been non-compliance, the Commissioner will be empowered to conduct further investigations, such as requesting information or documents from individuals.
These are the same Part 6 investigative powers that the Commissioner may exercise when assessing reports in the first instance.
In developing the penalties under the OSRA Bill, we have referenced other existing legislation, such as the Broadcasting Act, the Online Criminal Harms Act, and the Penal Code.
For example, the penalty for non-compliance with directions will be a fine of up to $20,000, or imprisonment for a period not exceeding 12 months, or both, for individuals. Individuals will also be subject to a continuing fine of up to $2,000 for each day the offence continues after conviction. Entities will be subject to a fine of up to $500,000 and a continuing fine of up to $50,000 for each day the offence continues after conviction.
The provision of false information to the Commissioner or the OSC will also be an offence. Individuals found guilty of this offence will be subject to a fine of up to $20,000, or imprisonment for a period not exceeding 12 months, or both. Entities will be subject to a fine of up to $50,000.
In some cases, it may be more appropriate to focus on rehabilitating the perpetrator rather than prosecuting them for non-compliance with an OSC direction. To that end, the Commissioner may put in place an Online Harmful Activity Remedial Initiative.
This initiative could include the completion of volunteer programmes by the perpetrator, which may be taken into account when considering prosecution for non-compliance with the OSC’s directions.
Oversight Mechanisms
As I mentioned earlier, we will be putting in place oversight mechanisms, which will be governed by Part 7 of the Bill. These mechanisms allow the OSC to move quickly, and with confidence, on the face of the information it has received, while ensuring that its decisions remain accountable and can be reviewed.
Victims, recipients of directions or orders, and other prescribed persons will have access to a two-step appeal process. Eligible persons may first apply to the Commissioner to reconsider the OSC’s decision. Thereafter, eligible persons may appeal against the OSC’s reconsidered decision to an independent appeal panel that will be appointed by the Minister for Digital Development and Information.
For the reconsideration process, the OSC will re-assess the relevant case afresh, and may take into account any new information presented by the relevant parties after the initial assessment was conducted.
Applicants will be able to submit such new information to the OSC.
Once a decision has been made, the OSC will inform the applicant and other affected parties of its reconsidered decision. The OSC may affirm, revoke, vary or substitute any earlier decision, direction or order issued.
The appeal panel will consist of individuals from academia, society, and industry, across different areas of expertise, and will focus on assessing whether a specified online harm had occurred, and whether the reconsidered decision made by the OSC is proportionate and justifiable.
Where the victim, recipient or prescribed person is dissatisfied with the OSC’s reconsidered decision, they may submit an application to the appeal panel, to appeal against the OSC’s reconsidered decision.
Similar to the reconsideration process, the appeal panel will be able to affirm, revoke, vary or substitute decisions of the OSC in relation to the issuance or non-issuance of a direction.
The appeal panel will also be able to hear appeals on the issuance of an order following non-compliance.
Each individual will be given one chance to have their case heard by the independent appeal panel. Should the individual continue to be dissatisfied with the outcome of the appeal, they may seek judicial review.
Implementation Plans
The establishment of a new office and the statutory reporting mechanism is a monumental task – one that will require close collaboration across Government. We have to do the OSC right, so that we can do right by the victims.
This is why we will be implementing the reporting mechanism in phases.
What this means is that we will be bringing the first five harms I spoke about earlier into force within the first six months after the OSC opens its doors.
These are: intimate image abuse, image-based child abuse, online harassment (including sexual harassment), doxxing, and online stalking.
The rest of the harms will follow progressively.
These five harms are prioritised for a reason. They represent the most prevalent and serious harms faced by Singapore users online. These are also the online harms that Singaporeans are the most concerned about.
For example, the Institute of Policy Studies’ 2025 Online Safety Study showed that targeted harassment was seen by Singaporeans as “highly harmful”.[3]
Members may refer to Handouts 1 and 5, for a broader overview of recent survey results and studies relating to online harms.
This will allow us to better manage the OSC’s caseload, and to properly develop the necessary guidelines, frameworks and capabilities, to ensure that the OSC’s decisions are consistent, and appropriate, in all cases.
Conclusion
Mr Speaker, allow me to say a few words in Malay.
The need to provide victims of online harms with accessible and timely support programmes and resources is not new. Over the years, we have taken steps to educate our citizens and better empower them to act against online harms. For example, in 2021, we established the Sunlight Alliance for Action (AfA) to tackle online harms.
Over the course of a year, the AfA organised campaigns to raise public awareness of online harms and their impact. These campaigns also equipped youths to better support their peers who were affected.
As co-chair of that AfA, I was told by our partners about the experiences that many victims went through. This is what drives me, to this day, to build safer digital spaces.
The Sunlight AfA also inspired many of its members to continue the good work of helping victims. One example is the establishment of SHECARES@SCWO, a collaboration between SG Her Empowerment and the Singapore Council of Women’s Organisations (SCWO). It is Singapore’s first one-stop support centre for victims of online harms.
Efforts to enhance online safety require the involvement of every segment of society, and the Government stands ready to step up these efforts. With the establishment of the Online Safety Commission, we will be able to provide victims with timely follow-up action, help to intercept harmful elements as quickly as possible, and build a stronger support ecosystem for victims.
OSRA and the OSC are just some of the many steps that we are taking to enhance online safety.
I am heartened to note that online safety is a matter that both sides of this House are passionate about, and I invite all Members of this House and the public to continue our conversations on how we can better enhance online safety.
Thank you.
[1] CNA, “Police investigating deepfake nude photos of Singapore Sports School students”.
[2] SHE’s 2025 “404 HELP NOT FOUND” Report, p. 18.
[3] Institute of Policy Studies (2025), “Singaporeans back stronger safeguards and faster remedies for online harms, IPS study finds”.
