MDDI Surveys Show More Than 4 In 5 Respondents Encounter Harmful Online Content; Two-Thirds Call For Stronger Regulation
10 October 2025
More than 4 in 5 (84%) Singapore residents reported encountering harmful online content in the past year. One in 3 (33%) also reported experiencing harmful online behaviour. About 2 in 3 (62%) called for stronger regulation to protect Singaporeans from online harms. These were the key findings of the Perceptions of Digitalisation Survey[1] and the Smart Nation Policy Perception Survey[2] conducted by the Ministry of Digital Development and Information (MDDI) earlier this year.
Users’ Encounters with Harmful Online Content
The types of harmful online content most frequently encountered by survey respondents were content supporting illegal activity (e.g. scams), sexual or violent content, cyberbullying and content causing racial or religious tension.
Table 1: Top 5 Most Frequently Encountered Types of Harmful Online Content (i.e. encountered at least a few times a month in the past year)

S/No. | Category | % of respondents who encountered harmful online content in the past year
---|---|---
1 | Content supporting illegal activity (such as scams or sale of prohibited items) | 33
2 | Sexual content | 26
3 | Violent content | 24
4 | Cyberbullying | 20
5 | Content causing racial or religious tension | 16
Respondents encountered harmful online content most frequently on Facebook, YouTube and Instagram as well as messaging apps WhatsApp and Telegram (refer to Annex A for other social media services where respondents encountered harmful online content). While this may be a result of the relative reach of each platform, the prevalence across all platforms is cause for concern.
Table 2: Encounters with Harmful Online Content Across Platforms

S/No. | Platform | % of respondents who encountered harmful online content in the past year
---|---|---
Social Media Services | |
1 | Facebook | 57
2 | YouTube | 46
3 | Instagram | 41
Messaging Apps | |
1 | WhatsApp | 38
2 | Telegram | 22
Users’ Experiences with Harmful Online Behaviour
Catfishing was the most common type of harmful online behaviour experienced by survey respondents, occurring mostly on WhatsApp (56%) and Facebook (41%). This was followed by unwanted sexual messages and online harassment (refer to Annex B for other common types of harmful online behaviour).
Table 3: Top 3 Most Common Types of Harmful Online Behaviour
S/No. | Category | % of respondents who experienced harmful online behaviour in the past year
---|---|---
1 | Catfishing | 71
2 | Unwanted sexual messages | 27
3 | Online harassment | 16
Users’ Responses to Harmful Online Content and Behaviour
Survey respondents who encountered harmful online content tended not to report the content or user to the platform. More than 4 in 5 (82%) skipped or closed the content. One in 4 (23%) took no action. As for harmful online behaviour, 4 in 5 (79%) of those who experienced it blocked the user responsible. About half (46%) reported the content or user to the platform.
Such inaction could be because past experiences with reporting did not lead to timely action.[3]
Whole-of-society Efforts to Tackle Online Harms
The Government, industry, and community have been working together to build a safer online environment.
The Government has taken the following legislative steps to protect Singaporeans from online harms.
IMDA introduced the Code of Practice for Online Safety – Social Media Services in July 2023, which requires designated social media services to put in place systems and processes to prevent Singapore users, particularly children, from accessing harmful content.
IMDA’s Code of Practice for Online Safety – App Distribution Services came into effect in March 2025. It requires designated app stores to minimise the risk of users, especially children, being exposed to harmful content. Designated app stores must also put in place age assurance measures to prevent users under 18 from accessing age-inappropriate apps. These measures will be implemented by 31 March 2026.
Minister Josephine Teo announced in MDDI’s Committee of Supply speech earlier this year that the Online Safety (Relief and Accountability) Bill will be tabled to establish a new Online Safety Commission by the first half of 2026. The Commission will enable victims of online harms to get timely help and will hold perpetrators accountable.
About 2 in 3 (62%) of respondents in MDDI’s Smart Nation Policy Perception Survey supported stronger regulation to protect users from online harms, even if that resulted in less freedom for users in the online space. This shows that a clear majority accept that some regulatory constraints are necessary to protect Singaporeans from online harms such as online scams, inappropriate content and misinformation. We will also work with platforms to enhance their safety measures and provide a safe experience for all users.
To help individuals better recognise online risks and cultivate safer digital habits, we will improve public education and outreach to make online safety resources more accessible, practical and action-oriented. In collaboration with community, industry and corporate partners, more ground-up initiatives such as workshops and webinars will be organised to educate the public on relevant topics.
[1] The Perceptions of Digitalisation Survey was conducted with 2,008 Singapore Citizens and Permanent Residents aged 15 years old and above from November 2024 to February 2025. The sample was representative of the resident population by gender, age and race.
[2] The Smart Nation Policy Perception Survey was conducted with 2,008 Singapore Citizens and Permanent Residents aged 15 years old and above from March to May 2025. The sample was representative of the resident population by gender, age and race.
[3] The Infocomm Media Development Authority (IMDA)’s Online Safety Assessment Report 2024 noted that designated social media services should improve the effectiveness and timeliness of their responses to user reports. Most of them took an average of five days or more to act on user reports of harmful content that violated their community guidelines, which was considerably longer than what was stated in their annual reports.