Key Takeaways:
- The Code of Practice for Online Safety (“Code”) issued by the Info-communications Media Development Authority (“IMDA”) takes effect from 18 July 2023.
- Designated social media services (“SMSs”) will have to comply with the Code. The current list of SMSs published by the IMDA includes Facebook, HardwareZone, Instagram, TikTok, Twitter and YouTube.
- The Code requires SMSs to enhance online safety in Singapore and curb the spread of harmful content on their services by implementing the mechanisms listed in the Code relating to user safety, user reporting and resolution, and accountability (including a requirement for annual reporting). Special measures must also be implemented to protect children.
- A failure to comply with the Code may result in fines of up to SGD 1 million.
Background:
Following our previous update on the passing of the Online Safety (Miscellaneous Amendments) Act, the IMDA released the finalised version of the Code and its accompanying Guidelines on 17 July 2023, together with a list of the SMSs to which the Code applies. The IMDA also announced that the Code would take effect from 18 July 2023.
The Code is issued in accordance with the IMDA’s powers under section 45L of the Broadcasting Act 1994 (“Broadcasting Act”) to require providers of a “regulated online communication service” (a category which includes SMSs) to comply with certain codes of practice.
The finalisation of the Code follows a recent worldwide trend of online content and online child safety regulations and policies, such as the United Kingdom’s Online Safety Bill, the European Union’s Digital Services Act, Ireland’s Online Safety and Media Regulation Bill, and Vietnam’s Decree No. 56/2017/ND-CP detailing the Law on Children and the National Programme on Child Online Protection.
Who is impacted?
Only the following designated SMSs must comply with the Code:
- Facebook;
- HardwareZone;
- Instagram;
- TikTok;
- Twitter; and
- YouTube.
What are the new rules?
The Code is intended to compel SMSs to enhance online user safety (particularly for children) and curb the spread of harmful content on their services. There are several categories of “harmful content”, including sexual content, violent content, suicide and self-harm content, cyberbullying content, content endangering public health, and content facilitating vice and organised crime. The IMDA has also released the Guidelines on Categories of Harmful Content to illustrate what content may be considered harmful and/or inappropriate.
Broadly speaking, SMSs have to comply with obligations relating to (1) user safety; (2) user reporting and resolution; and (3) accountability.
- User Safety. SMSs will need to minimise end-users’ exposure to harmful content by implementing community guidelines, standards and content moderation measures, empowering end-users with tools to manage their own safety and exposure to such content, and proactively detecting and removing child sexual exploitation and abuse material and terrorism content. SMSs must also ensure that children are not targeted with content which may be detrimental to their well-being, that children are provided with differentiated accounts whose default settings minimise exposure to and mitigate the impact of harmful and/or inappropriate content and unwanted interactions, and that children are provided with information on online safety.
- User Reporting and Resolution. SMSs must provide individuals with an “effective, transparent, easy to access, and easy to use” mechanism to report concerning content or unwanted interactions to the SMS. SMSs must also take appropriate action to address and resolve these concerns.
- Accountability. SMSs must provide end-users with access to clear, easily comprehensible information that allows them to assess the level of safety on the service and the SMS’s safety-related measures, and to make informed choices. SMSs must also submit annual online safety reports to the IMDA detailing their online safety measures for Singapore end-users; these reports will be published on the IMDA’s website.
Failure to comply with the Code may result in a fine of up to SGD 1 million.
What’s next?
As the Code has just come into effect, it remains to be seen how the IMDA intends to supervise compliance beyond reviewing the annual online safety reports submitted by the SMSs.
In line with the Code’s emphasis on children’s safety, the government has also recently signalled an intention to further regulate children’s personal data. On 19 July 2023, the Personal Data Protection Commission (“PDPC”) launched the Public Consultation for Proposed Advisory Guidelines on the Personal Data Protection Act for Children’s Personal Data, which sets out the PDPC’s intention to develop a new set of advisory guidelines specifically addressing children’s data, with potential new requirements regarding data protection, data breach notification and more. The public consultation period will end on 31 August 2023.
Furthermore, on 18 July 2023, the PDPC, supported by the IMDA, issued a set of Proposed Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems, which is also open for public consultation. These proposed Guidelines target situations where machine-learning artificial intelligence models or systems use personal data in circumstances covered by the Personal Data Protection Act. The public consultation period will likewise end on 31 August 2023. Read more about the new AI Guidelines here.
This article is produced by our Singapore office, Bird & Bird ATMD LLP. It does not constitute legal advice and is intended to provide general information only. Information in this article is accurate as of 19 July 2023.
For further information, please contact:
Jeremy Tan, Partner, Bird & Bird
jeremy.tan@twobirds.com