Jakarta, Indonesia Sentinel — Indonesia’s Minister of Communication and Digital Affairs (Komdigi), Meutya Hafid, announced that the country will implement the Content Moderation Compliance System ‘SAMAN’ starting February 2025. The initiative aims to protect citizens in the digital space, with a particular focus on safeguarding children.
SAMAN is designed to monitor and enforce compliance among electronic system providers (ESPs) such as Facebook, Instagram, TikTok, Google, X (formerly Twitter), YouTube, and others.
“SAMAN will be implemented in February to curb the spread of illegal content on digital platforms,” Meutya stated in a press release on Friday, January 24, as reported by CNN Indonesia. “Protecting society, especially children, from pornography, online gambling, and illegal loans is our top priority in creating a safe and healthy digital environment.”
Accordingly, SAMAN will oversee violations including child pornography, online gambling, illegal financial activities (including unlicensed lending services), terrorism-related content, and unregulated food, drugs, and cosmetics.
Compliance Enforcement Stages
According to Meutya, SAMAN will ensure that electronic system providers (ESPs) adhere to regulations while fostering a secure digital space for the public. The enforcement process involves several stages:
- Takedown Order: ESPs are required to remove URLs specified in the order.
- First Warning Letter (ST1): ESPs must take action to remove content to avoid escalation to the next stage.
- Second Warning Letter (ST2): ESPs are mandated to submit a commitment letter agreeing to pay administrative fines.
- Third Warning Letter (ST3): Failure to comply may result in access termination or platform blocking.
Notifications for non-urgent content violations must be addressed within 24 hours, while urgent cases require action within 4 hours. Under a new regulation (Ministerial Decree No. 522/2024), ESPs that fail to comply with takedown orders will face administrative fines.
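For readers who prefer a schematic view, the staged escalation and response windows reported above can be sketched as a minimal model. This is purely illustrative: the names Stage, response_deadline, and next_stage are hypothetical and do not correspond to any published SAMAN interface or Komdigi specification; the sketch encodes only the stages and deadlines described in the press release.

```python
from datetime import timedelta
from enum import Enum, auto


class Stage(Enum):
    """Escalation stages as described in the Komdigi announcement."""
    TAKEDOWN_ORDER = auto()   # ESP must remove the URLs listed in the order
    ST1 = auto()              # first warning letter
    ST2 = auto()              # second warning letter; fine commitment required
    ST3 = auto()              # third warning letter; access termination possible


def response_deadline(urgent: bool) -> timedelta:
    """Reported window for acting on a notification: 4 hours if urgent, else 24."""
    return timedelta(hours=4) if urgent else timedelta(hours=24)


def next_stage(current: Stage) -> Stage | None:
    """Return the stage an unresolved case escalates to, or None after ST3."""
    order = [Stage.TAKEDOWN_ORDER, Stage.ST1, Stage.ST2, Stage.ST3]
    idx = order.index(current)
    return order[idx + 1] if idx + 1 < len(order) else None


if __name__ == "__main__":
    # Example: an urgent case that is never resolved walks through every stage.
    stage = Stage.TAKEDOWN_ORDER
    print(f"Initial stage: {stage.name}, deadline: {response_deadline(urgent=True)}")
    while (stage := next_stage(stage)) is not None:
        print(f"Escalated to: {stage.name}")
```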
Protecting Vulnerable Groups
The ministry emphasized that children are among the most vulnerable to exploitation in the digital space. A UNICEF report highlights that 1 in 3 children globally has been exposed to inappropriate online content.
Data reveals a growing number of cases in Indonesia involving online sexual exploitation, human trafficking, and harmful content targeting minors. Many of these incidents stemmed from the misuse of technology and age-inappropriate device usage.
Between 2021 and 2023, the Indonesian Child Protection Commission (KPAI) received 481 complaints related to child victims of pornography and cybercrime, along with 431 cases of child exploitation and trafficking.
The launch of the Content Moderation Compliance System ‘SAMAN’ is therefore expected to protect users, especially children, in the digital space.
Global Comparisons
Indonesia’s SAMAN aligns with regulatory measures adopted by other countries. The Ministry said it had already conducted regulatory comparisons with countries that have successfully implemented content moderation frameworks.
Germany, for instance, enforces its Network Enforcement Act (NetzDG), which requires social media platforms to remove illegal content within 24 hours, while Malaysia’s Anti-Fake News Act 2018 targets misinformation.
With the launch of the Content Moderation Compliance System ‘SAMAN’, Indonesia underscores its commitment to creating a safer digital environment while holding platforms accountable for their role in curbing harmful content. As the digital world continues to evolve, the success of this initiative could serve as a model for balancing technological innovation with public protection.
(Raidi/Agung)