A safer, more secure social media experience for Singaporeans is at the core of the Online Safety (Miscellaneous Amendments) Bill, passed in Parliament on Nov 9.
It’s necessary because increasingly egregious content is circulating on social media.
For example: content advocating terrorism, suicide and self-harm, violence (including sexual violence), child sexual exploitation, content posing a public health risk, and content likely to undermine racial and religious harmony.
The amended Bill now covers these categories. It empowers the Infocomm Media Development Authority (IMDA) to issue rectification directions requiring social media platforms to remove such content, and holds them liable if they fail to protect local users from online harms.
Platforms that fail to comply may be fined up to $1 million, or have their social media services blocked in Singapore.
Social media services must comply with these rectification directions speedily, to stop harmful content from spreading like a virus.
“IMDA’s directions will stipulate a specific timeline for disabling access. For egregious content that could cause serious harm, the timeline would generally be within hours,” said Minister for Communications and Information Josephine Teo.
“To ensure that safety is upheld for Singapore users, we need Online Communication Services to be held accountable. Equally, we need the support of everyone in the community to keep each other safe online.”
Stemming the spread
A useful analogy for how the Bill now works: it is like protocols that equip medical professionals to act quickly and stem the spread of a virus.
Or like gearing up firefighters to rush to a fire and limit the damage, as Minister Teo outlined in an earlier Parliament sitting on Nov 8.
“Members may recall that in the early days of the Covid-19 pandemic, supermarkets were purportedly running out of toilet paper. A social media post surfaced, suggesting that people use the Bible or the Quran as toilet paper. This post was religiously very offensive, and denigrated two religions in Singapore,” said the Minister.
“However, it was not moderated nor removed by the platform concerned. IMDA had to step in to engage the platform, and the platform eventually disabled access to the post.”
Egregious content also exists on non-designated Social Media Services, which are not subject to the Code of Practice for Online Safety, the Minister added.
“In May last year, a poll published on a Social Media Service sexualised local female Islamic teachers, asked users to rank them, and further promoted sexual violence against them. The post went viral, and the modest reach of this particular service received a sudden big boost.”
“It not only caused great distress to the individuals involved, but also unsettled many others in our community.”
The amended Bill makes intervention against these types of content timely and targeted, guarding our nation’s diverse communities.
Empowering the public
The Bill responds to growing public concern about harmful content on social media, and to public expectations that social media services do more to protect users from such harm.
“Parents, in particular, were concerned over viral social media content which featured dangerous pranks and challenges, harmful advertising, cyberbullying, and explicit sexual content,” noted Minister Teo about public feedback from the Government’s July-August engagement sessions.
No wonder, then, that social media users can now help stop the spread and fight the fire.
“IMDA will also require social media services to act on user reports in a timely and diligent manner that is proportionate to the severity of the potential harm,” said the Minister.
“In particular, timelines must be expedited for content and activity related to terrorism.”
And while social media is vast and ever-evolving, the Government’s codes and laws for keeping Singaporeans safe will keep pace.
“Ultimately, we must recognise that there is no single measure that will assure us of online safety,” said Minister Teo.
“We will need laws, codes, education, user reporting and a whole range of interventions. We will also need to keep updating our measures to deal with new risks.”
Cover photo credit: Pexels, Tracey Le Blanc