Ireland’s media and internet watchdog, Coimisiún na Meán, has adopted and published an Online Safety Code that will apply to video-sharing platforms headquartered in the country from next month — including the likes of ByteDance’s TikTok, Google-owned YouTube, and Meta’s Instagram and Facebook Reels.
Under the Code, in-scope platforms are required to have terms and conditions that ban the uploading or sharing of a range of harmful content types, including cyberbullying, the promotion of self-harm or suicide, and the promotion of eating or feeding disorders, as well as content that incites hatred or violence, terrorism, child sexual abuse material (CSAM), racism, and xenophobia.
The aim of the Code is to address content types that are not directly in scope of the European Union’s Digital Services Act (DSA), Coimisiún na Meán spokesman Adam Hurley confirmed.
The latter, a pan-EU law, has applied broadly since mid-February and is focused on online governance of illegal content (e.g., CSAM), rather than tackling the wider sweep of harms Coimisiún na Meán’s Code aims to address.
“One of the thoughts behind the Online Safety Code is dealing with content which is more harmful rather than illegal,” Hurley told us, adding: “What we’ve done is broaden the scope to harmful content that they must prohibit uploading of and then act on reports against those terms and conditions.”
“It’s a prohibition on uploading in their terms and conditions. So they have to prohibit the uploading of these types of content in their own terms and conditions, and then they’ll have to enforce those terms and conditions,” he added.
The Code will directly apply only to video services provided to users in Ireland, including several major social media platforms that fall under its scope due to their regional headquarters being in the country. However, tech firms may choose to apply the same measures across the rest of the region to streamline compliance and avoid awkward questions about inconsistencies in content standards.
Notice and takedown
Another noteworthy element here is that EU law prohibits imposing a general monitoring obligation on platforms, so Ireland’s Online Safety Code will not require platforms to deploy upload filters, per Hurley. Rather, he confirmed it’s essentially an expansion of the existing notice-and-takedown approach: users will also be able to report harmful content and expect platforms to remove it.
Much like the DSA, the Code therefore requires platforms to provide ways for people to report the aforementioned harmful content types, so that platforms can act on those reports in line with their terms and conditions.
Age assurance for porn
The Code further mandates that video sites whose terms and conditions permit pornographic content or gratuitous violence must apply “appropriate” age assurance (or age verification) in a bid to ensure minors do not access inappropriate content.
Hurley said there are no approved age assurance technologies per se; rather, the regulator will assess what’s appropriate on a case-by-case basis.
The Code also requires video-sharing platforms that carry such content to establish user-friendly content rating systems.
Platforms must also provide parental controls for any content which may “impair the physical, mental, or moral development of children under 16,” as the Coimisiún na Meán’s press release puts it.
Recommender systems
On recommender systems, the Irish regulator previously considered requiring video-sharing platforms to turn off profiling-based content recommendations by default as a safety measure — which could have led to a scenario where TikTok was forced to switch off its algorithm by default.
However, after a consultation last year, the measure did not make it into the final Code, the Coimisiún na Meán’s spokesman confirmed. “It was considered as a potential supplementary [to the Code] but we’ve come down on the position that the best way to deal with recommender systems — the potential harm of recommender systems — is through the [EU’s] Digital Services Act,” he told TechCrunch.
We’ve asked the regulator how the Code will therefore mitigate harms driven by algorithmic amplification, given that addressing such harms is another of its stated aims.
The finalized Code forms part of Ireland’s overall Online Safety Framework, established under the country’s Online Safety and Media Regulation Act, which aims to ensure digital services are held accountable for protecting users from online harm.
The EU’s DSA applies across the bloc, so it is also in force in Ireland, with Coimisiún na Meán responsible for enforcing the regulation’s general rules on any in-scope companies headquartered locally, in addition to overseeing the new Online Safety Code.
Commenting in a statement, Ireland’s Online Safety Commissioner, Niamh Hodnett, said: “The adoption of the Online Safety Code brings an end to the era of social media self-regulation. The Code sets binding rules for video-sharing platforms to follow in order to reduce the harm they can cause to users. We will work to make sure that people know their rights when they go online and we will hold the platforms to account and take action when platforms don’t live up to their obligations.”
In another supporting statement, executive chairperson of Coimisiún na Meán, Jeremy Godfrey, added: “With the adoption of the Online Safety Code, all the elements of our Online Safety Framework are now in place. Our focus now is on fully implementing the Framework and driving positive changes in people’s lives online.
“Our message to people is clear: if you come across something you think is illegal or against a platform’s own rules for what they allow, you should report it directly to the platform. Our Contact Centre is available to provide advice and guidance to people if they need help.”
Child safety concerns have driven a growing number of online safety initiatives on both sides of the Atlantic in recent years. These include the U.K.’s Online Safety Act (which passed into law just over a year ago) and the U.K.’s Age-Appropriate Design Code (which has been enforced since fall 2021). A child safety-focused bill, the Kids Online Safety Act (KOSA), is also progressing in the U.S.; it was first proposed back in 2022.
Source: TechCrunch