Online Safety Framework
Ireland’s Online Safety Framework is currently made up of three laws that are targeted at online platforms. These are:
- the Digital Services Act (DSA),
- the Online Safety and Media Regulation Act 2022 (OSMR), which is the basis for our Online Safety Code, and
- the EU Terrorist Content Online Regulation (TCOR).
The aim of this framework is to reduce the risk of people (especially young people) being exposed to illegal or harmful content online.
Digital Services Act
Jurisdiction: EU law
Targeted at ISPs (Intermediary Service Providers) that are based in Ireland, including VLOPs (Very Large Online Platforms) and VLOSEs (Very Large Online Search Engines), which are ISPs with more than 45 million monthly active users in the EU.
- VLOPs based in Ireland: Apple, Google (YouTube, Google Play, Google Maps, Google Shopping), Shein, LinkedIn, Meta (Facebook, Instagram), Microsoft, Pinterest, TikTok, Temu and X
- VLOSEs based in Ireland: Bing, Google
Purpose of law: To ensure a safe, predictable and trusted online environment, to protect users from illegal content online and to provide greater online safety while promoting innovation.
Penalty or fine: Up to 6% of annual global turnover
Online Safety and Media Regulation Act
Jurisdiction: Irish law
Targeted at video-sharing platforms that are based in Ireland.
- Facebook, Instagram, YouTube, Udemy, TikTok, LinkedIn, X, Pinterest, Tumblr, Reddit
Purpose of law: To protect children and the public from harmful and illegal content.
Penalty or fine: Up to €20 million or 10% of annual turnover
Terrorist Content Online Regulation
Jurisdiction: EU law
Targeted at Hosting Service Providers (HSPs) based in Ireland. HSPs are providers that store information that has been provided to them (e.g. social media platforms, cloud services, web hosting services).
Purpose of law: To counteract the dissemination of terrorist content.
Penalty or fine: Up to 4% of annual global turnover
Obligations of platforms
Some of these obligations apply to all platforms, while others only apply to certain platforms.
| Obligations | Digital Services Act | Online Safety and Media Regulation Act | Terrorist Content Online Regulation |
| --- | --- | --- | --- |
| Single point of contact made available | Yes | Yes | |
| Reporting functions are easy to find and easy to use | Yes | Yes | |
| Appeals process if you’re unhappy with a content moderation decision | Yes | Yes | |
| External audits on risk mitigation | Yes | | |
| Harmful and illegal content banned in their Terms and Conditions | Yes | | |
| Transparency around algorithms (choice given on the content you are shown) | Yes | | |
| No targeted ads to children | Yes | Yes | |
| Trusted flaggers who fast-track reporting | Yes | | An Garda Síochána is the only authority in Ireland that can oblige a platform to take down content |
| Media literacy initiatives | Yes | Yes | |
| Out-of-court settlement option | Yes | | |
| Nominated body to report to that will send information on to the regulator | Yes | | |
| Easy to understand Terms and Conditions | Yes | Yes | |
| Advertisements (including political ads) that are clearly labelled | Yes | Yes | |
Special precautions for children
Some of these precautions apply to all platforms, while others only apply to certain platforms.
| Precautions for children | Digital Services Act | Online Safety and Media Regulation Act | Terrorist Content Online Regulation |
| --- | --- | --- | --- |
| No targeted ads to children (if known) | Yes | Yes | |
| Age assurance | Yes | Yes | |
| External audits on risk mitigation | Yes | Yes | |
| Parental controls | Yes | Yes | |
| Illegal content restricted in the Terms and Conditions | Yes | Yes | |
| Illegal and harmful content restricted in the Terms and Conditions | Yes | Yes | |
| Easy to understand Terms and Conditions | Yes | | |
| Media literacy initiatives | Yes | Yes | |