Legislation explained
Digital Services Act (DSA)
What is it?
The Digital Services Act is a set of EU rules that apply in Ireland. The DSA aims to build a safer and fairer online world. Its rules protect all users in the EU equally, both in relation to illegal goods, content or services and in relation to their fundamental rights. The DSA also aims to resolve systemic issues on platforms that result from the design or functioning of their services or related systems.
Who does the DSA apply to?
The DSA applies to all online intermediary service providers (ISPs) that provide services in the EU, for example, online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.
Small and micro-enterprises are exempted from some rules that might be more burdensome for them.
Additional obligations apply to very large online platforms (VLOPs) and very large online search engines (VLOSEs), which have more than 45 million average monthly active users in the EU.
The VLOPs based in Ireland are Apple, Google (YouTube, Google Play, Google Maps, Google Shopping), Shein, LinkedIn, Meta (Facebook, Instagram), Microsoft, Pinterest, TikTok, Temu and X.
The VLOSEs based in Ireland are Bing and Google Search.
DSA obligations for online platforms
Some of these obligations apply to all ISPs, while others apply only to the larger ones.
Online platforms should have the following:
- Easy-to-understand terms and conditions.
- Reporting mechanisms that are easy to find and easy to use.
- Appeals processes (i.e. platforms must give a reason why a post is or isn't removed and provide a way to appeal that decision).
- Processes to assess, mitigate and respond to risks resulting from the use of their services.
- External audits of risk mitigation measures (e.g. how platforms keep children safe, such as through parental controls).
- Annual transparency reports on content moderation.
- Transparent advertising and recommender systems.
What is the role of Coimisiún na Meán?
Coimisiún na Meán monitors Irish-based service providers and ensures they are following the rules under the DSA.
If you believe that an online platform is failing in the obligations above, you can report it to us. To learn more about how to report and what you need to include, see our online reporting section. Our Advice Centre may need more information from you and will check a) whether you are based in Ireland and b) whether the platform you are reporting is based in Ireland. If you or the platform are not based in Ireland, we may need to refer you to the regulator in the country where the platform is based.
We have the power to start an investigation if platforms systematically break the rules under the DSA. Investigations can take time, but if a platform is found to have broken these rules, we can issue fines of up to €20 million or up to 6% of turnover, whichever is greater. For example, for a platform with an annual turnover of €1 billion, the maximum fine would be €60 million, since 6% of that turnover is greater than €20 million.
Online Safety and Media Regulation Act (OSMR)
What are these?
- The Online Safety and Media Regulation Act is an Irish law that established Coimisiún na Meán as the regulator of traditional and online media in Ireland.
- Following a public consultation, and taking into account a high number of submissions – including from members of the public – we developed the Online Safety Code, a binding set of rules for video-sharing platforms.
- The Online Safety Code (OSC) was published in October 2024.
Who does the Online Safety Code apply to?
The OSC applies to providers of video-sharing platform services (VSPs) that are based in Ireland. VSPs are online services that allow users to upload and share user-generated videos with the public. The designated VSPs are YouTube, TikTok, Pinterest, Reddit, Tumblr, Facebook, Instagram, Udemy, LinkedIn and X.
What is not allowed under the Online Safety Code?
- Illegal content such as child sexual abuse material, child trafficking, terrorist content, and offences concerning racism and xenophobia or incitement to violence or hatred against a group of persons based on characteristics such as language, colour, ethnicity, religion, age or sexual orientation.
- Content that is not illegal but is harmful to everyone, and especially to children: cyberbullying, pornography, gross violence, dangerous challenges, and content promoting eating disorders, suicide or self-harm – where the harm creates a risk to a person's life or health.
What are the obligations of the video-sharing platform service providers in relation to the Online Safety Code?
VSP providers must make sure that illegal and harmful content (see the examples above) is not allowed, and that this is outlined in their Community Guidelines/Terms and Conditions. They must also provide tools such as age assurance and parental controls where appropriate, and support or deliver media literacy initiatives.
What happens if video-sharing platforms don’t comply with the Online Safety Code?
We can investigate and impose fines of up to €20 million or 10% of turnover, whichever is greater.
Terrorist Content Online Regulation (TCOR)
What is it?
The Terrorist Content Online Regulation (TCOR) is a set of EU-wide rules that aims to stop the sharing of terrorist content online and to allow for its speedy removal. Terrorist content includes the promotion, glorification and encouragement of terrorist activity, as well as calls for others to engage in such acts.
Who does the TCOR apply to?
The TCOR applies to hosting service providers (HSPs). If you store data that has been provided to you, you may be a hosting service provider. For example, when you upload or download an image or text, that data is stored somewhere. If a platform or organisation hosts and stores data in this way, the TCOR applies to it. It is important to note that the TCOR doesn't apply to private messages, e.g. on WhatsApp.
Who is involved in enforcing the TCOR?
- An Garda Síochána is the only organisation (or 'competent authority' under the TCOR) in Ireland that can force a platform to remove terrorist content – this is known as a Removal Order. The platform has one hour to remove the content once it has received the Removal Order.
- At the same time, An Garda Síochána will send the Removal Order to the competent authority in the country where the platform is established.
- The HSP has the opportunity to appeal the Removal Order if it considers the order inappropriate for any reason.
- Under the TCOR, Coimisiún na Meán has been a 'competent authority' in Ireland since 30 November 2023, able to receive reports about violations of the TCOR from competent authorities in other Member States.
What are the obligations of the HSPs?
There are specific measures that HSPs need to take to prevent the dissemination of terrorist content. For example, putting in place effective technical and operational measures to quickly remove or disable access to terrorist content, ensuring there are user-friendly reporting mechanisms and having a specific point of contact for Removal Orders.
What happens if an HSP doesn’t remove the terrorist content?
If an HSP doesn't remove terrorist content after a Removal Order has been issued by An Garda Síochána, the platform may have breached its obligations under the TCOR. Coimisiún na Meán can impose fines of up to 4% of the HSP's annual global turnover.
What do I need to do if I see terrorist content?
If you see a post online that you think might contain terrorist content, report it immediately to the platform. You should also report it to An Garda Síochána and to us.