What can I report?
If you come across illegal or harmful content on a platform, you should first report the content to the platform in question.
If you have difficulty submitting a report to a platform, or if you have concerns that a platform has not followed the correct procedures for handling a report, then you should contact us. We value every contact we get from the public. The more information and contact we get, the better we can do our job of engaging with platforms to make sure they comply with these new rules. If, through our engagement and complaints from the public, we find that platforms are not following these rules, we can investigate and issue significant fines (depending on the circumstances, up to €20 million or 10% of annual turnover).
Examples of what you can report to Coimisiún na Meán
Issues contacting a platform, reporting content or making complaints
Under the Digital Services Act, reporting functions on platforms must be easy to use and easy to find. If you see illegal content on a platform and there is no way to report it, or if you cannot locate the reporting function, you can contact us. Our Advice Centre can advise you on where the reporting function is, or, if there is no reporting function on the platform, you may be able to make a complaint to us.
Under the Digital Services Act, if you make a report about illegal content, you must get a reply from the platform in a timely manner, acknowledging that they received your report. If you report illegal content and the platform doesn’t remove it, they need to tell you about their decision not to remove it. If you report illegal content and hear nothing back from the platform, this is against the rules and you can contact us to make a complaint.
Under the Digital Services Act, if you report illegal content, you must get a reply from the platform in a timely manner, acknowledging that they received your report. If the platform thinks the content you have reported is either 1) not illegal or 2) does not go against their terms and conditions, they may decide not to remove this content, but they must tell you about their decision not to remove it and you must get a chance to appeal this decision. You should also be given the chance to appeal a decision if a platform removes your content following a report from another person. You have six months from the date of a decision to lodge an appeal with the platform. You must be able to submit this appeal electronically and free of charge. If you don’t get a chance to appeal the platform’s decision, this is against the rules and you can contact us to make a complaint.
Under the Digital Services Act, an online service provider must provide a single point of contact for users to contact the provider electronically, allowing for direct and rapid communication.
The single point of contact must be user-friendly and must allow users to choose a means of communication that is not fully automated. The information users need to easily find and communicate with a service’s single point of contact must be made public, easy to access and kept up to date. See links to some of the largest online service providers.
If you believe an online service provider has not met its obligations in relation to its single point of contact, you can make a complaint to us.
A hosting service provider is obliged to promptly inform law enforcement or judicial authorities if it becomes aware of any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place.
If you believe a hosting service provider has not met its obligations to notify law enforcement or judicial authorities in these circumstances, you can make a complaint to us.
An online platform must not influence, deceive or manipulate you through its design and online interface. A platform cannot influence you to make certain decisions, for example, by having pre-selected options, or by presenting you with a false sense of urgency. The design of a platform should not limit your ability to make free and informed decisions. You should also not be repeatedly asked to make a choice, for example, by presenting pop ups that interfere with your experience. Finally, it should not be harder for you to cancel a subscription than to subscribe.
If you believe an online platform provider has not met these obligations and its design has negatively influenced your use of the platform, you can contact us to make a complaint.
Issues regarding my content being removed/my account being suspended
If your content is illegal or does not comply with the platform’s terms and conditions, a platform can remove or restrict the visibility of your content, suspend or terminate your account or suspend, terminate or restrict payments to you. However, under the Digital Services Act, if a platform takes any of these actions and if it has your email address, the platform must provide you with a statement of reasons.
The statement of reasons must be clear, easy to understand and as precise as possible. It must contain:
– Information on whether the content has been removed, restricted, disabled or demoted, or whether payments have been suspended or terminated
– The geographical area to which the decision applies
– The duration of the decision
– The circumstances of the decision, including whether it came from the platform’s own investigations or from a report of illegal content and, where strictly necessary, the identity of the person who made the report
– Information on the use of automation in identifying the content or in making the decision
– Why the content is considered illegal, with reference to the legal ground (if the decision relates to illegal content)
– Clear and user-friendly information on redress options
If a platform does not provide you with a sufficient statement of reasons, you can contact us to make a complaint.
A platform must provide a way for you to make a complaint if you are not satisfied with the platform’s decision to restrict or remove content, suspend or terminate your account or suspend or restrict your ability to receive payments.
The complaint system must be easy to use and must allow you to submit the necessary information for the platform to consider your complaint. You must be able to submit your complaint electronically and free of charge. You have six months from the date of the decision to lodge a complaint with the platform provider.
The platform must handle the complaint in a diligent, non-discriminatory way. The platform must reverse its decision quickly if your complaint shows that your content/conduct is not illegal or does not breach the terms and conditions. A platform must have appropriately qualified staff supervising this decision and the process cannot be fully automated.
A platform is required to notify you of its decision in relation to your complaint and its reasons for it without undue delay. The provider must also advise you of the options for redress, in the event you are not satisfied with the decision.
If you believe an online platform provider has not met its obligations in relation to its internal complaint handling system, you can contact us to make a complaint.
The terms and conditions of a platform must explain the rules for using its service. Platforms must clearly explain their policy for suspending or limiting the account of a user. They must give examples of what they consider when assessing behaviour and they must also be clear about how long a suspension will last. If you think a platform has not met its obligations in relation to its terms and conditions, you can contact us to make a complaint.
Platforms must suspend users who frequently submit reports or complaints/appeals that are considered manifestly unfounded. Manifestly unfounded means that it is clear, without any in-depth analysis, that the report or complaint is unfounded. Before you are suspended from making complaints, the platform must give you a warning, and the suspension must be for a reasonable period of time.
When deciding on suspension, the platform must assess, on a case-by-case basis, the circumstances based on the information it has. These circumstances include:
– The absolute number of manifestly unfounded reports or complaints submitted within a given time period
– The number of such unfounded reports or complaints relative to the total number of items posted or reports submitted within a given time period
– The gravity of the misuse, including the nature of the illegal content, and its consequences
– The intention of the user, individual, entity or complainant, where it is possible to identify it
An online platform’s terms and conditions must explain its policy for suspending certain users for misuse and provide examples of the facts and circumstances that it considers when deciding to suspend a user.
If you believe an online platform provider has not met its obligations in relation to suspending you from submitting reports or complaints, you can make a complaint to us.
Online platforms must suspend users who frequently provide manifestly illegal content. Before you are suspended, the platform must give you a warning, and the suspension must be for a reasonable period of time.
When deciding on suspension, the platform must assess, on a case-by-case basis, the circumstances based on the information it has. These circumstances include:
– The absolute number of items of manifestly illegal content submitted within a given time period
– The number of such items relative to the total number of items posted or reports submitted within a given time period
– The gravity of the misuse, including the nature of the illegal content, and its consequences
– The intention of the user, individual, entity or complainant, where it is possible to identify it
If you believe an online platform provider has not met its obligations in relation to suspending you for the content you posted, you can make a complaint to us.
Issues relating to advertising
The Digital Services Act aims to build a more transparent, safer and fairer online world. Advertisements, whether political or for goods and services, must be clearly labelled. It should also be clear who is advertising and who paid for the ad. If you notice that ads are not clearly labelled on platforms, you can contact us.
You must be able to view the reasons you were chosen to be shown an ad and you must be given information on how you can change these settings. An ad must not be shown to you based on certain personal data, including race, ethnic origin, political opinion, membership of a trade union, religious or philosophical beliefs, health data, your sex life or sexual orientation.
If you believe a platform has not met its obligations in relation to advertising, you can contact us to make a complaint.
A platform intended for use by children must put measures in place to ensure a high level of privacy, safety and security. Ads on these services must not target children based on profiling, using the data they have about the child. If you believe a platform has not met its obligations in relation to advertising to children, you can contact us to make a complaint.
Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) must provide information about ads, in a public section of their online interface (an advertising repository).
The following information must be included:
– the content of the ad
– who is advertising and who paid for the ad (if different)
– the timeframe for the ad campaign
– what targeting was used
– the commercial communications published on VLOPs
– the total number of people reached and, where applicable, a breakdown of that number by Member State
This information must be publicly accessible and available for the period the ad is running and also for one year after the ad has run. Providers of VLOPs and VLOSEs must also ensure that the repository does not contain any personal data of those who were shown the ad.
If you believe a VLOP/VLOSE has not met its obligations in relation to advertising, you can contact us to make a complaint. Complaints in relation to these matters may be passed on to the European Commission for consideration and decision.
Recommender systems/’For you’ feeds
A platform that uses recommender systems must explain in its terms and conditions how content is selected for you. The explanation must include the main parameters used in its recommender systems, any options for you to change those parameters, the most important criteria used and why those criteria are important. Where a platform offers a choice of recommender systems, it must allow you to select your preferred one.
In addition to the above obligations, Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) that use recommender systems must provide at least one option which is not based on profiling using the data they have about you.
If you believe a platform has not met its obligations in relation to recommender systems, you can make a complaint to us. Complaints relating to VLOPs or VLOSEs may be passed on to the European Commission for consideration and decision.
Certain protections for children
The Online Safety Code contains rules on protecting the public from harm online. It applies to certain video-sharing platforms based in Ireland. Video-sharing platforms must include in their terms and conditions, and apply, certain protections for children from video content, including video ads, which may harm them. Video-sharing platforms must also provide parental control systems that are under the control of the user.
You can contact us with information about harmful video content or video advertising. We can use this information to better understand how video-sharing platforms are complying with their obligations under the Online Safety Framework.
What you can’t report to Coimisiún na Meán
The Online Safety Framework enables us to investigate systemic risks on platforms and make sure that platforms have processes and systems in place to reduce the illegal and harmful content that the public, and especially children, may see. We have a Contact Centre that can handle queries about our role, but at times there can be misconceptions about what we can do and what the public is able to report to us.
Here are some examples of situations where it is not within our remit to take action:
As a media regulator, our role is to make sure the online platforms that we regulate are following the rules under the Online Safety Framework. We are not a content moderator. We are not An Garda Síochána. We are not a judge. We do not have the power to immediately remove illegal or harmful content from the internet. We need to understand how effective reporting functions are on the platforms we regulate. For example, if you report illegal content and don’t get a reply in a timely manner and the content is not removed, or the platform doesn’t tell you about its decision not to remove it, you can contact us. Our Advice Centre will thank you for your report, give you advice and record your concerns. We are not able to remove this content for you. One of the options available to us is to pass this information on to our Platform Supervision team, who will work to ensure that the platforms improve their systems.
If you fear for your immediate safety, your first port of call is always An Garda Síochána on 999 or 112. Depending on the severity, sending threatening and offensive messages may be illegal, as it may be considered harassment, so it is important to report it to An Garda Síochána. If you have reported the illegal content in these posts to the platform and the platform hasn’t removed the content, or given you a way to appeal this, you can contact us.
If you are in Ireland or in any EU country and you or someone else are in immediate danger or at risk of harm right now, please phone the emergency services by calling 112 from any phone. This number is free of charge.
If you want to report a crime or suspected criminal activity, contact An Garda Síochána, either at your local Garda station or by using the free Garda Confidential line on 1800 666 111.
If you suspect a child is a victim of abuse or is at risk of abuse, or if you have concerns about images stored on private devices or content (i.e. images) being shared via encrypted private communications such as messaging apps, you can contact your local Garda station or the free Garda Confidential line on 1800 666 111.
We are not the competent authority to look into issues of this kind. If you contact us, we will direct you to the Competition and Consumer Protection Commission (CCPC). Their helpline is 01-4025555.