The National Authority for Management and Regulation in Communications of Romania (ANCOM) draws the attention of users of digital services to the difference between harmful content and illegal content in the online environment and explains how such content can be reported to digital platforms. Given the increasing volume of information shared online, it is important for users to understand this distinction, so they know whom to turn to and what steps to take when encountering problematic content.
More information on harmful content is available in the material published by ANCOM here.
What is harmful content?
Harmful online content refers to material published or distributed on the internet that, while not breaking the law, may negatively affect users’ safety, health, emotional wellbeing, dignity or development. This category includes, for example: material that promotes risky behaviour, content that is offensive or manipulative, or content that can negatively influence certain vulnerable groups of users.
Even if it is not illegal, harmful content may have significant social and psychological effects. For this reason, online platforms set their own internal rules for managing it – terms and conditions, community standards or moderation policies.
Harmful Content vs. Illegal Content Online
It is important to correctly identify the type of content because the reporting procedure differs. The main differences are:
- harmful content does not violate applicable legislation and is managed exclusively through the mechanisms provided by online platforms.
- illegal content violates a legal provision (criminal, civil, administrative, etc.) and may be removed pursuant to the mechanisms set forth in the Digital Services Act (DSA) and orders issued by the competent authorities in the relevant field.
Obligations of online platforms
Under the Digital Services Act, providers of intermediary services, including online platforms, have several obligations:
- provide users with ways to report illegal content;
- include in their general “Terms of Service” information regarding any restrictions they impose on the use of their service by users.
Additionally, when restricting content or accounts, platforms must:
- explain the reasons for that decision;
- provide users with the opportunity to appeal the decision by implementing an effective internal complaint-handling system.
In addition, very large online platforms and very large online search engines are required to assess and mitigate systemic risks, including those that may affect public health, the well-being of minors, civic discourse and electoral processes, public safety, the physical and mental well-being of individuals, or fundamental rights such as the right to dignity, the right to respect for private and family life, children’s rights, and freedom of expression and information.
Handling complaints on harmful content
ANCOM makes the following recommendations to users of online platforms:
- to read the platform’s “Terms of Service”, including the rules regarding prohibited content;
- if they notice such content, to identify the category of harmful content under which it may be classified. Correctly categorising the content helps the platform assess it accurately and effectively;
- to report the content directly to the platform, using the mechanisms it makes available (e.g. the ‘Report/Flag’ buttons).
In order to support users, the Authority has produced several guidelines for reporting content to platforms, available, in Romanian, here.
The role of ANCOM
ANCOM, as the Digital Services Coordinator in Romania, does not determine whether content is illegal or harmful and cannot order its removal.
However, if the platform has not provided a statement of reasons following the restriction of content, or has not provided the opportunity to challenge such a decision, users may file a complaint with ANCOM using the dedicated online form to report a possible violation of the DSA provisions by intermediary service providers.
Important!
ANCOM can verify compliance with these obligations only for intermediary service providers established in Romania. In the case of very large online platforms/search engines (such as Facebook, Instagram, TikTok, YouTube, etc.), oversight is carried out jointly by the European Commission and the Digital Services Coordinator of the country where the platform or search engine has its headquarters or designated legal representative. ANCOM will forward complaints regarding violations of the Regulation by such platforms to the aforementioned entities.
