Pursuant to the DSA, providers of intermediary services have the following main obligations:
✅ to designate:
- a single point of contact for communication with competent authorities (Article 11 of the DSA)
- a single point of contact for recipients of the service (Article 12 of the DSA)
- where they have no establishment in the EU, a legal representative in the EU, and notify that representative's name, postal address, email address and telephone number to the Digital Services Coordinator of the Member State where the legal representative resides or is established (Article 13 of the DSA)
✅ to include in their general terms and conditions (Article 14 of the DSA):
- any restrictions they apply to the use of the service and the content of users
- their policies, procedures, measures and tools used for content moderation, including:
  - the use of automated systems (algorithms)
  - human review of decisions
  - the rules of procedure of the internal complaint-handling system
The information in the general terms and conditions must be:
- presented in plain and unambiguous language
- publicly accessible
- provided in an easily accessible and machine-readable format
Providers must:
- inform users of any significant changes to the general terms and conditions
- where the service is primarily directed at minors, explain the terms and conditions and any restrictions in language appropriate to their age
When applying their rules, providers must act:
- in good faith, objectively and proportionately
- with due respect for the fundamental rights of all parties concerned – including freedom of expression and media pluralism, as guaranteed by the Charter of Fundamental Rights of the European Union
✅ to publish, at least once a year, reports on content moderation (Article 15 of the DSA):
The reports, which shall be public, clearly structured, easily accessible and in a machine-readable format, shall contain information on:
- Orders from authorities:
  - the number of orders received to act against illegal content
  - the number of orders to provide information
  - the Member State issuing the order
  - the type of content concerned
  - the time taken to give effect to them
- Self-initiated moderation:
  - the use of automated tools
  - the moderation measures taken
  - the number and type of restrictions affecting:
    - the availability and visibility of content
    - the ability of users to post information
  - categorisation by type of violation and by the detection methods used
- Complaints from users:
  - the number of complaints received through the internal complaint-handling system, handled in accordance with the general terms and conditions
- Automated systems:
  - a description of the automated tools used
  - their purposes
  - accuracy indicators
  - the possible rate of error
  - the safeguards applied against erroneous decisions
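Since Article 15 requires the reports to be machine-readable, a provider might serialise the categories above as structured data such as JSON. The sketch below is purely illustrative: the field names and figures are assumptions for the example, not an official Commission reporting template.

```python
import json

# Hypothetical sketch of a machine-readable content moderation report
# covering the Article 15 categories. All field names and numbers are
# illustrative assumptions, not an official template.
report = {
    "reporting_period": {"from": "2024-01-01", "to": "2024-12-31"},
    "authority_orders": {
        "orders_to_act_against_illegal_content": 120,
        "orders_to_provide_information": 45,
        "by_member_state": {"DE": 60, "FR": 40, "PL": 65},
    },
    "self_initiated_moderation": {
        "automated_tools_used": True,
        "restrictions": {
            "content_visibility": 900,
            "ability_to_post": 150,
        },
    },
    "user_complaints": {
        "received_via_internal_system": 310,
    },
    "automated_systems": {
        "description": "classifier flagging potentially illegal content",
        "accuracy": 0.94,
        "possible_rate_of_error": 0.06,
    },
}

# Emit the report in a publicly accessible, machine-readable form.
print(json.dumps(report, indent=2))
```

Publishing such a file alongside the human-readable report would satisfy the "clearly structured, easily accessible and machine-readable" requirement in both forms.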
