
Digital Services Act: points of contact with the GDPR



What is the Digital Services Act?

Regulation (EU) 2022/2065 of 19 October 2022 on a Single Market for Digital Services (the “Regulation”), better known as the Digital Services Act (“DSA”) – which amends Directive 2000/31/EC (the e-Commerce Directive) – has been described as the new digital constitution of the European Union. It establishes a uniform framework applicable to intermediary services in the internal market, with the aim of ensuring a safe, predictable and trusted online environment in which fundamental rights are effectively protected and the innovation and growth of SMEs are facilitated, while at the same time countering the dissemination of illegal content online and the societal risks that the spread of disinformation or other harmful content may generate.


The provisions introduced by the DSA will: (i) make it easier to report illegal content, providing enhanced protection against dangerous goods and illegal content through mandatory notice mechanisms; (ii) provide greater protection for victims of online harassment and bullying, ensuring among other things that private images shared without consent can be reported promptly by users; and (iii) restrict targeted advertising and require the publication of terms and conditions in a simplified form.


When did it enter into force?

The DSA entered into force on 16 November 2022 and has applied since 17 February 2024, although certain obligations on online platforms and online search engines designated by the Commission as 'very large' (those with an average of 45 million or more monthly active recipients of the service in the EU) already applied from the date of entry into force.


To whom does the DSA apply?

The DSA applies to intermediary services offered online to recipients established or located in the EU, irrespective of where the providers of those intermediary services have their place of establishment.

Intermediary services are defined as:

o   non-hosting activities, which include mere conduit and/or caching services;

o   hosting activities (information storage services).

This extremely broad definition makes the DSA applicable to many categories of service providers, including, for example, marketplaces, social platforms, content-sharing platforms, application stores and travel-related platforms.


What are the main obligations?

The DSA imposes four levels of strict, cumulative and increasing obligations on intermediaries, corresponding to their role and size. The Regulation distinguishes between common minimum obligations, applicable to all intermediaries, and additional and more stringent obligations, differentiated according to the type and size of the intermediary.

The common minimum obligations, regulated in Articles 11-15 of the DSA, are based on the principle of due diligence, and provide for:

o   the obligation to establish a single point of contact for Authorities and users to enable direct and rapid communication;

o   the obligation to cooperate with the competent Authorities to ensure compliance with the obligations arising from the Regulation;

o   the obligation to draft the 'Terms and conditions of use' of the services in clear, comprehensible language and in an accessible format, including clear information on the restrictions imposed on the use of the services, the tools used for content moderation (such as any algorithmic decision-making) and the complaint-handling procedures;

o   the obligation to prepare and make publicly available annual reports on the content moderation activities engaged in during the relevant period.


The second level of obligations concerns hosting services and requires: (i) the adoption of mechanisms for notifying illegal content; and (ii) the notification to the competent authorities of suspected criminal offences.

In addition, the DSA imposes further obligations on online platforms (with the exception of micro and small enterprises), such as: (i) the adoption of mechanisms for reporting illegal content; (ii) the introduction of internal complaint-handling mechanisms; and (iii) transparency obligations for advertisements presented on their platforms.

Finally, at the top of the pyramid sit the most stringent obligations, reserved for very large online platforms and very large online search engines, which are required to implement a risk management and mitigation system.

Failure to comply with these obligations exposes non-compliant providers of intermediary services to specific liabilities, which may result in the imposition of penalties of up to 6% of the intermediary service provider's annual worldwide turnover.


How does the DSA interact with the GDPR?

The DSA fits into the complex fabric of the Union's digital legislation, with cross-cutting effects on other sources, such as the General Data Protection Regulation (EU) 2016/679 ('GDPR').


The DSA expressly recalls certain provisions of the GDPR, specifying that, in areas of actual overlap, the latter prevails as lex specialis. Consequently, intermediary service providers will have to comply with both the provisions of the DSA and the GDPR in the context of activities involving the processing of personal data.


In particular, the interaction between the DSA and the GDPR concerns the following aspects:

o   Automated content moderation. Where an online platform uses an automated system (e.g. an algorithm) to identify and moderate illegal content or to handle complaints, it must comply with:

(i)     the limits imposed by Article 22 of the GDPR, which restricts decisions based solely on automated processing that produce legal or similarly significant effects on the data subject, such as the removal of content. Within these limits, even the handling of complaints cannot be entirely automated, but must take place under human supervision, with the involvement of appropriately qualified and trained personnel;

(ii)   the rules laid down in the GDPR, where removal of the content is requested on the basis of the right to be forgotten;

(iii)  the transparency obligations provided for by the GDPR and reiterated by the DSA, which require that the terms and conditions include clear and comprehensible information on the tools used for content moderation.

o   Profiling for content recommendation. Where profiling is used to recommend content or information to users, the online platform shall:

(i)     design the platform's interface so as to allow users to make autonomous choices and, in accordance with the principle of privacy by default laid down in the GDPR, disable by default the recommendation of content based on profiling, unless the recipient has given explicit consent;

(ii)   respect the limits set by Article 22 of the GDPR, since profiling may have a significant impact on the data subject by influencing his or her choices, and in any case requires his or her consent;

(iii)  comply with the transparency obligations provided for by both the GDPR and the DSA. In addition, the DSA imposes further transparency obligations on intermediaries that present advertisements on their online interfaces, to ensure that recipients can clearly identify that the content is an advertisement, on whose behalf it is presented and the parameters used to determine its recipient;

(iv)  allow the recipient to select the option of his or her preference and to configure the order of the recommended information. In this way, the DSA aligns with the GDPR, leaving the final decision to the user;

(v)   comply with Article 26(3) of the DSA, which prohibits presenting advertisements based on profiling using special categories of personal data, as defined in Article 9 of the GDPR.


o   The protection of minors. The DSA, like the GDPR, aims to protect minors, who are regarded as a category requiring enhanced protection against targeted marketing and as potential victims of online abuse. Where access to an online service requires consent to the processing of personal data, the GDPR provides that such consent may be given directly by the minor if he or she has reached the age of 16 (in Italy, 14 years of age, pursuant to art. 2-quinquies of Legislative Decree 196/2003); otherwise, consent must be given by the parents. The DSA strengthens the protection of these categories, for example by providing that, where a service is intended for or predominantly used by minors, the intermediary must give clear explanations, expressly addressed to minors, of the conditions and restrictions applicable to the service, and must put in place age verification mechanisms. The effects of the DSA are already visible: on 19 February 2024, the European Commission opened formal proceedings against TikTok over suspected breaches of DSA provisions, including those relating to the protection of minors.

 


Authors:   Consuelo Leonardi – Beatrice Olivo

Contact: Avv. Eduardo Guarente e.guarente@bergsmore.com

