Commission: In-depth investigation into TikTok services to protect minors


The European Commission has opened formal proceedings to assess whether TikTok may have breached the Digital Services Act (DSA) in areas linked to the protection of minors, advertising transparency, data access for researchers, and the risk management of addictive design and harmful content.

“The safety and well-being of online users in Europe is crucial. TikTok needs to take a close look at the services they offer and carefully consider the risks that they pose to their users – young as well as old. The Commission will now carry out an in-depth investigation without prejudice to the outcome,” Margrethe Vestager, Executive Vice-President for a Europe Fit for the Digital Age, said on February 19.

TikTok was designated as a Very Large Online Platform (VLOP) on April 25, 2023, under the EU’s Digital Services Act, following its declaration of 135.9 million monthly active users in the EU. As a VLOP, TikTok had to start complying with a series of obligations set out in the DSA four months after its designation.

Since February 17, the Digital Services Act has applied to all online intermediaries in the EU.

“The protection of minors is a top enforcement priority for the DSA. As a platform that reaches millions of children and teenagers, TikTok must fully comply with the DSA and has a particular role to play in the protection of minors online. We are launching this formal infringement proceeding today to ensure that proportionate action is taken to protect the physical and emotional well-being of young Europeans. We must spare no effort to protect our children,” stressed Thierry Breton, Commissioner for Internal Market.

Based on the preliminary investigation conducted so far, including an analysis of the risk assessment report sent by TikTok in September 2023 and TikTok’s replies to the Commission’s formal Requests for Information (on illegal content, protection of minors, and data access), the Commission has decided to open formal proceedings against TikTok under the Digital Services Act.

The proceedings will cover the following areas:

  1. Compliance with the DSA obligations related to assessing and mitigating systemic risks associated with TikTok’s system design. This includes algorithmic systems that may encourage addictive behaviour and lead to so-called ‘rabbit hole effects’. Such an assessment is necessary to counter potential risks to the fundamental right to physical and mental well-being, child rights, and impact on radicalization processes. The investigation will also assess the effectiveness of TikTok’s age verification tools for preventing minors’ access to inappropriate content.
  2. Compliance with DSA obligations to implement appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors, particularly default privacy settings for minors, as part of the design and functioning of their recommender systems.
  3. Compliance with DSA obligations to provide a searchable and reliable repository for advertisements presented on TikTok.
  4. Measures taken by TikTok to increase transparency on its platform. The investigation will also look into suspected shortcomings in providing researchers with access to TikTok’s publicly accessible data, as Article 40 of the DSA mandates.

If proven, these failures would constitute infringements of Articles 34(1), 34(2), 35(1), 28(1), 39(1), and 40(12) of the DSA. The Commission will now carry out an in-depth investigation as a matter of priority. The opening of formal proceedings does not prejudge its outcome.

The proceedings now being initiated do not affect any other proceedings that the Commission may open in the future over other conduct that may constitute an infringement of the DSA. For instance, these proceedings do not affect any provider’s obligations regarding the dissemination of illegal content, such as terrorist content or child sexual abuse material online, or their obligation to notify suspicions of criminal offences.

They are also without prejudice to enforcement actions undertaken by other authorities under other regulatory frameworks, such as the Consumer Protection Cooperation Network.

The Commission is empowered to take further enforcement steps.

After initiating formal proceedings, the Commission will continue to gather evidence and may take further enforcement measures, such as interim measures and non-compliance decisions. The DSA does not set a legal deadline for concluding formal proceedings; their duration depends on various factors, including the company’s cooperation. The opening of formal proceedings relieves Digital Services Coordinators of their powers to enforce the DSA in relation to the suspected infringements of Article 28(1).
