Today, the European Commission has made a preliminary finding that both TikTok and Meta have failed to fulfil their obligations to provide adequate access to public data for researchers, as stipulated by the Digital Services Act (DSA).
Furthermore, the Commission has preliminarily concluded that Meta, which operates Instagram and Facebook, has breached its obligations to offer users clear and effective mechanisms for reporting illegal content and for contesting content-moderation decisions.
“Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice. With today’s actions, we have now issued preliminary findings on researchers’ access to data to four platforms. We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society,” highlighted Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy.
The Commission’s preliminary findings suggest that Facebook, Instagram, and TikTok may have implemented complex procedures and tools that hinder researchers’ ability to access public data. This situation frequently results in researchers obtaining partial or unreliable data, which adversely affects their capacity to assess whether users, including minors, are exposed to illegal or harmful content.
Granting researchers access to platform data is a vital transparency obligation under the DSA, as it enables public scrutiny of the potential impact of these platforms on both physical and mental health.
Regarding Meta, it appears that neither Facebook nor Instagram offers a user-friendly or easily accessible “Notice and Action” mechanism for users to report illegal content, such as child sexual abuse material and terrorist content. The mechanisms Meta currently employs impose unnecessary steps and additional demands on users.
Additionally, both platforms appear to employ “dark patterns”, that is, deceptive interface designs, within the “Notice and Action” process. Such practices can confuse and dissuade users from reporting, thereby diminishing the effectiveness of Meta’s mechanisms for flagging and removing illegal content.
Under the DSA, adequate “Notice and Action” mechanisms are essential for enabling EU users and trusted flaggers to notify online platforms when certain content does not comply with EU or national laws. It is important to note that online platforms do not benefit from the DSA’s liability exemption if they fail to act promptly upon being informed of illegal content on their services.
The Digital Services Act provides users within the European Union with the right to challenge content moderation decisions made by platforms, particularly when their content is removed or their accounts are suspended. Currently, the appeal processes available on both Facebook and Instagram do not permit users to submit explanations or supporting evidence to substantiate their appeals. This limitation complicates EU users’ ability to convey their disagreements with Meta’s content decisions effectively and impacts the overall efficacy of the appeals mechanism.
Furthermore, the Commission’s observations regarding Meta’s reporting tool, the presence of dark patterns, and the complaint mechanism are based on a comprehensive investigation conducted in cooperation with Coimisiún na Meán, the Irish Digital Services Coordinator. It is essential to recognise that these findings are preliminary and do not prejudge the outcome of the investigation.
Meta, in respect of Facebook and Instagram, and TikTok now have the opportunity to examine the documents in the Commission’s investigation file and respond to its preliminary findings. They may also take measures to remedy any identified violations.
Additionally, the European Board for Digital Services will be consulted throughout this process. Should the Commission’s conclusions remain unchanged, it may issue a non-compliance decision, which could result in fines of up to 6% of the provider’s total worldwide annual turnover, along with periodic penalty payments to compel compliance.
Furthermore, on 29 October, new opportunities will open for researchers as the delegated act on data access enters into application. This act will facilitate access to non-public data from major online platforms and search engines, thereby enhancing accountability and enabling the identification of potential risks.
