Today, the European Commission preliminarily found both TikTok and Meta in breach of their obligation to grant researchers adequate access to public data under the Digital Services Act (DSA). The Commission also preliminarily found Meta, for both Instagram and Facebook, in breach of its obligations to provide users with simple mechanisms to notify illegal content, as well as to allow them to effectively challenge content moderation decisions.
Data access for researchers
The Commission's preliminary findings show that Facebook, Instagram and TikTok may have put in place burdensome procedures and tools for researchers requesting access to public data. This often leaves researchers with partial or unreliable data, impairing their ability to conduct research, such as into whether users, including minors, are exposed to illegal or harmful content.
Allowing researchers access to platforms' data is an essential transparency obligation under the DSA, as it enables public scrutiny of the potential impact of platforms on our physical and mental health.
Notice and Action mechanisms
As regards Meta, neither Facebook nor Instagram appears to provide a user-friendly and easily accessible 'Notice and Action' mechanism for users to flag illegal content, such as child sexual abuse material and terrorist content. The mechanisms that Meta currently applies seem to impose several unnecessary steps and additional demands on users. In addition, both Facebook and Instagram appear to use so-called 'dark patterns', or deceptive interface designs, in their 'Notice and Action' mechanisms.
Such practices can confuse and dissuade users. Meta's mechanisms to flag and remove illegal content may therefore be ineffective. Under the DSA, 'Notice and Action' mechanisms are key to allowing EU users and trusted flaggers to inform online platforms that certain content does not comply with EU or national law. Online platforms do not benefit from the DSA's liability exemption where they have not acted expeditiously after being made aware of the presence of illegal content on their services.
Content moderation appeals
The DSA also gives users in the EU the right to challenge content moderation decisions when platforms remove their content or suspend their accounts. At this stage, the appeal mechanisms of both Facebook and Instagram do not appear to allow users to provide explanations or supporting evidence to substantiate their appeals. This makes it difficult for users in the EU to explain why they disagree with Meta's content moderation decisions, limiting the effectiveness of the appeals mechanism.
The Commission's views related to Meta's reporting tool, dark patterns and complaint mechanism are based on an in-depth investigation, including co-operation with Coimisiún na Meán, the Irish Digital Services Coordinator.
These are preliminary findings which do not prejudge the outcome of the investigation.
Next steps
Facebook, Instagram and TikTok now have the possibility to examine the documents in the Commission's investigation files and reply in writing to the Commission's preliminary findings. The platforms can take measures to remedy the breaches. In parallel, the European Board for Digital Services will be consulted.
If the Commission's views are ultimately confirmed, the Commission may issue a non-compliance decision, which can trigger a fine of up to 6% of the total worldwide annual turnover of the provider. The Commission can also impose periodic penalty payments to compel a platform to comply.
New possibilities for researchers will open up on 29 October 2025, when the delegated act on data access comes into force. This act will grant researchers access to non-public data from very large online platforms and search engines, with the aim of enhancing their accountability and identifying potential risks arising from their activities.
Background
The preliminary findings are part of the formal proceedings the Commission launched into Meta, and into TikTok, under the DSA. The Commission continues its investigation into other potential breaches that are part of these ongoing proceedings. These formal proceedings under the DSA are distinct from ongoing investigations against Facebook, Instagram and TikTok concerning compliance with other relevant EU law.