Executive Summary
What’s new: The European Data Protection Board issued guidelines for navigating the interplay between the DSA and GDPR.
Why it matters: The guidelines cover:
Identifying illegal content and handling illegal content notices submitted by users.
Personalised advertisements.
Recommendation algorithms, protection of minors and risk assessments.
What to do next: Companies should consider benchmarking their existing DSA compliance programmes against the EDPB’s guidance to ensure that these programmes are consistent with the company’s GDPR documentation and tailored to withstand both DSA and GDPR regulator scrutiny. Companies may also want to comment before the consultation on the guidelines closes on 31 October 2025.
__________
Context
On 12 September 2025, the European Data Protection Board (EDPB) issued draft guidelines (Guidance) on the interplay between the EU General Data Protection Regulation (GDPR) and the Digital Services Act (DSA), the latter of which regulates online intermediaries and platforms (e.g., marketplaces and social networks). The Guidance is open for consultation until 31 October 2025.
The Guidance notes that many actions companies take to meet their DSA obligations will involve processing personal data and therefore require companies to comply with the GDPR. This can create a challenging compliance picture for companies because DSA and GDPR obligations both overlap and diverge, creating tension in some areas. Below, we summarise the key points from the Guidance.
Investigating Illegal Content and Implementing Notice Mechanisms
The DSA, like the EU’s earlier e-Commerce Directive, includes a “safe harbour” under which providers of “mere conduit,” caching and hosting services are, subject to certain conditions, not liable for the content transmitted using their services. Article 7 of the DSA clarifies that these service providers will not lose the benefit of the “safe harbour” just because they carry out voluntary investigations (e.g., scans) aimed at detecting illegal content on their platforms. The DSA also requires hosting services to implement “notice and action” mechanisms that allow users to report potentially illegal content on the platform.
The Guidance notes that, when implementing these obligations, companies will inevitably use some amount of personal data (e.g., personal data about the user who posted the illegal content, or personal data about the person who submitted the illegal content notice) and therefore need to comply with the GDPR when doing so. The Guidance reminds companies implementing DSA obligations that:
Regarding legal basis for processing: Because Article 7 investigations are voluntary, companies generally cannot rely on the “necessary to comply with a legal obligation” legal basis under the GDPR to conduct these searches, and will instead need to rely on the “legitimate interests” legal basis — and document this through a “legitimate interest assessment.”
Regarding automated decisions: Using automated tools to investigate and remove illegal content1 may amount to an “automated decision” under the GDPR, triggering obligations (i) that prohibit the use of special category data, (ii) to provide information about the decision-making process and (iii) to implement human intervention and other “measures to safeguard … rights and freedoms.” These obligations overlap with, and stand in addition to, the DSA’s obligations to describe and provide human oversight of automated decision-making.2
Regarding data minimisation: Companies should limit the processing of personal data to what is necessary for compliance with relevant DSA provisions. For example, the “notice and action” mechanism should not require the person submitting the illegal content notification to provide any personal data other than their name and email address.
Regarding transparency: In their privacy notices, companies must meet their GDPR obligations to describe the data processing they undertake to detect illegal content. This obligation applies in addition to the DSA’s obligations to describe content moderation measures — including any “algorithmic decision-making” in a company’s terms of use.3
Processing of Personal Data in Advertising
The Guidance reminds companies that the transparency requirements outlined in DSA Article 26 (which include providing “meaningful information directly … about the main parameters used to determine the recipient to whom the advertisement is presented”) must be implemented alongside the intersecting information required to be included in GDPR consent requests and privacy notices. Companies will want to pay particular attention to ensure that these overlapping transparency disclosures are consistent with each other, and remain consistent over time as the company’s advertising practices change.
Recommender Systems, Dark Patterns, Protection of Minors, Risk Management
Finally, in relation to the DSA’s obligations regarding recommender systems, the protection of minors and systemic risk assessments, the EDPB notes that the GDPR imposes analogous obligations to each of those, such as obligations regarding automated decision-making,4 data protection impact assessments and the heightened standard applied to children’s data. Again, companies should check that their DSA compliance documentation in these areas complements and builds on existing GDPR documentation covering similar topics in order to ensure that the entire suite of documentation presents a consistent picture to regulators.
Actions To Take
Companies that fall within the scope of the DSA should consider benchmarking their existing DSA compliance documentation against the EDPB’s guidelines, and reviewing their DSA compliance documentation to ensure it complements and is consistent with their GDPR compliance documentation.
Companies should also consider commenting on the EDPB’s public consultation on the draft guidelines — though in practice the EDPB is typically unreceptive to comments from industry.
Trainee solicitor Harry Reeves contributed to this article.
_______________
1 For example, Article 16(6) DSA envisages the use of automated means for processing illegal content notifications.
2 For example, under Articles 15, 17(3)(c), and 20(6) DSA.
3 Article 14(1) DSA.
4 Although not discussed at length in the Guidance, the EDPB’s suggestion that recommender systems may be “automated decisions” for the purposes of GDPR Article 22 represents an expansive interpretation of Article 22. While that expansive interpretation is not necessarily surprising in light of the European Court of Justice’s similarly expansive interpretation (e.g., in the SCHUFA Holding (Scoring) case, C-634/21), it is a reminder that GDPR regulators view a wide range of IT systems as falling within the scope of Article 22.
This memorandum is provided by Skadden, Arps, Slate, Meagher & Flom LLP and its affiliates for educational and informational purposes only and is not intended and should not be construed as legal advice. This memorandum is considered advertising under applicable state laws.

For further information, please contact:
Nicola Kerr-Shaw, Partner, Skadden
nicola.kerr-shaw@skadden.com
