The Digital Services Act (DSA) encourages codes of conduct to be developed with the purpose of “tackling different types of illegal content and systemic risks”, or for the specific purposes of online advertising or accessibility. This approach follows in the footsteps of the General Data Protection Regulation (GDPR). So what can we learn from those codes?

By Carl Vander Maelen
The European Union has debuted a string of legislative instruments to regulate different aspects of the information and communications technology (ICT) sector. The General Data Protection Regulation (GDPR) entered into application in 2018 and renewed the EU’s data protection framework; the proposal for an Artificial Intelligence Act (AIA) seeks to mitigate the risks posed by AI technologies; and the Digital Services Act (DSA) regulates the content offered on digital platforms.
While diverse in goals, scholars have noted that recent instruments seem to follow the template laid out by the GDPR, a phenomenon dubbed ‘GDPR mimesis’. A striking example thereof is how articles 45-47 DSA encourage the development of codes of conduct, with several elements being reminiscent of articles 40-41 GDPR and its own call for codes.
How similar is the DSA’s approach to codes of conduct to the GDPR’s approach? And what lessons can be learned from the GDPR’s successes and failures? Two elements are immediately pertinent.
Tension between soft and hard approaches

First, the interactions between codes of conduct and the instrument they are embedded in merit discussion. The GDPR clearly situates codes as secondary instruments vis-à-vis the GDPR as the primary instrument. After all, codes are “intended to contribute to the proper application” of the GDPR (article 40.1 GDPR) and have “the purpose of specifying” its application (article 40.2 GDPR). To that end:
- codes may be “used as an element by which to demonstrate compliance” as found in:
- Article 24.3 GDPR (obligations of the controller)
- Article 28.5 GDPR (sufficient guarantees by processors to implement appropriate technical and organisational measures)
- Article 32.3 GDPR (level of security appropriate to the risk)
- “[c]ompliance with approved codes […] shall be taken into due account in assessing the impact of the processing operations” for a DPIA (article 35.8 GDPR)
- “[w]hen deciding whether to impose an administrative fine and deciding on the amount of the administrative fine […] due regard shall be given to” adherence to codes (article 83.2.j GDPR)
This tension between a soft and a hard approach also runs throughout the DSA and its codes. On the one hand, the DSA and the Commission go to great lengths to stress the voluntary, self-regulatory nature of codes under the DSA. See the mentions of codes being “voluntary” tools in Recital 98 and articles 45.1 and 46.1 DSA, or in point ‘h’ of the Preamble to the 2022 Strengthened Code of Practice on Disinformation (for more on this code, see below). On the other hand, the DSA takes a very clear top-down approach. Recital 104 notes that “[t]he refusal without proper explanations […] of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account” when determining whether there was an infringement of the DSA. Recital 103 DSA contains this tension within a single provision, speaking of codes’ voluntary nature and parties’ freedom to decide whether to participate – while also stressing the importance of cooperating in developing and adhering to specific codes. Such an intertwinement of soft and hard approaches “questions the extent to which a platform could abandon the commitments it has voluntarily made”.
Uptake and experimental temporality

Second, the reality of code development must be addressed. The GDPR provides a sobering view: almost five years after its entry into application, only two transnational codes have been approved, and not a single ‘code having general validity’ for data transfers to third countries (article 40.3 juncto 46.2.e GDPR) exists. While a number of codes have been adopted at the national level, the strict monitoring requirements laid down by the GDPR and the European Data Protection Board (EDPB) are cited as particular sore points, leading code developers to abandon the process halfway through or near its end.
In contrast, the DSA has not yet entered into application but (potential) codes already exist. The ‘Code of Practice on Disinformation’ was originally unveiled in 2018, and later updated to the ‘2022 Strengthened Code of Practice on Disinformation’. It explicitly states in point i of its preamble that it “aims to become a Code of Conduct under Article 35 [ed: article 45 in the final text] of the DSA, after entry into force”. The 2016 ‘Code of Conduct on Countering Illegal Hate Speech Online’ similarly treats topics relevant to the DSA, long before the DSA’s final text was approved. The European Commission reported at the end of 2022 that it “will discuss with the IT companies how to ensure that the implementation of the Code supports compliance with the DSA […]. This process may lead to a revision”.
This seems to imply, then, a similar trajectory whereby the Code may be revised and slotted into the DSA’s framework. At the time of writing, work is also underway on the ‘EU Code of conduct on age-appropriate design’. Although its drafting and monitoring process seems to follow a slightly different approach (due to the establishment of a specific expert group), the European Commission similarly mentions that the code “will build on the regulatory framework provided in the [DSA] and assist with its implementation”.
A recipe for success?

The availability of codes of conduct under the DSA therefore seems guaranteed, although some questions can be raised about the transparency of the process and the temporal logic of their development. These concerns go beyond theory. Ex-post assessments of the 2018 ‘Code of Practice on Disinformation’ recommended “a shift from the current flexible self-regulatory approach to a more co-regulatory one”, which was realised in 2022. Remarkably, however, stakeholders had already complained that the initial code was a “government-initiated ‘self-regulatory’ instrument” that did not genuinely engage with them. The 2016 ‘Code of Conduct on Countering Illegal Hate Speech Online’ was similarly reported to have been developed “at the behest of the European Commission under the threat of introducing statutory regulation”, with the ‘systematic exclusion’ of civil society groups.
The tension between a soft and a hard approach clearly manifests itself here, and the waters are further muddied by the unusual temporal approach whereby codes were developed before the DSA’s final text had even been approved. Since the DSA – and by extension its codes of conduct – deals with fundamental societal issues such as discrimination, social inequality, and disinformation, it is crucial to involve societal stakeholders properly. The stakes have never been higher.
Title photo: Sam Pak / unsplash