In the third part of our mini-series looking at the continuing rise of Consumer Litigation in the EU, we consider the impact of increased digital regulation and how it is adding to that rise.
The Digital Decade
In the face of emerging digital developments, the EU has positioned itself as a frontrunner in the governance and regulation of digital technologies. Indeed, the European Commission has declared its determination “to make this Europe’s Digital Decade”. In line with this ambition, EU legislators have enacted, or are proposing, several new laws which, in addition to their impact on EU businesses and consumers, will also have extra-territorial effect beyond the EU. As well as regulating the digital space which we now all occupy, these new laws will give consumers the right to litigate to hold businesses accountable for any breaches, so it is important for businesses to understand what their obligations will be and to ensure they are compliant.
What are the new digital EU regulations?
Legislation | Status of the legislative process | Expected or determined entry into force | Implementation by Member States
Artificial Intelligence Act | The European Parliament passed its version of the Act on 14 June 2023. This version was considered during the final trilogue, which took place from 6 to 8 December 2023. The Act is now undergoing lawyer-linguist revision. A ‘clean’ text is likely to be available just ahead of the European Parliament’s plenary vote to endorse the final AI Act, currently scheduled for the 10-11 April session. | Not yet known, but possibly by the end of the second quarter of 2024 and not earlier than May 2024. | The majority of the Act will apply in Member States two years after its entry into force, while specific provisions follow different timelines: the prohibitions will apply 6 months after entry into force, the requirements for General Purpose AI after one year, and some high-risk system requirements after 3 years. |
Digital Services Act | N/A as the DSA is in force. | The Act entered into force on 16 November 2022. | The Act will be directly applicable in Member States from 17 February 2024. |
Digital Markets Act | N/A as the DMA is in force. | The Act entered into force on 1 November 2022. | The rules under the Act have applied since 2 May 2023. Obligations for designated gatekeepers will apply as of March 2024. For more information see below. |
Omnibus Directive | N/A as the Directive is in force. | The Directive entered into force on 7 January 2022. | Several Member States failed to meet the transposition deadline and were sent a letter of formal notice. Most Member States have since implemented the Directive, with the exception of Slovakia and Austria, which are still in the process of doing so. |
Artificial Intelligence Act – The Artificial Intelligence Act (AI Act) was proposed by the European Commission in 2021. The AI Act introduces the world’s first regulatory and legal framework to govern artificial intelligence across different sectors and covers different types of AI. The adopted definition of ‘AI system’ under the newest draft of the Act is “a machine-based system designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments”. The Act sets up a risk-based structure, under which businesses establish what requirements and obligations come with the use of AI by identifying how potentially risky a given AI system is to users. The most recent draft of the Act defines ‘risk’ as “the combination of the probability of an occurrence of harm and the severity of that harm”. The latest trilogue between EU legislators produced an agreement on classifications for specific high-risk areas. Systems in high-risk sectors will be subject to more obligations, with the aim of regulating and monitoring AI employed in domains with a significant societal impact. Notable examples include education, employment, critical infrastructure, essential public and private services, law enforcement, and the administration of justice. Further agreements were reached regarding the prohibitions on AI systems, which include, but are not limited to, manipulative techniques, social scoring, predictive policing, exploitation of vulnerable groups, and emotion recognition (with narrow exceptions).
The Act will establish new procedural and quality rules for ‘providers’, those involved in the development or distribution of AI systems, and this includes companies that internally employ AI systems vis-à-vis their employees and consumers. Examples of the internal use of AI systems include AI contracting tools, chatbots, consumer profiling techniques, big-data analysis of consumers, the automation of transactions, the management of consumer complaints and predictive technologies. Moreover, the AI Act will be complemented by the European Commission’s proposal for an Artificial Intelligence Liability Directive, which seeks to introduce harmonised rules on non-contractual civil liability for damage caused by AI systems. For further information on the AI Act please see our most recent article here.
Digital Services Act and Digital Markets Act – The Digital Services Act (DSA) and Digital Markets Act (DMA) were proposed alongside each other by the European Commission. Both entered into force in November 2022, with the DSA rules applying from February 2024 and the DMA gatekeeper obligations from March 2024. The two Acts aim to protect the fundamental rights of users in digital spaces and to create a level playing field for businesses.
The DMA establishes criteria to identify “gatekeepers”, which are large digital platforms providing core platform services. Gatekeepers are expected to comply with the obligations and prohibitions listed in the DMA, which are intended to make markets in the digital sector fairer and more contestable. By reviewing user numbers, the European Commission designated six companies as gatekeepers: Alphabet, Amazon, Apple, Meta, Microsoft, and ByteDance. The Commission’s decision did not go unchallenged, as Amazon, Meta, ByteDance, and Apple contested their designation as gatekeepers under the DMA. At the same time, the Commission opened several market investigations to assess which of these companies’ core platform services qualify as gateways between businesses and consumers. So far, twenty-two core platform services have been designated as gateways. The DSA, for its part, aims to heighten safety online by making it easier to report illegal content, tackle cyberbullying, limit targeted advertising, contest moderation decisions, and simplify terms and conditions.
The new rules under the Act will apply to all online service providers, regardless of whether they are established in or outside of the EU. The obligations imposed on companies will be proportionate to their role, size, and impact online. For this reason, the DSA establishes four categories: very large online platforms and search engines (VLOPs and VLOSEs) with at least 45 million monthly active users, online platforms which bring together sellers and consumers, hosting services, and intermediary services. On 25 April 2023, the Commission designated the first VLOPs and VLOSEs, and made further designations in December 2023. Going forward, the Commission will work with national authorities to ensure that platforms comply, and will be primarily responsible for monitoring and enforcing the additional obligations imposed on VLOPs and VLOSEs.
Ultimately, the two Digital Acts will prompt shifts in the strategies of the digital spaces caught within their remits. These digital spaces will need to adjust their online offerings to be compliant, both in the consumer-facing part of their operations and in meeting the numerous new digital obligations imposed on them. For more information on these Acts please see here (DMA) and here (DSA).
Omnibus Directive – Finally, there is the Omnibus Directive (the ‘Directive’). Also known as the Enforcement and Modernisation Directive, this legislation aims to strengthen consumer protection by modernising existing EU legislation and upgrading enforcement measures in this arena. It does so by amending four significant EU consumer directives on price indications (98/6/EC), consumer rights (2011/83/EU), unfair contract terms (93/13/EEC), and unfair commercial practices (2005/29/EC). The Directive applies mainly to “traders”, which in practice refers to B2C stores in the EU, B2C e-commerce companies which target EU consumers, and B2C companies which provide free services to EU consumers. From a consumer litigation perspective, this will give rise to some important changes. Firstly, businesses operating online will face enhanced enforcement measures and increased transparency requirements, thereby strengthening consumer rights. Secondly, for the first time, consumer protection rights will apply to digital services provided in exchange for personal data rather than money.
The Directive required Member States to apply its provisions from 28 May 2022; however, transposition is still in progress in some Member States. The current status of its transposition can be visualised on our interactive map here.
What new consumer rights will arise?
Each of these new pieces of legislation grants consumers a number of additional rights with the aim of targeting unfair commercial practices.
As reported by the European Consumer Organisation (BEUC)[1], Members of the European Parliament voted to strengthen protections for consumers in the AI Act. As an outcome of the plenary vote, consumers have been granted numerous rights, including the right to be informed when being subjected to a decision from a high-risk (strictly regulated) AI system, the right to submit complaints to an authority regarding an AI system, and the right to bring a supervisory authority to court if it fails to take action on the complaint. Consumers are also given the right to invoke collective redress in the case of an AI system which has caused harm to a group of consumers. This has taken on increased relevance following the implementation of the new collective redress mechanism in the Representative Actions Directive (RAD) (see our tool tracking the implementation of this legislation here). As discussed in our first article in this series, the RAD is making the process of consumer class actions more accessible and harmonized across the European Union.
What’s more, the RAD is closely linked with the Omnibus Directive, as they address the same challenges in targeting widespread unfair commercial practices that are eroding trust in the EU Single Market. Hence, the Omnibus Directive amends Directive 2005/29/EC to provide the right to redress for consumers who suffered damage from unfair commercial practices[2].
The DMA establishes new rights for consumers in addition to rights to bring collective actions. It specifically envisages class actions in Article 42, referring to the RAD, and states that it “shall apply to the representative actions brought against infringements by gatekeepers of provisions of this Regulation that harm or may harm the collective interests of consumers”. Likewise, the DSA grants consumers new rights, including rights to complain to the platform, seek out-of-court settlements, complain to their relevant national authority in their own language, or seek compensation for breaches of the rules. The DSA also establishes collective action mechanisms, allowing representative organisations to defend consumers’ rights in cases of large-scale breaches of the law. In addition, it reduces the costs of legal proceedings for claimants, both for litigation and for alternative dispute resolution. The DSA specifies that claimants are entitled to select an out-of-court dispute settlement body to resolve disputes that could not be resolved via the platform’s internal complaint system, and they may additionally be reimbursed for any costs incurred in this procedure[3]. Consumers are given the option of choosing between the platform’s internal complaint mechanism, arbitration proceedings, or judicial proceedings, and the fees, as well as being proportionate and assessed on a case-by-case basis, are to be reasonable, accessible, attractive, and inexpensive for consumers[4].
Across the board, new and proposed digital regulation in the EU will introduce new rights for consumers and will complement the new collective redress legislation. This in turn will increase the risk of claims against businesses, not only in the tech sector but also in other industries which implement digital services with a consumer interface. In the current digital era, most companies worldwide will be required to ensure and demonstrate conformity with digital regulation, and this increase in regulation will lead to a corresponding increase in litigation seeking to hold businesses accountable for any breaches of it.
With thanks to Giorgia Loredan for her help in putting this article together.
TO READ PART 1 OF THIS SERIES ON THE GROWTH IN CONSUMER CLASS ACTIONS IN THE EU CLICK HERE
TO READ PART 2 OF THIS SERIES ON THE RISE OF ESG CLAIMS IN THE EU CLICK HERE
For further information, please contact:
Evelyn Tjon-En-Fa, Partner, Bird & Bird
evelyn.tjon-en-fa@twobirds.com
[1] BEUC, Press release on 14 June 2023
[2] Omnibus Directive, Article 3(5)
[3] DSA, 2022, Article 18
[4] DSA, 2022, Preamble, Article 59