Background
In mid-2023, the Court of Justice of the European Union (“CJEU”) delivered a landmark judgment in Meta Platforms Inc. v. Bundeskartellamt[1], imposing strict restrictions on social media entities’ use of consumers’ personal data to target them with personalised advertisements on their platforms. The ruling struck at the core revenue model of many big technology companies.
The genesis of this decision lies in an order of the German antitrust watchdog, the Federal Cartel Office (“FCO”), which prohibited Meta Platforms Inc. (“Meta”) from processing users’ non-Facebook data without their informed consent. The FCO held that such data could not be collected or linked to users’ Facebook accounts without their consent, and clarified that consent was not valid if it was a pre-requisite for using the platform’s services.
The FCO opined that Meta had abused its dominant market position and that the manner in which it processed personal data was inconsistent with the intrinsic principles of the General Data Protection Regulation (“GDPR”). Meta appealed against the FCO’s decision before the Higher Regional Court of Düsseldorf, Germany, which stayed the proceedings and referred the case to the CJEU.
Impermissible processing of personal data
Article 9(1) of the GDPR provides that “Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.”
When Meta users visit websites or apps and use the ‘Like’ or ‘Share’ buttons, or enter information into such websites or apps, they may provide information falling within one or more of the special categories of personal data mentioned in Article 9(1). Meta collects such data through its cookies and integrated interfaces and links it with the user’s social media account. Where the websites or apps visited relate to one or more of the categories referred to in Article 9(1), such collection and linkage of user data by Meta is, in principle, prohibited under Article 9(1).
There are several exceptions to Article 9(1), one of which applies where the personal data has manifestly been made public by the data subject, i.e. the user.[2] The court held that data shall be deemed to have been made public only if the user intended, explicitly and by a clear affirmative action, to make it public. It cannot be assumed that a user visiting an app or website through the interface of a platform intended the personal data in question to be collected by an online platform through cookies or similar storage technologies. Data can be considered made public under Article 9(2)(e) of the GDPR only if the user identifies itself with that data on the basis of an explicit choice, through an individual setting that is voluntarily selected, with full knowledge of the facts, and with the intent to make the data accessible to the public at large.
Concise contours of the “Necessity” condition
Article 6(1)(b) provides that “processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract”. The court held that processing is ‘necessary for the performance of a contract’ only if it is objectively indispensable for a purpose that is integral to the contractual service intended for the data subject. A controller must be able to show how the purpose and goals of the contract cannot be achieved if the processing does not take place. The processing of personal data must be essential for the purpose of the contract, not merely useful.
The CJEU accepted that personalised advertising may be useful to consumers, and that some consumers would want such a service. However, personalised advertising is not necessary for a social media platform to offer its services adequately and satisfactorily.
Meta argued that processing of personal data of users is necessary to improve the quality of other services and platforms offered by Meta. The court pointed out that products and services offered by Meta are used independently and separately. Each distinct service has a separate user agreement. Accordingly, processing of personal data from other services is not necessary for the operation of its social media services.
“Legitimate Interests” – Clearing the air or more convolution?
Article 6(1)(f) lays down that “processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child”.
The CJEU has held that processing based on a legitimate interest is acceptable only if that interest cannot reasonably be achieved just as effectively by less restrictive means.[3] Platforms have cited personalised advertising, network security, product improvement and the sharing of information with law enforcement agencies as such legitimate interests.
Although the services of platforms like Facebook are free of charge, users cannot reasonably expect that their personal data will be processed for personalised advertising without their consent. The interests and rights of the user override the platform’s interest in the personalised advertising model by which it finances its activity. The court therefore ruled that such processing by platforms for personalised advertising cannot be justified under Article 6(1)(f) of the GDPR.
The court held that network security could be a legitimate interest for processing personal data. However, this would depend on the extent to which the processing of non-Facebook personal data is necessary to maintain network security and integrity, whether less restrictive options exist, and whether the data minimisation principle[4] underlying the GDPR is being followed.
Product improvement may also be a legitimate interest for processing personal data, but it would have to be balanced against the extent of the user’s personal data being processed and the possible impact on them. The CJEU was clear that sharing information with law enforcement agencies cannot be considered a legitimate interest of the platform, as it is unrelated to its commercial activities.
Lawful consent to processing of personal data
The court opined that the dominant position of a social media platform like Meta does not, by itself, preclude valid user consent. However, the court also acknowledged that consent is not a binary framework and can be influenced by various factors. A dominant controller stands on an unequal footing with the data subject, giving it the power to impose onerous contractual conditions that favour it but are not necessary for the performance of the contract.
The CJEU also laid down that a user must have full discretion to withhold consent to data processing activities that are not essential for the performance of the contract. Users must be offered an equivalent alternative that is not accompanied by such data processing operations. Users should be able to give separate consent to the processing of their data on the platform and of their data off the platform; in Meta’s context, consent to the processing of non-Facebook data cannot be presumed.
Impact
This decision may be a cause of concern for technology companies globally that rely on personal data-based advertising as a principal source of revenue. On July 17, 2023, the Norwegian Data Protection Authority (“NDPA”) became the first government authority to swing into action on the basis of this judgment.
The NDPA banned Meta from running behavioural advertising on Facebook and Instagram in Norway unless it obtained users’ consent for such processing; Meta was, however, allowed to run other forms of online advertising that did not track and profile users. In November 2023, the European Data Protection Board extended the Norwegian ban on behavioural advertising on Facebook and Instagram to cover all 30 countries of the European Economic Area.[5]
As a result, Meta has now launched its first ad-free option for Facebook and Instagram in the EU. Users aged 18 or above can choose between using Meta’s services for free while being subject to targeted ads based on their personal data, or paying around EUR 10 a month for ad-free access.[6] Meta has also paused targeted advertisements for users below the age of 18.[7]
Interestingly, India’s new Digital Personal Data Protection Act, 2023 does not restrict targeted advertising aimed at adults. It only bars Data Fiduciaries (“DFs”) from undertaking tracking, behavioural monitoring or targeted advertising directed at children.[8] Even this is not an absolute bar: the Central Government may grant exemptions for certain classes of DFs and for entities that have demonstrated to the government that they process children’s personal data in a verifiably safe manner.
It remains to be seen, however, how the prospective Data Protection Board will assess such situations when they arise. Moreover, with proposed legislation such as the Digital India Bill and the Digital Competition Bill possibly on the anvil, Indian policymakers may sooner or later have to take a stance on how they intend to treat targeted advertising.
[1] Case C-252/21, ECLI:EU:C:2023:537.
[2] Article 9(2)(e), GDPR.
[3] B v. Latvijas Republikas Saeima, C‑439/19, ECLI:EU:C:2021:504.
[4] Article 5(1)(c), GDPR.
[5] “Facebook owner Meta faces EU ban on targeted advertising”, Reuters.
[6] “Facebook and Instagram get ad-free subscription service in Europe”, CNBC.
[7] “Meta to limit ads targeting teens on Facebook, Instagram”, The Economic Times.
[8] Section 9(3) of the Digital Personal Data Protection Act, 2023.