The UK Information Commissioner has fined Clearview AI £7.5m and ordered it to stop processing data about UK residents. This raises important questions about both the theoretical jurisdiction of the UK GDPR and the practical limits on extra-territorial enforcement. We consider what might happen next.
The Clearview AI tool
Clearview AI operates a facial recognition tool. A customer can submit an image that Clearview AI then turns into a set of facial vectors. Those facial vectors are then used to search Clearview AI’s database, which is said to extend to some 20 billion images scraped from the public internet.
The tool returns potential matches alongside metadata and URLs associated with the original images. The intention is that this additional information can then be used to identify the individual in the submitted image.
The primary purpose appears to be law enforcement. In the UK, at least five law enforcement authorities trialled the system, conducting 721 searches in an attempt to identify individuals.
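As a rough illustration only (Clearview AI's actual implementation is not public), a facial recognition search of this kind can be thought of as nearest-neighbour matching over facial vectors: the submitted image is reduced to a vector, which is compared against the stored vectors, and any records above a similarity threshold are returned with their metadata and source URL. All names, vectors and URLs below are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vector, database, threshold=0.9):
    """Return records whose facial vectors are close to the query,
    together with their stored metadata and source URL."""
    matches = []
    for record in database:
        score = cosine_similarity(query_vector, record["vector"])
        if score >= threshold:
            matches.append({"url": record["url"],
                            "metadata": record["metadata"],
                            "score": score})
    return sorted(matches, key=lambda m: m["score"], reverse=True)

# Hypothetical database of scraped images reduced to facial vectors.
database = [
    {"vector": [0.9, 0.1, 0.4], "url": "https://example.com/a.jpg",
     "metadata": {"caption": "conference photo"}},
    {"vector": [0.1, 0.9, 0.2], "url": "https://example.com/b.jpg",
     "metadata": {"caption": "holiday snap"}},
]

results = search([0.88, 0.12, 0.41], database)
```

In a real system the vectors would come from a trained face-embedding model and the search would use an approximate nearest-neighbour index rather than a linear scan, but the shape of the output (matches plus metadata and URLs) is the same.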
However, this technology is apparently also used for other purposes. For example, Clearview AI has offered its services to the Government of Ukraine in connection with the Russian invasion, to identify combatants and the deceased on both sides of the conflict.
Extra-territorial application of the UK GDPR
Clearview AI does not have any presence in the UK and, save for the trial described above, does not offer its services to customers in the UK.
This means that the only basis upon which it could be subject to the UK GDPR/GDPR is that it monitors the behaviour of individuals in the UK (Art 3(2)(b)). The Information Commissioner concluded that Clearview AI was caught on this basis, primarily because:
- When a customer tries to match the photo of an individual with the faces in Clearview AI’s database it is “monitoring of the behaviour” of that individual. In particular, by reviewing the outputs of that search, the customer is seeking to “ascertain information about a particular individual’s behaviour…over a period of time”.
- While Clearview AI is not directly monitoring the individual’s behaviour, its processing is related to the monitoring carried out by the customer. That monitoring could not take place without the Clearview AI service.
- Clearview AI refused to disclose how many of the roughly 20 billion images in its database relate to UK residents. However, given that it takes no steps to exclude UK residents from its database, the Information Commissioner concluded it was “inevitable” that a large number of images would be from the UK.
Perhaps recognising this is not a conventional interpretation of the monitoring test, the Information Commissioner notes that any other construction would mean data-scraping companies outside the UK would escape the UK GDPR/GDPR; “such a construction is inconsistent with the purposes of the GDPR and UK GDPR, in particular their purpose of providing a high degree of protection to data subjects”.
The conclusions on breach were more robust. Clearview AI did not even try to argue it complied with the UK GDPR/GDPR. The Information Commissioner suggests this is because it “would be hopeless” to do so, finding breaches of:
- Fair and lawful (Art 5(1)(a)): Screen scraping is not fair and lawful, as individuals would not be aware that this processing had taken place. The images may have been taken from the public internet but that does not mean they can be freely used. In particular, they may not have been uploaded by the individual so there can be no assumption the individual is content for them to be made public. The images will also remain in Clearview AI’s database even if they are removed from the internet.
- Retention (Art 5(1)(e)): Clearview AI did not have a retention policy and did not appear to have any process to delete old data.
- Legal basis and special category personal data (Art 6 and 9(1)): Facial biometrics count as “special category data”, so can only be processed if a condition in Art 9 is satisfied. Clearview AI did not, apparently, even try to argue such a condition is met. There is also no legal basis for this processing under Art 6.
- Transparency (Art 14): The individuals whose images were obtained were not informed of this processing.
- Rights (Art 15-22): To have an image removed, individuals had to submit a photograph, which the Information Commissioner considered was a “significant fetter” on that right. In any event, Clearview AI had ceased to delete images on request.
- DPIA (Art 35): No data protection impact assessment had been carried out.
Given that the primary purpose of the Clearview AI database is the prevention and detection of crime, it is interesting that Clearview AI did not even attempt to counter these findings. That purpose could potentially provide a legal basis, a justification for processing special category data and an exemption from a number of data subject rights.
However, Clearview AI may well have considered it was still hopeless to argue that it complied with the UK GDPR/GDPR, given the totality of the infringements and the fact that its service was being used for other purposes (such as in the war in Ukraine).
Theoretical enforcement – A £7.5m fine and order to stop processing
Based on the statutory factors in Article 83(2), the Information Commissioner issued a Monetary Penalty Notice fining Clearview AI £7,552,800.
This is surprisingly low. Given the significant breaches described above and the very large-scale processing of data about UK residents, the processing seems to warrant a larger fine. Indeed, the Information Commissioner originally proposed a fine of £17.5m. No reason is given for the significant discount in the final fine issued to Clearview AI.
The Information Commissioner also issued an Enforcement Notice ordering Clearview AI to stop scraping data about UK residents and to delete any information about UK residents from its database. Literal compliance with this Enforcement Notice is technically impossible – how do you know if someone is a UK resident or not? However, in US proceedings (Mutnick v Clearview AI) Clearview AI has already committed to:
- Block all photos in the database that were geolocated in Illinois from being searched.
- Construct a “geofence” around Illinois.
- Not collect facial vectors from images that contain metadata associated with Illinois.
- Not collect facial vectors from images stored on servers with Illinois IP addresses, or from websites with URLs containing keywords such as “Chicago” or “Illinois”.
The Information Commissioner considers that adopting comparable steps in relation to UK residents would be sufficient for “practical compliance” with the Enforcement Notice.
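To make the Illinois-style commitments concrete, a minimal sketch of such an exclusion filter might look like the following. This is not Clearview AI's actual mechanism; the keyword list, metadata field names and matching logic are all assumptions for illustration.

```python
# Hypothetical exclusion filter mirroring the Illinois-style commitments,
# adapted to the UK: skip images whose source URL contains UK-related
# keywords, or whose location metadata points to the UK.
UK_KEYWORDS = {"london", "manchester", "united-kingdom", "britain"}

def should_exclude(url, metadata):
    """Return True if an image should be excluded from vectorisation.

    `metadata` is a dict; the "gps_country" field is an assumed name
    for geolocation data embedded in the image.
    """
    lowered = url.lower()
    if any(keyword in lowered for keyword in UK_KEYWORDS):
        return True
    country = (metadata.get("gps_country") or "").lower()
    return country in {"gb", "uk", "united kingdom"}
```

Keyword and metadata filters of this kind are inevitably approximate, which is presumably why the Information Commissioner frames them as "practical compliance" rather than literal compliance: images of UK residents with no UK-linked URL or metadata would slip through.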
It is not clear how Clearview AI will respond. The early signs are not promising: they suggest it disputes the Information Commissioner’s jurisdiction and will simply ignore the Monetary Penalty Notice and the Enforcement Notice.
All should become clear by 17 June. If Clearview AI has neither paid the fine nor lodged an appeal by that date, the next step for the Information Commissioner would be to seek an order from the High Court for payment of the fine. Failure to comply with that order would then be contempt of court but, given Clearview AI has no presence in the UK, even that might not be sufficient to ensure compliance.
Perhaps signalling an awareness of this risk and a shift towards a more pragmatic approach to enforcement, during an interview with “Politico”, John Edwards (the Information Commissioner) said he had offered to waive Clearview AI’s fine completely if it agreed to delete the data and not offer its services in the UK. However, Clearview AI declined.
A copy of the Monetary Penalty Notice and Enforcement Notice are here.
For further information, please contact:
Richard Cumbley, Partner, Linklaters