Summary
A UK tribunal has overturned a fine imposed on Clearview AI, the provider of a facial image database service, on the basis that the (UK) GDPR did not apply to the company’s processing of personal data. In doing so, the tribunal has provided helpful judicial interpretation of both the territorial and material scope of UK data protection law.
The key takeaways were:
- A controller or processor may be caught by the extra-territorial scope of the UK GDPR on the basis that its processing activities relate to the monitoring of the behaviour of data subjects in the UK, even where that entity does not itself monitor data subjects but its activities enable its customers to conduct such monitoring.
- A reminder that processing activities that are carried out for, or connected to, law enforcement purposes – for example, where a company provides its services solely to law enforcement agencies – fall outside the scope of the UK GDPR. If those law enforcement agencies are in the UK, the processing will instead be subject to the parallel law enforcement processing regime under the Data Protection Act 2018. However, if the law enforcement agencies are outside the UK (as Clearview AI’s customers were), then UK data protection law is not engaged at all.
Background
On 18 May 2022, the Information Commissioner’s Office (ICO) brought twin-track enforcement action against Clearview AI in the form of: (1) an Enforcement Notice; and (2) a Monetary Penalty Notice (i.e., a fine) in the amount of GBP 7.5 million.
The ICO had concluded that Clearview AI:
- was a controller of personal data under the GDPR as applied in the EU, and under the UK GDPR and Data Protection Act 2018 with respect to the UK data protection framework; and
- was or had been processing personal data of UK residents within the scope of the GDPR (in respect of processing taking place up to the end of the Brexit transition period at 23:00 on 31 December 2020) and the UK GDPR (in respect of subsequent processing).
The ICO concluded that Clearview AI had infringed a whole raft of provisions of the GDPR and the UK GDPR, in respect of the requirements:
- represented by the core data protection principles under Article 5;
- as to lawfulness of processing under Article 6;
- around the processing of special category data under Article 9;
- represented by the transparency obligations under Article 14;
- represented by the data subject rights of access (under Article 15), to rectification (under Article 16), to erasure (under Article 17), and to object (under Article 21);
- as to automated decision making under Article 22; and
- to undertake a data protection impact assessment under Article 35.
Preliminary Issue
Clearview AI appealed the enforcement notice and the monetary penalty to the First-tier Tribunal (Information Rights). The matters before the Tribunal did not relate to whether Clearview AI had infringed the GDPR or UK GDPR; the issue under consideration related solely to the jurisdictional challenge brought by Clearview AI.
It primarily considered three questions:
- whether, as a matter of law, Article 3(2)(b) can apply where the monitoring of behaviour is carried out by a third party rather than the data controller;
- whether, as a matter of fact, the processing of data by Clearview AI was related to monitoring by either Clearview AI itself or by its customers; and
- whether the processing by Clearview AI was beyond the material scope of the GDPR by operation of Article 2(2)(a) GDPR, and/or was not relevant processing for the purposes of Article 3 of the UK GDPR, thereby removing the processing from the material scope of the UK GDPR.
Clearview AI argued that the data processing undertaken by it in the context of the services was outside the territorial scope of the GDPR and UK GDPR, with the consequence that the ICO had no jurisdiction to issue the notices.
Clearview AI services
Clearview AI provides law enforcement agencies with access to a database of facial images scraped from public sources. It uses biometric processing to match a customer’s image with an image and identity in its database. Its services were broadly delivered through the following phases of activity:
Activity 1
- It copied and scraped facial images from photographs that it found across the public internet and stored them in a series of connected databases and sub-databases, linked to the source of the image, the date and time it was made, and information about associated social media profiles.
- That material was then used to create a set of vectors for each facial image, using the Clearview AI machine learning facial recognition algorithm.
- The facial vectors were then stored in a neural-network database, clustered together according to facial similarity (a simplified, purely illustrative sketch of this phase follows below).
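The judgment describes this indexing phase only at a high level. Purely by way of illustration of the kind of pipeline described (scrape an image, derive a face vector, store it alongside its source metadata), the following minimal Python sketch shows one way such a step could look. The embedding function, dimensions and field names are hypothetical assumptions, not Clearview AI’s actual system, and the clustering of vectors by similarity is omitted for brevity.

```python
import numpy as np

# Hypothetical stand-in for Activity 1: derive a vector per scraped face image
# and store it alongside its source metadata. This is NOT Clearview AI's
# implementation; the embedding, dimensions and field names are illustrative.
EMBEDDING_DIM = 128

def embed_face(image_pixels: np.ndarray) -> np.ndarray:
    """Map a flattened image to a unit-length face vector.

    A fixed-seed random projection stands in for the machine learning
    facial recognition model described in the judgment.
    """
    rng = np.random.default_rng(seed=0)
    projection = rng.standard_normal((image_pixels.size, EMBEDDING_DIM))
    vector = image_pixels @ projection
    return vector / np.linalg.norm(vector)

# Simplified "database": each record keeps the face vector plus the metadata
# the Tribunal mentions (image source, date/time, associated social profile).
face_index: list[dict] = []

def index_scraped_image(image_pixels: np.ndarray, source_url: str,
                        captured_at: str, social_profile: str | None) -> None:
    face_index.append({
        "vector": embed_face(image_pixels),
        "source_url": source_url,
        "captured_at": captured_at,
        "social_profile": social_profile,
    })
```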
Activity 2
- If a customer wished to use the service, they would upload the facial image being searched for to the Clearview AI system. Vectors would be created of the facial features of that image, which would then be compared to the facial vectors of the stored images on the database using the Clearview AI machine learning facial recognition algorithm. Up to 120 matching facial images would be returned, along with an assessment of the degree of similarity. The service was found to achieve over 99% accuracy. (A simplified sketch of this matching step follows below.)
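Continuing the hypothetical sketch above, the matching step can be pictured as a nearest-neighbour search over the stored vectors, returning up to 120 results ranked by a similarity score. Again, this is an illustrative assumption rather than Clearview AI’s actual algorithm.

```python
# Continuing the sketch above: Activity 2 as a brute-force nearest-neighbour
# search over the stored vectors, returning up to 120 ranked matches.
def search_face(query_pixels: np.ndarray, max_results: int = 120) -> list[dict]:
    """Return up to `max_results` indexed records ranked by similarity to the query."""
    query_vector = embed_face(query_pixels)
    scored = [
        {**record, "similarity": float(record["vector"] @ query_vector)}
        for record in face_index
    ]
    scored.sort(key=lambda r: r["similarity"], reverse=True)
    return scored[:max_results]

# Example usage with random pixel data standing in for real photographs.
rng = np.random.default_rng(seed=1)
index_scraped_image(rng.random(64 * 64), "https://example.org/photo",
                    "2019-06-01T12:00:00Z", None)
matches = search_face(rng.random(64 * 64))
print(matches[0]["source_url"], round(matches[0]["similarity"], 3))
```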
Further Activity
- The returned images would allow a customer to then view additional information not derived from Clearview AI (such as by visiting the source page from which the image was scraped) as to:
- the person’s name;
- the person’s relationship status, whether they have a partner and who that may be;
- whether the person is a parent;
- the person’s associates;
- the place the photo was taken;
- where the person is based/lives/is currently located;
- what social media is used by the person;
- whether the person smokes/drinks alcohol;
- the person’s occupation or pastime(s);
- whether the person can drive a car;
- what the person is carrying/doing and whether that is legal; and/or
- whether the person has been arrested.
The Tribunal considered that it was reasonably likely that the database would contain the images of UK residents and/or images taken within the UK of persons resident elsewhere. It was therefore found that the Clearview AI service could have an impact on UK residents, irrespective of whether it was used by UK customers.
The Clearview AI service was used by customers for commercial purposes prior to 2020; it is not currently used by customers in the UK or in the EU at all. Its customers are in the United States, as well as in other countries globally (including Panama, Brazil, Mexico, and the Dominican Republic). It was acknowledged that investigators in one country may be interested in behaviour happening in another country, given that criminal activity is not limited by national boundaries.
Clearview AI had offered its service on a trial basis to law enforcement and government organisations within the UK between June 2019 and March 2020. An overseas law enforcement agency could use the service as part of an investigation into the alleged criminal activity of a UK resident.
Conclusions of the Tribunal
The Tribunal considered that Clearview AI was the sole controller responsible for Activity 1 (as described above) and that Clearview AI was joint controller with its customers for Activity 2. The Further Activity was then processing for which Clearview AI was not a controller at all.
The ICO submitted that the Clearview AI service was being used to monitor the behaviour of the data subjects. The Tribunal concluded that Clearview AI did not monitor behaviour itself but that its customers used the service to monitor the behaviour of data subjects. Consequently, for the purposes of Article 3(2)(b), Clearview AI’s services were related to the monitoring of the behaviour of data subjects. Clearview AI’s status as a joint controller with its customer for the purposes of Activity 2 may have been a significant factor in establishing a sufficiently close nexus between Clearview AI, as the ‘service provider’, and its customer, as the entity actually conducting the behavioural monitoring.
However, whilst the processing activities may in theory have been within the territorial scope of the (UK) GDPR, what was decisive was that they fell outside of its material scope. The Tribunal accepted that Clearview AI offered its services exclusively to non-UK/EU law enforcement and national security agencies, and their contractors, in support of the discharge of their respective criminal law enforcement and national security functions. Such activities fall outside of the scope of the GDPR and the UK GDPR. Whilst the UK has data protection regimes under the Data Protection Act 2018 that apply to both law enforcement and intelligence agencies, those regimes apply only to processing relating to UK law enforcement or intelligence agencies.