England and Wales Court of Appeal (Civil Division) Decisions
Bridges, R (On the Application Of) v South Wales Police [2020] EWCA Civ 1058 (11 August 2020)
URL: http://www.bailii.org/ew/cases/EWCA/Civ/2020/1058.html
Cite as: [2020] HRLR 16, [2020] 1 WLR 5037, [2020] EWCA Civ 1058, [2021] 2 All ER 1121, [2021] 1 Cr App R 4
ON APPEAL FROM THE HIGH COURT OF JUSTICE
QUEEN'S BENCH DIVISION (ADMINISTRATIVE COURT)
CARDIFF DISTRICT REGISTRY
Haddon-Cave LJ and Swift J
Strand, London, WC2A 2LL
B e f o r e :
THE MASTER OF THE ROLLS
THE PRESIDENT OF THE QUEEN'S BENCH DIVISION
and
LORD JUSTICE SINGH
____________________
R (on the application of Edward BRIDGES)
Appellant/Claimant
- and -
THE CHIEF CONSTABLE OF SOUTH WALES POLICE
Respondent/Defendant
- and -
THE SECRETARY OF STATE FOR THE HOME DEPARTMENT
Interested Party
- and -
(1) THE INFORMATION COMMISSIONER
(2) THE SURVEILLANCE CAMERA COMMISSIONER
(3) THE POLICE AND CRIME COMMISSIONER FOR SOUTH WALES
Interveners
____________________
Jason Beer QC and Francesca Whitelaw (instructed by Special Legal Casework, South Wales Police) for the Respondent
Richard O'Brien and Thomas Yarrow (instructed by the Government Legal Department) for the Interested Party
Gerry Facenna QC and Eric Metcalfe (instructed by the Information Commissioner's Office) for the First Intervener
Andrew Sharland QC and Stephen Kosmin (instructed by the Government Legal Department) for the Second Intervener
Fiona Barton QC (instructed by South Wales and Gwent Police Joint Legal Services) made written submissions for the Third Intervener
Hearing dates: 23-25 June 2020
____________________
Crown Copyright ©
COVID-19 PROTOCOL: This judgment was handed down remotely by circulation to the parties' representatives by email, and by release to the British and Irish Legal Information Institute (BAILII) for publication. Copies were also made available to the Judicial Office for dissemination. The date and time of handing down shall be deemed for all purposes to be 10:00am on Tuesday 11 August 2020
Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Lord Justice Singh :
The Parties
AFR and its deployment by SWP
AFR Technology
(1) Compiling/using an existing database of images. AFR requires a database of existing facial images (referred to in this case as "a watchlist") against which to compare facial images and the biometrics contained in them. In order for such images to be used for AFR, they are processed so that the "facial features" associated with their subjects are extracted and expressed as numerical values.
(2) Facial image acquisition. A CCTV camera takes digital pictures of facial images in real time. This case is concerned with the situation where a moving image is captured when a person passes into the camera's field of view, using a live feed.
(3) Face detection. Once a CCTV camera used in a live context captures footage, the software (a) detects human faces and then (b) isolates individual faces.
(4) Feature extraction. Taking the faces identified and isolated through "face detection", the software automatically extracts unique facial features from the image of each face, the resulting biometric template being unique to that image.
(5) Face comparison. The AFR software compares the extracted facial features with those contained in the facial images held on the watchlist.
(6) Matching. When facial features from two images are compared, the AFR software generates a "similarity score". This is a numerical value indicating the likelihood that the faces match, with a higher number indicating a greater likelihood of a positive match between the two faces. A threshold value is fixed to determine when the software will indicate that a match has occurred. Fixing this value too low or too high can, respectively, create risks of a high "false alarm rate" (i.e. the percentage of incorrect matches identified by the software) or a high "false reject rate" (i.e. the percentage of true matches that are not in fact matched by the software). The threshold value is generally suggested by the manufacturer, and depends on the intended use of the AFR system. Most AFR systems, however, allow the end user to change the threshold value to whatever they choose.
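The matching stage at step (6) can be illustrated with a short, purely illustrative sketch. The feature vectors, threshold value and function names below are invented for illustration only; they do not reflect the NeoFace Watch software at issue in this case, which is proprietary. The sketch shows the trade-off the judgment describes: raising the threshold lowers the false alarm rate at the cost of a higher false reject rate, and vice versa.

```python
import math


def similarity_score(features_a, features_b):
    """Cosine similarity between two biometric feature vectors.

    A higher value indicates a greater likelihood that the two faces match,
    mirroring the "similarity score" described at step (6).
    """
    dot = sum(a * b for a, b in zip(features_a, features_b))
    norm_a = math.sqrt(sum(a * a for a in features_a))
    norm_b = math.sqrt(sum(b * b for b in features_b))
    return dot / (norm_a * norm_b)


def match_against_watchlist(candidate, watchlist, threshold=0.85):
    """Return every watchlist entry whose similarity score meets the threshold.

    Fixing the threshold too low risks a high false alarm rate (incorrect
    matches reported); fixing it too high risks a high false reject rate
    (true matches missed).
    """
    return [
        (name, score)
        for name, features in watchlist.items()
        if (score := similarity_score(candidate, features)) >= threshold
    ]


# Toy data: three-dimensional vectors stand in for real biometric templates.
watchlist = {
    "subject-A": [0.9, 0.1, 0.3],
    "subject-B": [0.1, 0.8, 0.5],
}
candidate = [0.88, 0.12, 0.31]

print(match_against_watchlist(candidate, watchlist, threshold=0.85))
```

With these toy values the candidate scores very close to subject-A and well below the threshold against subject-B, so only subject-A would generate an alert for a human operator to review.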
SWP's use of AFR
Data retention
(1) CCTV feed to AFR Locate deployments: retained for 31 days with automatic deletion as part of the "Milestone" software.
(2) Facial images that are not matched against: immediately deleted.
(3) Biometric template (regardless whether match made): immediately deleted.
(4) Facial images alerted against: images either deleted immediately following the deployment, or, at the latest, within 24 hours following the deployment.
(5) Match report to include personal information (name of individual alerted against): retained for 31 days.
(6) Watchlist images and related biometric template: deleted immediately following the deployment, or at the latest within 24 hours following the deployment.
Public awareness of when AFR Locate is used
Biometric Data
Oversight and Advisory Board
The specific incidents giving rise to these proceedings
21 December 2017 Deployment
27 March 2018 deployment
Relevant legal framework
The proceedings
The Divisional Court's judgment
Article 8 of the Convention
Data Protection claims
"For the moment, we confine ourselves to the above observations. Given the role of the Information Commissioner and the prospect of further guidance, we do not think it is necessary or desirable for this Court to interfere at the present juncture and decide whether the SWP's current November 2018 Policy Document meets the requirements of section 42(2) of the DPA 2018. In our view, the development and specific content of that document is, for now, better left for reconsideration by the SWP in the light of further guidance from the Information Commissioner."
The PSED claim
The appeal
Ground 1: The Divisional Court erred in concluding that the interference with the Appellant's rights under Article 8(1) of the Convention, taken with section 6 of the HRA 1998, occasioned by SWP's use of AFR on 21 December 2017 and 27 March 2018 and on an ongoing basis, was/is in accordance with the law for the purposes of Article 8(2).
Ground 2: The Divisional Court made an error of law in assessing whether SWP's use of AFR at the December 2017 and March 2018 deployments constituted a proportionate interference with Article 8 rights within Article 8(2). The Divisional Court failed to consider the cumulative interference with the Article 8 rights of all those whose facial biometrics were captured as part of those deployments.
Ground 3: The Divisional Court was wrong to hold that SWP's DPIA complied with the requirements of section 64 of the DPA 2018. The DPIA is based on two material errors of law concerning the (non)engagement of the rights in Article 8 of the Convention and the processing of the (biometric) personal data of persons whose facial biometrics are captured by AFR but who are not on police watchlists used for AFR.
Ground 4: The Divisional Court erred in declining to reach a conclusion as to whether SWP has in place an "appropriate policy document" within the meaning of section 42 of the DPA 2018 (taken with section 35(5) of the DPA 2018), which complies with the requirements of that section. Having in place such a document is a condition precedent for compliance with the first data protection principle (lawful and fair processing) contained in section 35 of the DPA 2018 where the processing of personal data constitutes "sensitive processing" within the meaning of section 35(8) of the DPA.
Ground 5: The Divisional Court was wrong to hold that SWP complied with the PSED in circumstances in which SWP's Equality Impact Assessment was obviously inadequate and was based on an error of law (failing to recognise the risk of indirect discrimination) and SWP's subsequent approach to assessing possible indirect discrimination arising from the use of AFR is flawed. It is argued that the Divisional Court failed in its reasoning to appreciate that the PSED is a continuing duty.
Discussion
Ground 1: Sufficient Legal Framework
"The general principles applicable to the 'in accordance with the law' standard are well-established: see generally per Lord Sumption in Catt, above, [11]-[14]; and in Re Gallagher [2019] 2 WLR 509 at [16] – [31]. In summary, the following points apply.
(1) The measure in question (a) must have 'some basis in domestic law' and (b) must be 'compatible with the rule of law', which means that it should comply with the twin requirements of 'accessibility' and 'foreseeability' (Sunday Times v United Kingdom (1979) 2 EHRR 245; Silver v United Kingdom (1983) 5 EHRR 347; and Malone v United Kingdom (1985) 7 EHRR 14).
(2) The legal basis must be 'accessible' to the person concerned, meaning that it must be published and comprehensible, and it must be possible to discover what its provisions are. The measure must also be 'foreseeable' meaning that it must be possible for a person to foresee its consequences for them and it should not 'confer a discretion so broad that its scope is in practice dependent on the will of those who apply it, rather than on the law itself' (Lord Sumption in Re Gallagher, ibid, at [17]).
(3) Related to (2), the law must 'afford adequate legal protection against arbitrariness and accordingly indicate with sufficient clarity the scope of discretion conferred on the competent authorities and the manner of its exercise' (S v United Kingdom, above, at [95] and [99]).
(4) Where the impugned measure is a discretionary power, (a) what is not required is 'an over-rigid regime which does not contain the flexibility which is needed to avoid an unjustified interference with a fundamental right' and (b) what is required is that 'safeguards should be present in order to guard against overbroad discretion resulting in arbitrary, and thus disproportionate, interference with Convention rights' (per Lord Hughes in Beghal v Director of Public Prosecutions [2016] AC 88 at [31] and [32]). Any exercise of power that is unrestrained by law is not 'in accordance with the law'.
(5) The rules governing the scope and application of measures need not be statutory, provided that they operate within a framework of law and that there are effective means of enforcing them (per Lord Sumption in Catt at [11]).
(6) The requirement for reasonable predictability does not mean that the law has to codify answers to every possible issue (per Lord Sumption in Catt at [11])."
"In our view, there is a clear and sufficient legal framework governing whether, when and how AFR Locate may be used. … The legal framework within which AFR Locate operates comprises three elements or layers (in addition to the common law), namely: (a) primary legislation; (b) secondary legislative instruments in the form of codes of practice issued under primary legislation; and (c) SWP's own local policies. Each element provides legally enforceable standards. When these elements are considered collectively against the backdrop of the common law, the use of AFR Locate by SWP is sufficiently foreseeable and accessible for the purpose of the 'in accordance with the law' standard."
"96. Drawing these matters together, the cumulative effect of (a) the provisions of the DPA, (b) the Surveillance Camera Code and (c) SWP's own policy documents, is that the infringement of Article 8(1) rights which is consequent on SWP's use of AFR Locate, occurs within a legal framework that is sufficient to satisfy the 'in accordance with the law' requirement in Article 8(2). The answer to the primary submissions of the Claimant and the Information Commissioner, is that it is neither necessary nor practical for legislation to define the precise circumstances in which AFR Locate may be used, e.g. to the extent of identifying precisely which offences might justify inclusion as a subject of interest or precisely what the sensitivity settings should be (c.f. Lord Sumption in Catt at [14]). Taking these matters as examples, the Data Protection Principles provide sufficient regulatory control to avoid arbitrary interferences with Article 8 rights. The legal framework that we have summarised does provide a level of certainty and foreseeability that is sufficient to satisfy the tenets of Article 8(2). It provides clear legal standards to which SWP will be held. As to the content of local policies, we take account that AFR Locate is still in a trial period. The content of SWP's policies may be altered and improved over the course of this trial. The possibility (or even the likelihood) of such improvement is not evidence of present deficiency.
97. Finally, under this heading, we refer to the comments by the Home Secretary (in her Biometrics Strategy) as to the legal framework within which AFR Locate presently operates (see above, at paragraph 67). In our view, when considered in context, these comments should be considered as amounting to pragmatic recognition that (a) steps could, and perhaps should, be taken further to codify the relevant legal standards; and (b) the future development of AFR technology is likely to require periodic re-evaluation of the sufficiency of the legal regime. We respectfully endorse both sentiments, in particular the latter. For the reasons we have set out already, we do not consider that the legal framework is at present out of kilter; yet this will inevitably have to be a matter that is subject to periodic review in the future."
"A power on which there are insufficient legal constraints does not become legal simply because those who may have resort to it exercise self-restraint. It is the potential reach of the power rather than its actual use by which its legality must be judged." (Emphasis added)
"The case law shows that the court has paid very close attention to the facts of particular cases coming before it, giving effect to factual differences and recognising differences of degree."
"… in order for the interference to be 'in accordance with the law', there must be safeguards which have the effect of enabling the proportionality of the interference to be adequately examined. Whether the interference in a given case was in fact proportionate is a separate question."
"This appeal is concerned with the systematic collection and retention by police authorities of electronic data about individuals. The issue in both cases is whether the practice of the police governing retention is lawful …" (Emphasis added)
"… It lays down principles which are germane and directly applicable to police information, and contains a framework for their enforcement on the police among others through the Information Commissioner and the courts."
"There is some suggestion in the cases of a relativist approach, so that the more intrusive the act complained of, the more precise and specific must be the law said to justify it."
Data Protection Act 2018
"The purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security."
"The Divisional Court erred in concluding that the interference with the Appellant's rights under Article 8(1) … occasioned by the Respondent's use of live automated facial recognition technology … on 21 December 2017 and 27 March 2018 and on an ongoing basis was/is in accordance with the law for the purposes of Article 8(2) ECHR."
The Surveillance Camera Code of Practice
"(a) closed circuit television or automatic number plate recognition systems,
(b) any other systems for recording or viewing visual images for surveillance purposes,
(c) any systems for storing, receiving, transmitting, processing or checking images or information obtained by systems falling within paragraph (a) or (b), or
(d) any other systems associated with, or otherwise connected with, systems falling within paragraph (a) (b) or (c)."
"Use of a surveillance camera system must always be for a specified purpose which is in pursuit of a legitimate aim and necessary to meet an identified pressing need."
"Any use of facial recognition or other biometric characteristic recognition systems needs to be clearly justified and proportionate in meeting the stated purpose, and be suitably validated. It should always involve human intervention before decisions are taken that affect an individual adversely."
"Any use of technologies such as ANPR or facial recognition systems which may rely on the accuracy of information generated elsewhere such as databases provided by others should not be introduced without regular assessment to ensure the underlying data is fit for purpose."
"A system operator should have a clear policy to determine the inclusion of a vehicle registration number or a known individual's details on the reference database associated with such technology. A system operator should ensure that reference data is not retained for longer than necessary to fulfil the purpose for which it was originally added to a database."
SWP's local policies
"These individuals could be persons wanted on suspicion for an offence, wanted on warrant, vulnerable persons and other persons where intelligence is required."
(1) Individuals suspected of criminality and who are wanted by the courts and police.
(2) Individuals who may pose a risk to themselves and others.
(3) Individuals who may be vulnerable.
"Concerns have been raised by privacy experts that an individual may seek to enquire as to whether they have been included in a watchlist outside of the 24-hours retention period. Therefore, it has been deemed appropriate to be able to re-engineer watchlists. This can now be achieved via Niche RMS 'back-end' database by recording the nominal number of an individual extracted into a watchlist for on given date, this added functionality is available from October 2018."
With respect, it seems to us that that deals with a technical matter and does not govern the question of who can be properly placed on a watchlist.
"3. AFR Locate
The system works by means of a pre-populated Watch List which will contain information and images of subjects and a pre-defined response should these subjects be located by the system.
Watchlists will be both proportionate and necessary for each deployment with the rationale for inclusion detailed pre-event in the AFR Locate deployment report.
Primary factors for consideration for inclusion within a watchlist will be watchlist size, image quality, image provenance and rationale for inclusion.
The numbers of images included within a watchlist cannot exceed 2,000 due to contract restrictions but in any event 1 in 1000 false positive alert rate should not be exceeded.
Children under the age of 18 will not ordinarily feature in a watchlist due to the reduced accuracy of the system when considering immature faces.
However, if there is a significant risk of harm to that individual a risk based approach will be adopted and rationale for inclusion evidenced within the deployment report.
The decision for an AFR deployment wherever possible will ultimately be made by the Silver Commander with the DSD project team acting as tactical advisors. Wherever possible the deployment of AFR Locate should be detailed with the Silver Commanders Tactical plan.
If a deployment does not feature a Silver Commander the rationale for deployment will be ratified by the Digital Services Division Inspector and be detailed within the AFR Locate Deployment Report."
"As we are testing the technology South Wales Police have deployed in all event types ranging from high volume music and sporting events to indoor arenas."
That simply underlines the concern that we have in this context. First, it is a descriptive statement and does not lay down any normative requirement as to where deployment can properly take place. Secondly, the range is very broad and without apparent limits. It is not said, for example, that the location must be one at which it is thought on reasonable grounds that people on the watchlist will be present. These documents leave the question of the location simply to the discretion of individual police officers, even if they would have to be of a certain rank (a "Silver Commander").
For the above reasons this appeal will be allowed on Ground 1.
Ground 2: Proportionality
"If an interference with Article 8(1) rights is to be justified it must meet the four-part test in Bank Mellat v Her Majesty's Treasury (No 2) [2014] AC 700, namely:
(1) whether the objective of the measure pursued is sufficiently important to justify the limitation of a fundamental right;
(2) whether it is rationally connected to the objective;
(3) whether a less intrusive measure could have been used without unacceptably compromising the objective; and
(4) whether, having regard to these matters and to the severity of the consequences, a fair balance has been struck between the rights of the individual and the interests of the community.
(See per Lord Sumption at [20]; and especially on question (3), per Lord Reed at [70] to [71] and [75] to [76])."
"Nevertheless, we are satisfied that the use of AFR Locate on 21st December 2017 (Queen's Street) and 27th March 2018 (Motorpoint Arena) struck a fair balance and was not disproportionate. AFR Locate was deployed in an open and transparent way, with significant public engagement. On each occasion, it was used for a limited time, and covered a limited footprint. It was deployed for the specific and limited purpose of seeking to identify particular individuals (not including the Claimant) who may have been in the area and whose presence was of justifiable interest to the police. On the former occasion it led to two arrests. On the latter occasion it identified a person who had made a bomb threat at the very same event the previous year and who had been subject to a (suspended) custodial sentence. On neither occasion did it lead to a disproportionate interference with anybody's Article 8 rights. Nobody was wrongly arrested. Nobody complained as to their treatment (save for the Claimant on a point of principle). Any interference with the Claimant's Article 8 rights would have been very limited. The interference would be limited to the near instantaneous algorithmic processing and discarding of the Claimant's biometric data. No personal information relating to the Claimant would have been available to any police officer, or to any human agent. No data would be retained. There was no attempt to identify the Claimant. He was not spoken to by any police officer."
"Whether, balancing the severity of the measure's effects on the rights of the persons to whom it applies against the importance of the objective, to the extent that the measure would contribute to its achievement, the former outweighs the latter (i.e. whether the impact of the rights infringement is disproportionate to the likely benefit of the impugned measure)." (Emphasis added)
"… On at least two occasions, in December 2017 and March 2018, the Claimant was targeted by the Defendant's use of AFR. Through this claim he challenges:
(i) the unlawful use of this technology against him on both occasions, and
(ii) the Defendant's ongoing use of AFR in public places in the police area in which he resides, giving rise to a clear risk of the technology again being used against him." (Emphasis added)
"It is the Claimant's case that he has twice been the subject of the Defendant's use of AFR technology …"
Ground 3: Compliance with section 64 of the DPA 2018
"If it is apparent that a data controller has approached its task on a footing that is demonstrably false, or in a manner that is clearly lacking, then the conclusion should be that there has been a failure to meet the section 64 obligation".
Ground 4: Compliance with section 42 of the DPA 2018
Ground 5: Public Sector Equality Duty
"A public authority must, in the exercise of its functions, have due regard to the need to—
(a) eliminate discrimination, harassment, victimisation and any other conduct that is prohibited by or under this Act;
(b) advance equality of opportunity between persons who share a relevant protected characteristic and persons who do not share it;
(c) foster good relations between persons who share a relevant protected characteristic and persons who do not share it."
"The Divisional Court was wrong to hold that the Respondent complied with the Public Sector Equality Duty in section 149 of the Equality Act 2010 in circumstances in which the Respondent's Equality Impact Assessment was obviously inadequate and based on an error of law (by failing to recognise the risk of indirect discrimination) and the Respondent's subsequent approach to assessing possible indirect discrimination arising from the use of AFR is flawed."
"In our view, on the facts of this case there is an air of unreality about the Claimant's contention. There is no suggestion that as at April 2017 when the AFR Locate trial commenced, SWP either recognised or ought to have recognised that the software it had licensed might operate in a way that was indirectly discriminatory. Indeed, even now there is no firm evidence that the software does produce results that suggest indirect discrimination. Rather, the Claimant's case rests on what is said by Dr Anil Jain, an expert witness. In his first statement dated 30th September 2018, Dr Jain commented to the effect that the accuracy of AFR systems generally could depend on the dataset used to 'train' the system. He did not, however, make any specific comment about the dataset used by SWP or about the accuracy of the NeoFace Watch software that SWP has licensed. Dr Jain went no further than to say that if SWP did not know the contents of the dataset used to train its system 'it would be difficult for SWP to confirm whether the technology is in fact biased'. The opposite is, of course, also true."
"156. Thus, SWP may now, in light of the investigation undertaken to date by Mr. Edgell, wish to consider whether further investigation should be done into whether the NeoFace Watch software may produce discriminatory impacts. When deciding whether or not this is necessary it will be appropriate for SWP to take account that whenever AFR Locate is used there is an important failsafe: no step is taken against any member of the public unless an officer (the systems operator) has reviewed the potential match generated by the software and reached his own opinion that there is a match between the member of the public and the watchlist face.
157. Yet this possibility of future action does not make good the argument that to date, SWP has failed to comply with the duty under section 149(1) of the Equality Act 2010. Our conclusion is that SWP did have the due regard required when in April 2017 it commenced the trial of AFR Locate. At that time, there was no specific reason why it ought to have been assumed it was possible that the NeoFace Watch software produced more or less reliable results depending on whether the face was male or female, or white or minority ethnic. As we have explained, even now there is no particular reason to make any such assumption. We note that although Dr Jain states that 'bias has been found to be a feature of common AFR systems' he does not provide an opinion on whether, or the extent to which, such bias can be addressed by the fail-safe, such as ensuring that a human operator checks whether there is in fact a match.
158. In our view, the April 2017 Equality Impact Assessment document demonstrates that due regard was had by SWP to the section 149(1) criteria. The Claimant's contention that SWP did not go far enough in that it did not seek to equip itself with information on possible or potential disparate impacts, based on the information reasonably available at that time, is mere speculation. In any event, as matters had developed in the course of the trial since April 2017, it is apparent from Mr. Edgell's evidence that SWP continues to review events against the section 149(1) criteria. This is the approach required by the public-sector equality duty in the context of a trial process. For these reasons, the claim made by reference to section 149(1) of the Equality Act 2010 fails."
(1) The PSED must be fulfilled before and at the time when a particular policy is being considered.
(2) The duty must be exercised in substance, with rigour, and with an open mind. It is not a question of ticking boxes.
(3) The duty is non-delegable.
(4) The duty is a continuing one.
(5) If the relevant material is not available, there will be a duty to acquire it and this will frequently mean that some further consultation with appropriate groups is required.
(6) Provided the court is satisfied that there has been a rigorous consideration of the duty, so that there is a proper appreciation of the potential impact of the decision on equality objectives and the desirability of promoting them, then it is for the decision-maker to decide how much weight should be given to the various factors informing the decision.
"The first of the modern equality duties was found again in section 71 of the RRA, but following amendments made to it by the Race Relations (Amendment) Act 2000 (enacting the General Race Equality Duty). The General Race Equality Duty in the amended section 71 of the RRA required that listed public authorities had 'due regard' to the need 'to eliminate unlawful racial discrimination' and 'to promote equality of opportunity and good relations between persons of different racial groups'. The Race Relations (Amendment) Act 2000 and the General Race Equality Duty within it were enacted to give effect to the recommendations in the Stephen Lawrence Inquiry Report and the Inquiry's findings of 'institutional racism'. The purpose of the General Race Equality Duty was to create a strong, effective, and enforceable legal obligation which placed race equality at the heart of the public authority's decision making. The new duty was intended to mark a major change in the law. It represented a move from a fault-based scheme where legal liability rested only with those who could be shown to have committed one or other of the unlawful acts. Instead, the duty-bearer, the public authority, was to be required to proactively consider altering its practices and structures to meet this statutory duty. This was considered important in light of the findings of the Stephen Lawrence Inquiry." (Emphasis added)
"It is the clear purpose of section 71 [the predecessor to section 149] to require public bodies … to give advance consideration to issues of race discrimination before making any policy decision that may be affected by them. This is a salutary requirement, and this provision must be seen as an integral and important part of the mechanisms for ensuring the fulfilment of the aims of anti-discrimination legislation. …"
"From my experience and the information available to me, I have seen no bias based on either gender or ethnicity. …"
"To minimise any impact of bias as a result of gender, the NeoFace Algorithm training data set contains roughly equal quantities of male and female faces."
At para. 24 he states that the NeoFace Algorithm training data includes a wide spectrum of different ethnicities and has been collected from sources in regions of the world to ensure a comprehensive and representative mix. He states that great care, effort and cost is incurred by NEC, as a socially responsible major corporation, to ensure that this is the case.
Procedural matters
Conclusion
1) The Respondent's use of Live Automated Facial Recognition technology on 21 December 2017 and 27 March 2018 and on an ongoing basis, which engaged Article 8(1) of the European Convention on Human Rights, was not in accordance with the law for the purposes of Article 8(2).
2) As a consequence of the declaration set out in paragraph 1 above, in respect of the Respondent's ongoing use of Live Automated Facial Recognition technology, its Data Protection Impact Assessment did not comply with section 64(3)(b) and (c) of the Data Protection Act 2018.
3) The Respondent did not comply with the Public Sector Equality Duty in section 149 of the Equality Act 2010 prior to or in the course of its use of Live Automated Facial Recognition technology on 21 December 2017 and 27 March 2018 and on an ongoing basis.
Legislation
Data Protection Act 1998 ("DPA 1998")
"… data which relate to a living individual who can be identified (a) from those data, or (b) from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller".
"… obtaining, recording or holding the information or data or carrying out any operation or set of operations on the information or data" [with a range of non-exhaustive examples given].
"… the duty of a data controller to comply with the data protection principles in relation to all personal data with respect to which he is the data controller" [subject to section 27(1) concerning the exemptions].
(1) Principle 1 is that personal data shall be processed fairly and lawfully and, in particular, shall not be "processed" at all unless it is necessary for a relevant purpose (referred to in Schedule 2 below). In the case of the police, the relevant purposes are the administration of justice and the exercise of any other function of a public nature exercised in the public interest.
(2) Principle 2 is that personal data may be obtained only for lawful purposes and may not be further "processed" in a manner incompatible with those purposes.
(3) Principle 3 is that the data must be "adequate, relevant and not excessive" for the relevant purpose.
(4) Principle 4 is that data shall be accurate and, where necessary, kept up to date.
(5) Principle 5 is that the data may not be kept for longer than is necessary for those purposes.
(6) Principle 6 is that personal data shall be processed in accordance with the rights of data subjects under this Act.
(7) Principle 7 is that proper and proportionate technical and organisational measures must be taken against the unauthorised or unlawful "processing" of the data.
(8) Principle 8 is that personal data shall not be transferred outside the European Economic Area unless the country ensures an adequate level of protection.
"1. The data subject has given his consent to the processing.
…
5. The processing is necessary—
(a) for the administration of justice,
(b) for the exercise of any functions conferred on any person by or under any enactment,
(c) for the exercise of any functions of the Crown, a Minister of the Crown or a government department, or
(d) for the exercise of any other functions of a public nature exercised in the public interest by any person."
Protection of Freedoms Act 2012 ("PFA 2012")
"(a) closed circuit television or automatic number plate recognition systems,
(b) any other systems for recording or viewing visual images for surveillance purposes,
(c) any systems for storing, receiving, transmitting, processing or checking images or information obtained by systems falling within paragraph (a) or (b), or
(d) any other systems associated with, or otherwise connected with, systems falling within paragraph (a), (b) or (c)." (emphasis added)
"(1) A relevant authority must have regard to the surveillance camera code when exercising any functions to which the code relates.
(2) A failure on the part of any person to act in accordance with any provision of the surveillance camera code does not of itself make that person liable to criminal or civil proceedings.
(3) The surveillance camera code is admissible in evidence in any such proceedings.
(4) A court or tribunal may, in particular, take into account a failure by a relevant authority to have regard to the surveillance camera code in determining a question in any such proceedings." (emphasis added)
"(a) Encouraging compliance with the surveillance camera code;
(b) Reviewing the operation of the code; and
(c) Providing advice about the code (including changes to it or breaches of it)."
Data Protection Act 2018 ("DPA 2018")
PART 3 LAW ENFORCEMENT PROCESSING
"29 Processing to which this Part applies
(1) This Part applies to—
(a) the processing by a competent authority of personal data wholly or partly by automated means, and
(b) the processing by a competent authority otherwise than by automated means of personal data which forms part of a filing system or is intended to form part of a filing system.
(2) Any reference in this Part to the processing of personal data is to processing to which this Part applies. …"
"34 Overview and general duty of controller
(1) This Chapter sets out the six data protection principles as follows—
(a) section 35(1) sets out the first data protection principle (requirement that processing be lawful and fair);
(b) section 36(1) sets out the second data protection principle (requirement that purposes of processing be specified, explicit and legitimate);
(c) section 37 sets out the third data protection principle (requirement that personal data be adequate, relevant and not excessive);
(d) section 38(1) sets out the fourth data protection principle (requirement that personal data be accurate and kept up to date);
(e) section 39(1) sets out the fifth data protection principle (requirement that personal data be kept for no longer than is necessary);
(f) section 40 sets out the sixth data protection principle (requirement that personal data be processed in a secure manner).
(2) In addition—
(a) each of sections 35, 36, 38 and 39 makes provision to supplement the principle to which it relates, and
(b) sections 41 and 42 make provision about the safeguards that apply in relation to certain types of processing.
(3) The controller in relation to personal data is responsible for, and must be able to demonstrate, compliance with this Chapter."
"35 The first data protection principle
(1) The first data protection principle is that the processing of personal data for any of the law enforcement purposes must be lawful and fair.
(2) The processing of personal data for any of the law enforcement purposes is lawful only if and to the extent that it is based on law and either—
(a) the data subject has given consent to the processing for that purpose, or
(b) the processing is necessary for the performance of a task carried out for that purpose by a competent authority.
(3) In addition, where the processing for any of the law enforcement purposes is sensitive processing, the processing is permitted only in the two cases set out in subsections (4) and (5).
(4) The first case is where—
(a) the data subject has given consent to the processing for the law enforcement purpose as mentioned in subsection (2)(a), and
(b) at the time when the processing is carried out, the controller has an appropriate policy document in place (see section 42).
(5) The second case is where—
(a) the processing is strictly necessary for the law enforcement purpose,
(b) the processing meets at least one of the conditions in Schedule 8, and
(c) at the time when the processing is carried out, the controller has an appropriate policy document in place (see section 42).
(6) The Secretary of State may by regulations amend Schedule 8—
(a) by adding conditions;
(b) by omitting conditions added by regulations under paragraph (a).
(7) Regulations under subsection (6) are subject to the affirmative resolution procedure.
(8) In this section, "sensitive processing" means—
(a) the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership;
(b) the processing of genetic data, or of biometric data, for the purpose of uniquely identifying an individual;
(c) the processing of data concerning health;
(d) the processing of data concerning an individual's sex life or sexual orientation."
"Article 10 Processing of special categories of personal data
Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be allowed only where strictly necessary, subject to appropriate safeguards for the rights and freedoms of the data subject, and only…"
Definitions
"…any information relating to an identified or identifiable living individual", which means an individual "who can be identified, directly or indirectly, in particular by reference to—(a) an identifier such as a name, an identification number, location data or an online identifier, or (b) one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of the individual".
"…the processing of… biometric data… for the purpose of uniquely identifying an individual."
"…personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual, such as facial images or dactyloscopic data".
Conditions
"1. Statutory etc purposes
This condition is met if the processing-
(a) is necessary for the exercise of a function conferred on a person by an enactment or rule of law, and
(b) is necessary for reasons of substantial public interest.
2. Administration of justice
This condition is met if the processing is necessary for the administration of justice.
…
6. Legal claims
This condition is met if the processing-
(a) is necessary for the purpose of, or in connection with, any legal proceedings (including prospective legal proceedings)…"
"42 Safeguards: sensitive processing
(1) This section applies for the purposes of section 35(4) and (5) (which require a controller to have an appropriate policy document in place when carrying out sensitive processing in reliance on… a condition specified in Schedule 8).
(2) The controller has an appropriate policy document in place in relation to the sensitive processing if the controller has produced a document which—
(a) explains the controller's procedures for securing compliance with the data protection principles (see section 34(1)) in connection with sensitive processing in reliance on the consent of the data subject or (as the case may be) in reliance on the condition in question, and
(b) explains the controller's policies as regards the retention and erasure of personal data processed in reliance on the consent of the data subject or (as the case may be) in reliance on the condition in question, giving an indication of how long such personal data is likely to be retained.
(3) Where personal data is processed on the basis that an appropriate policy document is in place, the controller must during the relevant period—
(a) retain the appropriate policy document,
(b) review and (if appropriate) update it from time to time, and
(c) make it available to the Commissioner, on request, without charge.
(4) The record maintained by the controller under section 61(1) and, where the sensitive processing is carried out by a processor on behalf of the controller, the record maintained by the processor under section 61(3) must include the following information—
(a) …which condition in Schedule 8 is relied on,
(b) how the processing satisfies section 35 (lawfulness of processing), and
(c) whether the personal data is retained and erased in accordance with the policies described in subsection (2)(b) and, if it is not, the reasons for not following those policies.
(5) In this section, "relevant period", in relation to sensitive processing …in reliance on a condition specified in Schedule 8, means a period which—
(a) begins when the controller starts to carry out the sensitive processing …in reliance on that condition, and
(b) ends at the end of the period of 6 months beginning when the controller ceases to carry out the processing."
"Data protection impact assessment
(1) Where a type of processing is likely to result in a high risk to the rights and freedoms of individuals, the controller must, prior to the processing, carry out a data protection impact assessment.
(2) A data protection impact assessment is an assessment of the impact of the envisaged processing operations on the protection of personal data.
(3) A data protection impact assessment must include the following—
(a) a general description of the envisaged processing operations;
(b) an assessment of the risks to the rights and freedoms of data subjects;
(c) the measures envisaged to address those risks;
(d) safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Part, taking into account the rights and legitimate interests of the data subjects and other persons concerned.
(4) In deciding whether a type of processing is likely to result in a high risk to the rights and freedoms of individuals, the controller must take into account the nature, scope, context and purposes of the processing."
Code and Guidance
Secretary of State's Surveillance Camera Code of Practice
"1. Use of a surveillance camera system must always be for a specified purpose which is in pursuit of a legitimate aim and necessary to meet an identified pressing need.
2. The use of a surveillance camera system must take into account its effect on individuals and their privacy, with regular reviews to ensure its use remains justified.
3. There must be as much transparency in the use of a surveillance camera system as possible, including a published contact point for access to information and complaints.
4. There must be clear responsibility and accountability for all surveillance camera system activities including images and information collected, held and used.
5. Clear rules, policies and procedures must be in place before a surveillance camera system is used, and these must be communicated to all who need to comply with them.
6. No more images and information should be stored than that which is strictly required for the stated purpose of a surveillance camera system, and such images and information should be deleted once their purposes have been discharged.
7. Access to retained images and information should be restricted and there must be clearly defined rules on who can gain access and for what purpose such access is granted; the disclosure of images and information should only take place when it is necessary for such a purpose or for law enforcement purposes.
8. Surveillance camera system operators should consider any approved operational, technical and competency standards relevant to a system and its purpose and work to meet and maintain those standards.
9. Surveillance camera system images and information should be subject to appropriate security measures to safeguard against unauthorised access and use.
10. There should be effective review and audit mechanisms to ensure legal requirements, policies and standards are complied with in practice, and regular reports should be published.
11. When the use of a surveillance camera system is in pursuit of a legitimate aim, and there is a pressing need for its use, it should then be used in the most effective way to support public safety and law enforcement with the aim of processing images and information of evidential value.
12. Any information used to support a surveillance camera system which compares against a reference database for matching purposes should be accurate and kept up to date."
"1.8 This code has been developed to address concerns over the potential for abuse or misuse of surveillance by the state in public places."
"2.1 Modern and forever advancing surveillance camera technology provides increasing potential for the gathering and use of images and associated information. These advances vastly increase the ability and capacity to capture, store, share and analyse images and information. This technology can be a valuable tool in the management of public safety and security, in the protection of people and property, in the prevention and investigation of crime, and in bringing crimes to justice. Technological advances can also provide greater opportunity to safeguard privacy. Used appropriately, current and future technology can and will provide a proportionate and effective solution where surveillance is in pursuit of a legitimate aim and meets a pressing need."
"2.2 In general, any increase in the capability of surveillance camera system technology also has the potential to increase the likelihood of intrusion into an individual's privacy. The Human Rights Act 1998 gives effect in UK law to the rights set out in the European Convention on Human Rights (ECHR). Some of these rights are absolute, whilst others are qualified, meaning that it is permissible for the state to interfere with the right provided that the interference is in pursuit of a legitimate aim and the interference is proportionate. Amongst the qualified rights is a person's right to respect for their private and family life, home and correspondence, as provided for by Article 8 of the ECHR."
"2.3 That is not to say that all surveillance camera systems use technology which has a high potential to intrude on the right to respect for private and family life. Yet this code must regulate that potential, now and in the future. In considering the potential to interfere with the right to privacy, it is important to take account of the fact that expectations of privacy are both varying and subjective. In general terms, one of the variables is situational, and in a public place there is a zone of interaction with others which may fall within the scope of private life. An individual can expect to be the subject of surveillance in a public place as CCTV, for example, is a familiar feature in places that the public frequent. An individual can, however, rightly expect surveillance in public places to be both necessary and proportionate, with appropriate safeguards in place."
"2.4 The decision to use any surveillance camera technology must, therefore, be consistent with a legitimate aim and a pressing need. Such a legitimate aim and pressing need must be articulated clearly and documented as the stated purpose for any deployment. The technical design solution for such a deployment should be proportionate to the stated purpose rather than driven by the availability of funding or technological innovation. Decisions over the most appropriate technology should always take into account its potential to meet the stated purpose without unnecessary interference with the right to privacy and family life. Furthermore, any deployment should not continue for longer than necessary."
"3.2.3 Any use of facial recognition or other biometric characteristic recognition systems needs to be clearly justified and proportionate in meeting the stated purpose, and be suitably validated4. It should always involve human intervention before decisions are taken that affect an individual adversely." (Footnote 4: "The Surveillance Camera Commissioner will be a source of advice on validation of such systems.")
"4.8.1 Approved standards may apply to the system functionality, the installation and the operation and maintenance of a surveillance camera system. These are usually focused on typical CCTV installations, however there may be additional standards applicable where the system has specific advanced capability such as ANPR, video analytics or facial recognition systems, or where there is a specific deployment scenario, for example the use of body-worn video recorders."
"4.12.1 Any use of technologies such as ANPR or facial recognition systems which may rely on the accuracy of information generated elsewhere such as databases provided by others should not be introduced without regular assessment to ensure the underlying data is fit for purpose."
"4.12.2 A system operator should have a clear policy to determine the inclusion of a vehicle registration number or a known individual's details on a reference database associated with such technology. A system operator should ensure that reference data is not retained for longer than necessary to fulfil the purpose for which it was originally added to a database."
Surveillance Camera Commissioner's AFR Guidance
SWP Documents
SWP Policy Document
"3. Compliance with Data Protection Principles
a) 'lawfulness and fairness'
The lawfulness of South Wales Police processing is derived from its official functions as a UK police service, which includes the investigation and detection of crime and the apprehension of offenders, including acting in obedience to court warrants that require the arrest of defendants who have failed to attend court.
b) 'data minimisation'
South Wales police only processes sensitive personal data when permitted to do so by law. Such personal data is collected for explicit and legitimate purposes such as biometric data during the deployment of Automatic Facial Recognition technology.
c) 'accuracy'
During AFR Locate deployments South Wales Police collects the information necessary to determine whether the individual is on a watchlist. If an intervention is made the process will not prompt data subjects to answer questions and provide information that is not required.
Where processing is for research and analysis purposes, wherever possible this is done using anonymised or de-identified data sets.
d) 'storage limitation'
Providing complete and accurate information is required when constructing a watchlist. During AFR Locate deployments watchlists will be constructed on the day of deployment and where the deployments extend beyond 24 hours these will be amended daily. Where permitted by law and when it is reasonable and proportionate to do so, South Wales Police may check this information with other organisations – for example other police and law enforcement services. If a change is reported by a data subject to one service or a part of South Wales Police, whenever possible this is also used to update the AFR application, both to improve accuracy and avoid the data subject having to report the same information multiple times.
e) 'integrity and confidentiality'
South Wales Police has a comprehensive set of retention policies in place which are published online, further information specific to AFR can be found on SWP AFR webpage.
All staff handling South Wales Police information are security cleared and required to complete annual training on the importance of security, and how to handle information appropriately.
In addition to having security guidance and policies embedded throughout SWP business, SWP also has specialist security, cyber and resilience staff to help ensure that information is protected from risks of accidental or unlawful destruction, loss, alteration, unauthorised disclosure or access."
SWP Standard Operating Procedures ("SOP")
(1) A stipulation that watchlists should be "proportionate and necessary" for each deployment and that the primary factors for inclusion on watchlists will be "watchlist size, image quality, image provenance and rationale for inclusion".
(2) The numbers of images included within a watchlist cannot exceed 2,000 due to contract restrictions "but in any event a 1 in 1000 false positive alert rate should not be exceeded".
(3) Children under the age of 18 will not normally feature in a watchlist due to "the reduced accuracy of the system when considering immature faces".
(4) The decision for an AFR deployment wherever possible will ultimately be made by the Silver Commander.
(5) The rationale for the deployment of AFR is to be recorded in a pre-deployment report.
(6) Signs advertising the use of the technology are to be deployed to ensure that where possible an individual is aware of the deployment before their image is captured.
(7) Interventions are not to be made on the basis of a similarity score alone and, when an intervention is made, the intervention officer will establish the identity of the individual by traditional policing methods.
(8) Details of the retention of different types of information gathered during an AFR deployment.
SWP Operational Advice
BEFORE: THE MASTER OF THE ROLLS, THE PRESIDENT OF THE QUEEN'S BENCH DIVISION, AND LORD JUSTICE SINGH
UPON hearing (by way of video link) Dan Squires QC and Aidan Wills of Counsel for the Appellant, Jason Beer QC and Francesca Whitelaw of Counsel for the Respondent, Gerry Facenna QC and Eric Metcalfe of Counsel for the Information Commissioner, Richard O'Brien of Counsel and Thomas Yarrow of Counsel for the Secretary of State for the Home Department, and Andrew Sharland QC and Stephen Kosmin of Counsel for the Surveillance Camera Commissioner
AND UPON receiving written submissions from Fiona Barton QC for the Police and Crime Commissioner for South Wales
AND UPON the parties having agreed that each shall bear their own costs
AND UPON the handing down of a judgment on 11 August 2020
IT IS ORDERED THAT:
AND IT IS DECLARED THAT: