Facial Recognition in Public Spaces Raises Data Protection Concerns

You are walking along a side street towards your office. Unbeknown to you, a private company has installed closed-circuit television (CCTV) with facial recognition capabilities and is tracking your movements. Is this lawful?

On 15 August 2019, the Information Commissioner, Elizabeth Denham, released a statement which may address this very issue.

Ms Denham advised that she would be investigating the use of facial recognition technology at a privately owned development in King’s Cross, London. Ms Denham stated that she is “deeply concerned about the growing use of facial recognition in public spaces, not only by law enforcement agencies but also increasingly by the private sector.”

The investigation follows a blog post by Ms Denham in July stating that live facial recognition technology was a high-priority area for the ICO and that her office was investigating and monitoring trials by South Wales Police and the Metropolitan Police.

Live facial recognition technology is different to conventional CCTV monitoring. Using biometrics (measurements of certain physical and physiological features), the technology maps facial features and identifies particular individuals by matching them against a database of known faces. The technology has been used for some years by certain public and government agencies, but with the advent of AI and machine learning it has become more prevalent in the private sector.
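By way of illustration only, the sketch below shows in simplified form how a matching step of this kind might work: a captured face is reduced to a numeric “embedding” and compared against a database of known faces. This is a minimal sketch, not any vendor’s actual implementation; the embedding values, the similarity threshold and the database of names are all hypothetical placeholders.

```python
import numpy as np

# Hypothetical database of known faces: name -> embedding vector.
# In a real system the embeddings would be produced by a trained face
# recognition model; here they are random placeholders.
rng = np.random.default_rng(seed=0)
known_faces = {name: rng.normal(size=128) for name in ["alice", "bob"]}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray, threshold: float = 0.6):
    """Return the best-matching identity, or None if below the threshold.

    The threshold is a tuning parameter: set too low, the system produces
    false positives (one of the accuracy concerns discussed below); set
    too high, genuine matches are missed.
    """
    best_name, best_score = None, threshold
    for name, embedding in known_faces.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A face captured by a camera would be converted to an embedding by the
# same model; here we simply perturb a known face slightly.
probe = known_faces["alice"] + rng.normal(scale=0.1, size=128)
print(match_face(probe))  # likely "alice"
```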

Facial Recognition Concerns

Whilst the privacy legal framework for law enforcement is different to that for private companies, the privacy concerns about the use of facial recognition software in public spaces remain the same.

Some threats to privacy include:

  • Lack of transparency – intrusion into the private lives of members of the public who have neither consented to, nor are aware of, the collection of their images or the purposes for which those images are collected and stored.
  • Misuse – images retrieved may be used for purposes other than those consented to or notified.
  • Accuracy – inherent bias within the technology may result in false positive matches or discrimination.
  • Automated decision-making – decisions which may significantly affect individuals may be based solely on the output of the facial recognition software.

Processing of Personal Data by Private Companies

Organisations which process personal data within the UK must comply with the General Data Protection Regulation (GDPR) and the Data Protection Act 2018.

Processing of personal data may only be undertaken if one of the grounds under Article 6 of the GDPR applies. Further, where the processing involves special category data, which includes biometric data, a further justification must be found within Article 9.

Article 6(1) lists the lawful bases for processing. In the context of video surveillance, the applicable bases include:

- the consent of the individuals concerned (Article 6(1)(a)). For consent to be valid under the GDPR it must be freely given, specific, informed and unambiguously given prior to the processing.

- necessity for the legitimate interests pursued by the controller (Article 6(1)(f)). Processing for the purposes of the ‘legitimate interests of a controller’ will be lawful unless such interests are overridden by the fundamental rights and freedoms of the individual.

Facial Recognition May Be Processing Special Category Data

If the processing of personal data involves special category data, then in addition to identifying a lawful basis under Article 6, an exemption must also be found in Article 9 to justify such processing.

Facial recognition technology will collect a type of special category data, biometric data, if it is capable of uniquely identifying an individual. Biometric data relates to the physical, physiological or behavioural characteristics of a person.

Therefore, if facial recognition technology is used to identify a particular individual, as opposed to a category of persons (such as profiling customers by race, gender or age), it will be processing biometric data.

Article 9(2) of the GDPR lists a limited number of exemptions which may justify processing special category data. These grounds include: the explicit consent of the data subject; the vital interests of the data subject (such as an immediate medical emergency); necessity for the establishment, exercise or defence of legal claims; processing of personal data manifestly made public by the individual; substantial public interest; various medical and public health reasons; and scientific research or statistics.

The European Data Protection Board (EDPB) recently issued draft guidelines for public consultation, Guidelines 3/2019 on processing of personal data through video devices. The draft guidelines specifically address the use of facial recognition technology.

The EDPB is an independent European body established under the GDPR and publishes guidance on the application of European data protection laws.

The draft guidelines make some interesting observations about both the use of CCTV and facial recognition:

  1. Video surveillance may be necessary to protect the legitimate interests of a controller, such as the protection of property against burglary, theft or vandalism (para 19).
  2. Video surveillance measures should only be chosen if the purpose could not reasonably be fulfilled by other means which are less intrusive to fundamental rights and freedoms (para 24).
  3. It may be necessary for video surveillance to cover not just the property itself but, in some cases, the immediate surroundings of the premises, in which case protective measures such as blocking out or pixelating the surrounding areas could be employed (para 27); a simple pixelation sketch follows this list.
  4. In respect of facial recognition, the draft guidelines urge caution. The EDPB appears to suggest that, whilst other exemptions may arguably be available for processing special category data, in the context of private organisations explicit consent may in most cases be required (para 76).
  5. Where explicit consent is required, an organisation cannot make access to its services conditional on consent to the processing, but must offer an alternative solution that does not involve facial recognition (para 85).
  6. In cases where the technology captures passers-by, an exemption under Article 9 will still be required for these individuals (para 83). The difficulty in the case of passers-by is that consent must be obtained before the processing begins, so either another exemption under Article 9 must apply or the processing may be unlawful.
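As a purely illustrative example of the “pixelating” measure mentioned in point 3, the sketch below coarsens a rectangular region of a captured frame so that individuals in that area can no longer be identified. It is a minimal sketch assuming the OpenCV library (cv2) is available; the file names and region coordinates are hypothetical placeholders for wherever the out-of-boundary area falls in a particular camera’s view.

```python
import cv2

def pixelate_region(frame, x, y, w, h, blocks=10):
    """Pixelate a rectangular region of a video frame.

    Downscaling the region to a coarse grid and scaling it back up
    discards the detail needed to identify individuals in that area.
    """
    region = frame[y:y + h, x:x + w]
    small = cv2.resize(region, (blocks, blocks), interpolation=cv2.INTER_LINEAR)
    frame[y:y + h, x:x + w] = cv2.resize(
        small, (w, h), interpolation=cv2.INTER_NEAREST
    )
    return frame

# Hypothetical usage: mask the strip of public pavement visible at the
# edge of the camera's field of view before the footage is stored.
frame = cv2.imread("frame.png")  # hypothetical captured frame
frame = pixelate_region(frame, x=0, y=300, w=640, h=180)
cv2.imwrite("frame_masked.png", frame)
```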

If the conclusions of the ICO’s investigation into the King’s Cross matter reflect the views of the EDPB (that explicit consent is likely to be required for facial recognition), the flow-on effects for companies could be widespread. Certainly, companies which use facial recognition on individuals in public areas ought now to be reviewing their data protection compliance and procedures.

For advice and guidance on this or any data protection concerns, please contact Chrysilla de Vere, head of Clarkslegal’s Data Protection Team.

Link: https://www.clarkslegal.com/Blog/Post/Facial_Recognition_in_Public_Spaces_Raises_Data_Protection_Concerns
