Biometric data and facial recognition technology

The use of biometric data, and in particular the adoption of facial recognition technology (FRT), is increasingly being marketed to universities and other Higher Education Institutions (HEIs) for a range of applications aimed at enhancing security, convenience and operational efficiency. Potential uses of FRT include campus security and access control, for example restricting access to sensitive areas such as research labs, IT server rooms or residential accommodation. By scanning a student's or staff member's face, the system can verify identity and grant or deny access to specific facilities without the need for ID cards or passcodes. Other potential uses of FRT in HEIs include verifying student identities during online or in-person exams and monitoring attendance at lectures.

The use of facial recognition in HEIs does, however, raise significant data protection issues. FRT collects vast amounts of biometric data, which is sensitive and highly personal. Using facial recognition without adequate safeguards can create privacy risks, including unauthorised data access and profiling. Misuse or breach of this data could result in non-compliance with the UK GDPR, leading to severe penalties, claims by individuals and reputational harm. HEIs must therefore ensure they comply with the relevant data protection principles, including transparency, having a lawful and fair basis for processing, and implementing appropriate security measures and safeguards when deploying facial recognition systems. A data protection impact assessment (DPIA) should also be undertaken, and an institution would need to consider whether less intrusive approaches could achieve the intended purpose.

Regulatory decisions

Recent decisions by the Information Commissioner's Office (ICO) highlight ongoing concerns about, and scrutiny of, the use of FRT, particularly in relation to data protection and privacy rights.

In 2023 the ICO cleared the use of Facewatch's live facial recognition technology, particularly in retail settings, as compliant with the UK's data protection laws. Facewatch's system, which is used to prevent shoplifting, was challenged by privacy advocacy groups for being overly intrusive. However, the ICO concluded that the technology adhered to legal standards, emphasising that its use was proportionate to the security risks it sought to address. Despite this approval, privacy concerns remain, particularly over transparency and the potential for misuse.

However, in February 2024, in the case of Serco Leisure, the ICO found the use of facial recognition and fingerprint scanning to monitor over 2,000 employees' attendance unlawful. The ICO ruled that Serco had failed to justify why such intrusive methods were necessary, especially when less invasive options, such as ID cards or fobs, were available. Additionally, employees were not offered alternatives to biometric data collection, creating a coercive environment. The ICO issued a formal reprimand and an enforcement notice requiring Serco to destroy the unlawfully collected biometric data within three months.

In July 2024 Chelmer Valley High School was reprimanded by the ICO for using FRT in its canteen for payments without a DPIA, a legal requirement under the UK GDPR when undertaking processing likely to result in a high risk to individuals' rights and freedoms. The school also failed to consult its Data Protection Officer or adequately inform students and parents. The ICO stressed that deploying FRT, especially where children are involved, requires thorough risk assessments to protect individuals' rights and privacy.

The ICO has also flagged concerns about the broader implementation of facial recognition in public spaces, especially regarding how sensitive biometric data is collected and processed. The ICO's guidance on biometric data emphasises the need for clear governance, security, accountability, and strict necessity and proportionality in the use of such technologies. Individuals should be informed clearly and transparently about how their biometric data will be used, and an appropriate lawful basis and conditions for processing biometric data must be in place.

Exercise caution

The balancing act between security, convenience and privacy will remain a critical issue as FRT becomes more widely adopted and the technology in this space continues to develop rapidly.

In the higher education sector these developments suggest that institutions considering facial recognition must be particularly cautious, ensuring compliance with stringent data protection rules while addressing the privacy concerns of staff, students and the wider community. HEIs should support compliance through robust data governance, staff and student awareness, and strong cybersecurity defences.

Contact

Tamsin Morris

+44 161 235 5449
