Live facial recognition technology in the real estate sector

Overview

Facial recognition technology identifies or verifies a person's identity using their facial features. Facial verification software can be used to establish someone's identity, for instance as a way to unlock electronic devices or to help with passport checks at airports.

In the real estate context, it is also commonly found within entry systems which automatically grant entry for staff and authorised visitors instead of issuing a key fob or a card. These sorts of uses are usually carried out with the subject’s knowledge and usually work to their advantage by speeding up processes and providing enhanced security.

Facial identification systems are most often encountered on social media platforms, where they automatically identify the same face across multiple photos and can link a face to a profile. The police also use facial identification, for instance on still images taken from CCTV or social media, to identify someone against their custody image database.

However, live facial recognition technology (“LFRT”) is more controversial. It works by using facial recognition software to scan real time video footage, detecting faces in a frame and then checking them against set criteria, such as a watchlist. Any matches that clear a pre-set threshold are then ranked and displayed. It functions more like CCTV and has sometimes been used without the knowledge of the data subjects.

Why is LFRT controversial?

Concerns have been mounting about the use of LFRT in public places, for example:

  • Several police forces have used LFRT systems near football matches and music concerts, public demonstrations, and during a royal tour, and the Metropolitan Police states that it “uses LFRT to help tackle serious violence, gun and knife crime, and child sexual exploitation”.
  • LFRT has also been used by private security companies employed by commercial landowners to safeguard their residential, retail and leisure schemes, in order to detect and alert police to the presence of certain individuals or high-risk missing persons.
  • Retailers have used LFRT to identify known shoplifters or people engaged in antisocial behaviour in stores, as well as to anonymously track the movements of customers for marketing purposes and to inform the design of buildings and shop fittings. For example, the systems can show retailers how long each customer spends queuing, dwelling, and travelling in store, and can link these movements to an individual’s account, enabling the retailer to track repeat customers.

These examples raise issues around the potential for some uses of LFRT in public places to i) undermine an individual's privacy; ii) entrench bias, bearing in mind that some systems have varying levels of accuracy according to the subject’s demographic group; and iii) enable corporate and/or public organisations to wield disproportionate power to monitor the population, which could potentially undermine rights such as the freedom of expression and association.

The Information Commissioner and LFRT

The use of LFRT for law enforcement purposes is subject to the UK General Data Protection Regulation ("UK GDPR"), the Data Protection Act 2018, the Human Rights Act 1998, the Equality Act 2010, and the Protection of Freedoms Act 2012. Its use by private sector organisations for marketing purposes is limited by the UK GDPR and the Data Protection Act 2018.

In 2020, in the case of R (on the application of Edward Bridges) v The Chief Constable of South Wales Police [2020] EWCA Civ 1058 the Court of Appeal considered the challenge raised by the civil liberties organisation Liberty against South Wales Police’s use of LFRT in public places. The police were running a long-term trial in which they were deploying surveillance cameras to capture digital images of members of the public, which were then processed and compared with digital images of persons on a police watchlist. The Court agreed that this use of LFRT breached privacy rights, data protection laws and equality laws.

There has therefore been a growing awareness that, while LFRT is still in its infancy, it is important to ensure that it does not expand without due regard for data protection and privacy rights. To this end, the Information Commissioner's Office (the "ICO") recently published a Commissioner's Opinion (the "Opinion") on the use of LFRT. It sets out the rules for the future use of LFRT, which are summarised below. In considering any regulatory action or use of her enforcement powers, the Commissioner may refer to the Opinion as a guide to how she interprets and applies the law, and may update or revise the Opinion based on any material legal or practical developments in this evolving area.

What key data protection issues must be considered when using LFRT?

(i) The user must identify a specified, explicit and legitimate purpose for using LFRT in a public place.

  • The purpose must be sufficiently important to justify the processing of the personal data in question.
  • The processing of the personal data must be reasonably necessary. Where alternative measures can be taken, users should be able to demonstrate that they have discounted them for adequate reasons. For example, an occupier that is using LFRT for security purposes must show why other means of guarding against theft are inadequate. Similarly, retailers that use LFRT for marketing or targeted advertising must substantiate why other possible means of analysing customer needs are not appropriate. They could also consider deploying LFRT only during certain limited times to obtain a snapshot of their consumer base, rather than collecting a constant stream of biometric data.

(ii) The user must identify a valid lawful basis and meet its requirements.

Users of LFRT should be clear as to the grounds relied upon under data protection legislation to process the data that is collected via LFRT. Additional rules apply where biometric data is being processed (which is usually the case with LFRT), meaning that further justification must be provided for processing the data (and, in many cases, set out in an appropriate policy document). This justification is usually that the processing is:

  • in the substantial public interest, for example because it is required to 'prevent or detect unlawful acts' or to 'safeguard children and individuals at risk'. The use of LFRT must be necessary to achieve this purpose.

(iii) “Data protection by design and default” approach

Users of LFRT must adopt a data protection by design and default approach to their use of LFRT. In this context, users of LFRT must:

  • balance their need to use LFRT as against the risk such use poses to the rights and freedoms of individuals;
  • implement technical and organisational measures so that only the personal data that is necessary and specific to the purpose is processed; and
  • design their LFRT systems, and the ways in which they are used, in accordance with data protection principles such as the principle of minimisation, which requires that the personal data being processed is adequate, relevant and limited to what is necessary in relation to the purposes for which it is processed.

LFRT users are likely to need to complete a data protection impact assessment under Article 35 of the UK GDPR. If a data protection by design and default approach is taken, the assessment will be easier to complete comprehensively as many of the key points will have been assessed (and risks mitigated) during the design stage.

(iv) Transparency

Businesses using LFRT should clearly communicate with the public that their data is being processed, the purpose of processing, and how they can obtain more information about this processing. The business should have clear processes (and staff expertise) to be able to communicate information about how and why LFRT is used, in an accessible way if so requested. As LFRT involves more sensitive data than CCTV, the Opinion recommends that businesses should not merely adapt CCTV signage but rather "consider more extensive and effective measures" to aid public understanding, such as literature on social media platforms and websites.

(v) Accuracy and avoidance of discrimination and bias

Businesses should ensure that their LFRT systems have been robustly tested (including by testing accuracy across a diverse range of individuals) so that, as far as possible, the LFRT is free from statistical inaccuracy, discrimination and bias.

What does this mean for owners and managers of public places?

The real estate sector is live to the advantages that technology can bring: systems which monitor the energy efficiency of buildings; smart sensor technology which helps control energy distribution across a building in real time, ensuring that power for lighting and heating is only consumed in parts of the property where someone is present; and the Customer Relationship Management systems that property managers use to manage their relationships and interactions with occupiers.

However, mapping the Opinion onto the real estate sector, it is clear that if a landlord or its property manager is considering operating LFRT in the common parts of its estate, or if an occupier is thinking about installing LFRT in its office blocks or shopping units, then they should:

  • complete a data protection impact assessment. As part of this process, they must assess the risks and potential impacts on the interests, rights and freedoms of individuals. This includes any direct or indirect impact on their data protection rights and wider human rights such as freedom of expression, association and assembly. This assessment should be regularly updated, and the Commissioner recommends that it should include an element of public consultation; and
  • carefully evaluate their plans with a rigorous level of scrutiny. The law requires them to demonstrate that their processing can be justified as fair, necessary and proportionate.

Together, these requirements mean that where LFRT is used in public places, there is a high bar for its use to be lawful.