On Wednesday, the Office of the Privacy Commissioner (OPC) released a report (Report) setting out its findings on a recent trial of facial recognition technology (FRT) by a large New Zealand grocery company (Trial).
The Report found that, although any use of FRT requires "strong justifications and careful system design to ensure all appropriate privacy safeguards are built in", the live FRT operating model deployed during the Trial complied with the Privacy Act 2020 (Act).
Key takeaways for other organisations
The Report emphasised that the OPC's compliance finding was "not a green light for more general use of FRT". Nonetheless, the Report provides useful guidance that organisations ought to bear in mind when implementing higher-risk data processing technologies such as FRT, including the following key principles:
- Privacy-first operating model: The operating model for any FRT (being the technology, policies, communications, staff training, complaints handling, etc.) must be designed to either reduce privacy risk to an acceptable level or remove it altogether. Privacy protections that are appropriate for one use of FRT will not necessarily be appropriate for another. Each new use of FRT must be considered on its own merits and carefully justified in an organisation's individual context.
- Have a clear purpose: The purpose for using FRT should be specific, clearly defined and justifiable, and the FRT system implemented must be necessary to address that purpose effectively. Key questions to consider include:
- What is the specific problem you are trying to solve (your purpose) and how serious is the problem?
- What options are available outside of FRT and how do they compare from an effectiveness perspective? For example, are there less intrusive measures which could be implemented instead?
- Is the use of FRT a proportionate way to address the relevant problem?
- Select FRT carefully: Any FRT system used must be "fit for purpose". Organisations should consider both whether the technology performs at a sufficiently high level to achieve its purpose and whether it has been trained on an appropriate dataset to reduce the risk of bias.
- Keep FRT data only for as long as necessary: Organisations should assess how long FRT images reasonably need to be retained to serve their purpose. In the Trial, images that did not trigger a match against the store's watchlist were deleted almost instantly (i.e. within 59 seconds), and images that were matched but on which no action was taken were deleted by midnight the same day. As soon as images can be deleted, they should be.
- Put appropriate safeguards and checks in place: For example, in the Trial, each alert was reviewed by two trained staff members, who then decided whether to intervene or call the Police, and images of children and young people under 18, or of vulnerable persons, were not to be added to watchlists.
- Carry out a Privacy Impact Assessment (PIA) and keep it up to date: The OPC considers an initial PIA to be an "essential tool" for identifying and managing risks. A PIA is required at the outset, but it should also be updated over time to take into account any changes made to the FRT.
- Train staff: Staff training should be provided initially and refreshed periodically. The Report recommends that any such training uses real-life examples wherever possible.
- Ensure transparency: There must be a reasonable degree of transparency with data subjects that FRT is operating. Exactly how transparency ought to be achieved will differ according to the particular use case. In the Trial, this included clear signage (A1/A0-sized signs at the store entrance and additional signs in store), information displayed on the relevant website, and staff trained to answer FRT-related questions.
- Provide a complaints and requests process: Use of FRT must be accompanied by an efficient complaints and requests process that allows those identified in images to make complaints if they are misidentified, or to have any erroneous information about them corrected or removed.
- Test your FRT: The Report recommends testing FRT pre-deployment and over time to ensure that it continues to achieve your purpose in a way that minimises privacy risk. The more serious the effect on individuals, the higher the level of care required to ensure that the information obtained remains fit for purpose.
- Continue to review any use of FRT: Linked to the point above, even if FRT is effective at the outset, organisations will need to ensure that it continues to be effective and justified over time. Good recordkeeping will help organisations monitor the FRT's ongoing effectiveness.
The OPC noted that it does not expect every organisation to trial FRT prior to adoption in the same manner as the Trial. However, organisations that do not run a trial should still test the FRT pre-deployment and over time, and make any necessary adjustments to ensure the system operates effectively and safely.
Biometrics Code
Further guidance on the use of FRT will be available when the OPC releases the Biometric Processing Privacy Code (Biometrics Code), which is expected in mid-2025. The Biometrics Code will regulate the collection and use of biometric information by automated processing, which would include an image of a person's face in an FRT system. For further background on the Biometrics Code, please see our related Insights here.
We will continue to monitor guidance around FRT, the Biometrics Code and other related developments, as well as the OPC's activities in this space. If you would like to discuss the implementation of FRT within your organisation or how the Biometrics Code may impact your organisation, please get in touch with one of our experts listed below.