Facial recognition technology: Is there a path forward for the Australian retail sector?

The lawful use of facial recognition technology, particularly in a commercial retail setting, is increasingly challenging under Australian privacy law.

Even where the primary purpose for deploying the technology is crime or fraud prevention, consent must be obtained (other than in limited circumstances) from all individuals whose facial images are captured and used by the technology for facial verification or identification purposes. This follows the Australian Privacy Commissioner’s investigations into the use of facial recognition technology by Bunnings and Kmart, which ultimately resulted in the Commissioner finding both retailers had breached the Australian Privacy Principles. While Bunnings has appealed the Commissioner’s finding, the question remains whether (and how) the technology can be implemented and used by Australian organisations regulated by the Privacy Act, and whether such use aligns with evolving community expectations in Australia around surveillance and the impact on an individual’s privacy.

The path forward, at this stage, for Australian organisations considering the use of facial recognition technology (particularly in the retail sector) is not a clear run. It requires clarity on the purpose and proportionality of its use, how valid consent can be obtained (and maintained), and the appropriate privacy safeguards to mitigate the risk of privacy harm for individuals.

What is facial recognition technology?

Facial recognition technology, or FRT, involves the collection of a digital image of an individual’s face and the extraction of their distinct facial features (or vectors) to create a biometric template. This template is then compared against one or more biometric templates stored in a database for the purposes of facial verification or identification. Increasingly, this technology is being deployed not only for law enforcement purposes but also in the commercial retail environment.
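The template-comparison step described above can be sketched, at a very high level, as a similarity search against a watchlist database. The vectors, similarity measure, threshold, and watchlist structure below are illustrative assumptions for explanation only; they do not reflect any particular vendor’s system or the systems considered in the Determinations.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (biometric templates).

    Real FRT systems use high-dimensional embeddings produced by a
    neural network; short lists of floats stand in for them here.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return the ID of the best-scoring watchlist entry at or above the
    threshold, or None for a non-match (which a privacy-protective
    deployment, like the Foodstuffs trial, would delete immediately).

    The threshold value is a hypothetical tuning parameter.
    """
    best_id, best_score = None, threshold
    for entry_id, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = entry_id, score
    return best_id
```

In practice the choice of threshold trades false matches against missed matches, which is one reason regulators emphasise human verification of any alert rather than fully automated decision-making.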

The use of FRT first came under the scrutiny of Australia’s privacy regulator in 2021, when then-Commissioner Angelene Falk considered whether Clearview AI, through its use of FRT to collect Australians’ facial images and biometric information, had breached the Privacy Act (it had).

How are facial images and features treated under the Privacy Act?

In Clearview AI, the Commissioner confirmed that:

  • facial images, such as in digital photos uploaded to social media platforms or captured in CCTV footage, are personal information under the Privacy Act, where an individual is identified or reasonably identifiable from the images; and
  • where digital images of a person’s face are converted to biometric templates, or where biometric information (such as facial features) is used for automated biometric verification or identification purposes, it is sensitive information.

Sensitive information is afforded a higher level of protection under the Privacy Act and can, generally, only be collected (which includes through generation of new sensitive information) with the consent of the individual, unless an exception applies.

The findings in the Clearview AI Determination are now supported by the subsequent Bunnings and Kmart Determinations in 2024 and 2025, respectively.

Bunnings Determination 

In November 2024, Commissioner Carly Kind found Bunnings breached Australians’ privacy by collecting their personal and sensitive information through the use of FRT within 63 stores over a three-year period. The FRT captured the faces of every person who entered those stores (“likely hundreds of thousands of individuals”). Individuals’ facial images were compared against a database of people Bunnings had identified as posing a risk, for example, due to past crime or violent behaviour.

Bunnings’ position was (and remains) that the use of FRT is essential to its operations: for the safety of staff and customers, and to proactively prevent theft and acts of violence within its stores. It therefore sought to rely on one or more of the “permitted general situation” exceptions under the Privacy Act to collect sensitive information without consent.

However, the Commissioner determined that no exception applied to permit Bunnings to collect the facial images without consent, and that Bunnings had failed to comply with several of its obligations under the Australian Privacy Principles with respect to its use of FRT.

In a post about the Bunnings Determination, Commissioner Kind acknowledged “the potential for facial recognition technology to help protect against serious issues, such as crime and violent behaviour”. However, she warned that any possible benefits need to be weighed against the impact on privacy rights, as well as our collective values as a society: “just because a technology may be helpful or convenient, does not mean its use is justifiable. In this instance, deploying facial recognition technology was the most intrusive option, disproportionately interfering with the privacy of everyone who entered its stores, not just high-risk individuals.”

This warning was further echoed in the 2025 Determination following the Commissioner’s investigations into Kmart’s use of FRT.

Kmart Determination

In September 2025, the Commissioner found Kmart also breached its obligations under the Privacy Act through the use of FRT within its stores. Kmart’s use of FRT was for a slightly different purpose: to detect where individuals might be committing refund fraud and to identify people who had been suspected of committing refund fraud or theft.

Kmart used FRT in 28 of its stores over a two-year period. Everyone who entered these stores during this time, including where they presented to a returns counter, had their facial images captured and analysed by FRT. The Commissioner found this was a collection of biometric information by Kmart, for which it did not have consent, and that Kmart’s use of FRT within its stores was disproportionate to its stated purpose.

Like Bunnings, Kmart also sought to rely on the “permitted general situation” exception to handle sensitive information without consent, on the basis that it was necessary to address unlawful activities or serious misconduct (that is, the refund fraud). The Commissioner found that this exception did not apply.

In a blog about the Kmart Determination, Commissioner Kind stated: “the collection of biometric information [of everyone entering the stores]… when only a small proportion may have been involved in or suspected of refund fraud, was a disproportionate interference with privacy. I also considered that the FRT system had limited utility and there were other less privacy-intrusive methods available to address refund fraud”. She further commented that the “threshold for reliance on the exemptions in the Privacy Act in relation to the need to gain consent… is a high bar that must be cleared…for good reason”.

Lessons from across the ditch

In May 2025, the New Zealand Privacy Commissioner released the results of its inquiry into supermarket chain Foodstuffs’ trial of FRT, conducted across 25 supermarkets in the North Island. The trial considered whether the use of FRT by Foodstuffs was an effective tool in reducing serious retail crime compared with other, less privacy-intrusive options.

Rather than a regulator investigation or enforcement action, this was a collaborative exercise: the Office of the Privacy Commissioner (OPC) engaged with Foodstuffs as it developed its operating model and provided feedback on privacy controls as the trial progressed. It was an innovative and collaborative approach to navigating a difficult problem.

The OPC’s overall finding following the trial was that the FRT operating model deployed by Foodstuffs complied with the New Zealand Privacy Act: “The OPC found while the level of privacy intrusion was high because every visitor’s face is collected, the privacy safeguards used in the trial reduced it to an acceptable level”.

The privacy safeguards included:

  • Fit-for-purpose technology: the system was set up to identify only people who had engaged in seriously harmful behaviour, particularly violent offending, with immediate deletion of images that did not result in a positive match against the watchlist.
  • Human in the loop: match alerts were verified by trained staff, ensuring that human decision-making was a key part of the process.
  • Restricted access: access to the FRT system and information was limited to trained, authorised staff only, and there was no sharing of watchlist information between stores.
  • No training use: images collected were not permitted to be used for training data purposes.

The Biometric Processing Privacy Code, made under the New Zealand Privacy Act and in force from November 2025, provides further guidance and guardrails for the safe use of biometrics, including FRT.

While the OPC did not sign off on the permanent use of FRT by Foodstuffs, it did help identify privacy safeguards and improvements that would need to be made ahead of any such roll-out, including effective monitoring to mitigate bias and inaccuracies, and ongoing review of the use of FRT to ensure it remains an effective tool for reducing serious harm offending.

Can Australian organisations lawfully deploy FRT?

The Privacy Commissioner has made it clear that the use of FRT is not, in and of itself, illegal in Australia.

Where the OAIC investigates the use of FRT, it will consider the use on a case-by-case basis, taking into account the specific contextual factors at play. The OAIC’s published guide for businesses considering using FRT in a commercial or retail setting is instructive as to the factors likely to be relevant in such investigations.

For regulated entities considering implementing FRT in their operations or as part of their service offerings, the starting point needs to be clarity on the purpose for such use, whether the use of FRT is proportionate to such purpose, and how the use of FRT might impact the privacy of individuals.

Often, conducting a Privacy Impact Assessment (PIA) prior to implementation, and ensuring a privacy-by-design approach is adopted, is the best way to go about this (though it should not be the total sum of consideration given to the risks associated with the use of FRT). A PIA will enable the entity to consider the issues identified by the Commissioner in the Bunnings and Kmart Determinations, the potential risks of privacy harm, and risk controls. The lessons learned and privacy safeguards in the New Zealand Foodstuffs trial are also useful reference points.

Even if the legal assessment is that FRT can be deployed in compliance with the APPs, and risk mitigations have been identified and applied following the PIA, organisations should still consider whether FRT should be used in their particular circumstances. That is: what type of place will the FRT be used in, what are customer and broader community expectations around the use of FRT in that place, and what are the potential ethical and reputational impacts of implementing the technology?

Given the potential for changes in regulatory response and community expectations, and the inherent risks of using FRT, it will be important that any decision to implement and use FRT is kept under review, not only from a regulatory compliance perspective but also from ethical and reputational perspectives. The PIA (and any supporting risk management framework) should identify appropriate points in time post-implementation at which the decision should be reviewed to ensure any use of FRT continues to comply with privacy laws, regulatory guidance and consumer expectations.
