US Facial Recognition Firm Ordered to Stop Processing UK and Australian Data and Pay Fine Over Privacy Law Violations

ICO and OAIC Find ‘Serious Breaches’ of Privacy Law

On Nov. 29, 2021, the U.K. Information Commissioner’s Office (ICO) announced a provisional intent to fine Clearview AI over £17 million, alleging several privacy violations related to the company’s use of “scraped” data and individuals’ biometrics. More significantly, the provisional order would require the company to stop processing the personal data of people in the U.K. and to delete the data it has collected from U.K. individuals. The ICO’s notice follows a similar announcement made earlier in the month by Australia’s Information Commissioner, ordering Clearview to cease collecting facial images and biometric templates from individuals in Australia and to destroy the images and templates already collected from Australians. We provide some key takeaways for companies building and testing facial recognition and artificial intelligence tools.

In announcing the resolution of a joint investigation with the Office of the Australian Information Commissioner (OAIC), the ICO alleged several privacy violations, including:

  • Failing to process personal data fairly and in a way that people in the U.K. would expect.
  • Failing to implement a process to ensure data is not retained indefinitely.
  • Failing to rely on an appropriate legal basis.
  • Failing to treat biometric data with the sensitivity required of “special categories” data under the EU’s General Data Protection Regulation (GDPR)/U.K. GDPR.
  • Failing to provide appropriate notice.
  • Asking for additional information—in particular photos—from individuals wishing to exercise their rights, which the ICO argues could deter individuals from exercising their rights.

The ICO took issue with Clearview’s collection of images scraped from social media without the knowledge of the individuals involved. Clearview’s enterprise and law enforcement customers can use an image to search against the company’s purported dataset of over 10 billion individuals. Notably, Clearview has stopped offering services to law enforcement and private organizations in the U.K. However, the ICO pointed out that Clearview continues to process personal data regarding U.K. individuals even though the company no longer operates in that market.

In a separate announcement published in early November, the OAIC found that Clearview had violated Australian privacy law by:

  • Collecting sensitive personal information without consent.
  • Collecting the information unfairly.
  • Failing to take reasonable steps to notify individuals of the personal information collected.
  • Failing to ensure the accuracy of data disclosed.
  • Failing to implement policies and procedures to ensure compliance with Australian privacy law.

Clearview AI has faced legal challenges in other countries as well, including in the U.S. In particular, a Vermont state court recently rejected Clearview’s motion to dismiss in a lawsuit brought by the Vermont Attorney General under the state’s consumer protection law alleging Clearview processes photographs collected illegally through “screen scraping.” An Illinois state court also rejected Clearview’s motion to dismiss in litigation filed by the ACLU alleging violations of the Illinois Biometric Information Privacy Act (BIPA). Additionally, Clearview faces federal multidistrict litigation consolidated in the Northern District of Illinois for alleged BIPA violations.

Companies that create large image databases and process biometric data face heightened legal and regulatory requirements in the U.S. and abroad. Most general privacy statutes (such as the GDPR and the U.K. GDPR and, beginning in 2023, the Virginia Consumer Data Protection Act and the Colorado Privacy Act) and biometric privacy laws (such as BIPA, the Texas Capture or Use of Biometric Identifier Act (CUBI), and Washington’s biometric privacy law) require explicit consent before biometric data may be processed. In most cases, notice must be provided prior to collection for the consent to be valid. Many statutes also require a more particularized notice before biometric data can be processed. For example, the California Consumer Privacy Act (CCPA) requires companies to state explicitly in their privacy policies that they process “biometric information,” and BIPA requires that the notice include the company’s retention schedule for biometric information. These notice requirements are virtually impossible to satisfy when the data is generated from information scraped from social media networks. Accordingly, just because information is “publicly available” does not mean it is fair game to process for any purpose.

These litigation and regulatory enforcement actions in the U.S. and abroad demonstrate that companies should consider data privacy issues from the point of collection and throughout the data life cycle, including any collection performed by third parties on their behalf. An individual’s reasonable expectations should inform whether personal data may be processed for a particular purpose. Images posted publicly on social media platforms may not be characterized as biometric data on their own, but as BIPA litigation to date has shown, the use of such images to extract biometric templates for facial recognition purposes will draw scrutiny. This is particularly true when the processing could significantly affect an individual’s rights or freedoms (for example, when biometric data is used for law enforcement purposes); under such circumstances, data protection authorities and regulators may examine whether data collection and processing align with individuals’ expectations. Companies building data lakes, especially from data collected from individuals with whom the business has no direct relationship, should review their data-in and data-out processes to ensure they are aligned with the company’s values, consumer expectations, and data privacy laws.