What the Diversity in Faces Litigation Means for Biometric Technologies

May 3, 2024

In 2020, Illinois residents whose photos were included in the Diversity in Faces dataset brought a series of lawsuits against multiple technology companies, including IBM, FaceFirst, Microsoft, Amazon, and Google, alleging violations of Illinois’ Biometric Information Privacy Act (BIPA).[1] In the years since, the cases against IBM and FaceFirst were dismissed by agreement of the parties, while the cases against Microsoft, Amazon, and, most recently, Google were dismissed at summary judgment.

These cases are unique in the landscape of BIPA litigation because, in all instances, the defendants were not alleged to have had direct contact with the plaintiffs. Instead, the plaintiffs alleged that the defendants used a dataset of photos created by IBM (the Diversity in Faces, or DiF, dataset) that allegedly included images made publicly available on the photo-sharing website Flickr. The DiF dataset allegedly applied facial coding schemes to measure various aspects of the facial features of the individuals pictured and was made available to researchers in order to mitigate dataset bias.

Why These Cases Matter

The nature of these allegations sets these cases apart from cases like Monroy v. Shutterfly, Inc., 2017 WL 4099846, or In re Facebook Biometric Info. Priv. Litig., 326 F.R.D. 535, in which the plaintiffs alleged that the defendants had collected biometric data from them. Here, there was no allegation that the plaintiffs used a product created by the defendants, gave data to the defendants, or interacted with the defendants in any way. These cases therefore demonstrate the importance of considering BIPA when developing biometric technologies or performing research, even when direct interaction with Illinois residents is limited.

Extraterritoriality

It is well established that BIPA does not apply extraterritorially to conduct outside of Illinois, and the DiF cases considered whether BIPA’s territorial limits barred the plaintiffs’ claims. The courts uniformly declined to grant the defendants’ motions to dismiss on those grounds but eventually granted motions for summary judgment. At the motion-to-dismiss stage, both the Amazon and Microsoft courts acknowledged that the plaintiffs did not upload any data to the defendant companies, did not directly use the defendants’ products, and did not allege that the defendants had obtained the DiF dataset from Illinois. The courts nonetheless allowed discovery to assess not only typical factors such as the plaintiffs’ residency and the location of the harm, but also “[i]nternet-specific factors, such as where the site or information was accessed, or where the corporation operates the online practice.”

Ultimately, all of the courts to rule on the question found that BIPA did not apply because the relevant events did not occur primarily and substantially in Illinois. To support this finding, the Amazon and Microsoft courts noted that entities other than the defendants were responsible for collecting and generating facial scans from the photographs. Additionally, the Amazon court found no evidence that employees had downloaded, reviewed, or evaluated the DiF dataset in Illinois. Similarly, the Google court stated that the plaintiffs had not alleged any “direct interaction” that would give rise to the alleged BIPA violations. The Microsoft court went further, stating that even if Microsoft’s systems “‘chunked,’ encrypted, and stored the DiF Dataset on a server in Illinois,” any connection between Microsoft’s conduct and Illinois would still have been too attenuated for BIPA to apply.

Unjust Enrichment

The plaintiffs also brought unjust enrichment claims, alleging that the defendants unlawfully acquired the plaintiffs’ biometric information and profited from its dissemination. On summary judgment, the Microsoft and Amazon courts found that there was no unjust enrichment because the companies’ employees did not use the facial annotations in the dataset and did not use the dataset to train or improve their facial recognition technologies. It is worth noting that these decisions relied on highly fact-specific analyses that cited multiple depositions.

In conclusion, a key observation from this line of cases is that the cases that were not dismissed by agreement ended at summary judgment once discovery showed that the defendants’ actions were not connected to Illinois and that the defendants did not use the DiF dataset to improve their technologies. Though this trend may slow the rate at which new BIPA litigation is filed against companies that use biometric data to improve their technologies, companies can still mitigate risk and improve their chances of prevailing on motions to dismiss by closely examining the source of any biometric data they use and evaluating whether consumer consent was obtained.
