After more than two years of litigation, Amazon and Microsoft have won summary judgment in two class actions alleging violations of the Illinois Biometric Information Privacy Act (BIPA): Vance v. Amazon.com, Inc., case number C20-1084JLR, and Vance v. Microsoft Corp., case number C20-1082JLR. Both decisions were issued on October 17, 2022 by Judge James L. Robart of the United States District Court for the Western District of Washington.
The plaintiffs in both cases, Steve Vance and Tim Janecyk, uploaded photos to the photo-sharing website Flickr between 2004 and 2014, some of which they allege were included in the “Diversity in Faces” dataset created by researchers at IBM. As its name suggests, IBM developed the dataset to help researchers reduce racial and other biases in facial recognition technology. The dataset included both the photographs and a “facial coding scheme” created and applied by IBM’s researchers.
In keeping with that purpose, IBM made the Diversity in Faces dataset freely available to others, provided they agreed to use it only for non-commercial research purposes and not to identify the individuals in the dataset’s photographs.
In granting summary judgment, the court found that researchers working for Amazon and Microsoft had accessed the Diversity in Faces dataset but determined that it was unsuitable for their respective research goals for a variety of reasons, including, for example, that the photographs were not shot from the right angles. Nonetheless, plaintiffs Vance and Janecyk argued that, because they were Illinois residents, Amazon and Microsoft were required to provide BIPA-compliant notice and obtain BIPA-compliant consent before obtaining their biometric data through the Diversity in Faces dataset. Plaintiffs also alleged that Amazon and Microsoft unlawfully profited from their biometric data in violation of BIPA Section 15(c). (The profit claim against Microsoft was dismissed at the pleadings stage. See Vance v. Microsoft Corp., 534 F. Supp. 3d 1301, 1309 (W.D. Wash. 2021).) Finally, plaintiffs asserted unjust enrichment claims in both cases.
In May 2022, after more than two years of active litigation, Amazon and Microsoft moved for summary judgment, arguing that the BIPA claims should be dismissed because (1) BIPA does not apply extraterritorially and all relevant conduct occurred outside Illinois; (2) applying BIPA as plaintiffs sought to do would violate the dormant Commerce Clause of the U.S. Constitution; and (3) BIPA should not be construed to require the (impossible) task of providing notice to, and obtaining consent from, every unidentified individual in the Diversity in Faces dataset. Amazon and Microsoft also argued that (4) plaintiffs had failed to show that Amazon or Microsoft unjustly retained a benefit to plaintiffs’ detriment, and that the unjust enrichment claims should therefore be dismissed as well. Judge Robart granted the motions and dismissed all claims against both Amazon and Microsoft based on the first and fourth arguments.
In dismissing the BIPA claims on extraterritoriality grounds, Judge Robart agreed (as has every other court to consider the issue) that BIPA does not apply extraterritorially. Rather, the statute applies only if the relevant conduct “occurred primarily and substantially in Illinois.” Plaintiffs argued that this test was satisfied because they were Illinois residents, were injured in Illinois, and the photographs were taken in Illinois and uploaded to the Internet in Illinois. The court, however, determined that these allegations failed to identify any conduct by Amazon or Microsoft that occurred primarily and substantially in Illinois. The court also noted that, although plaintiffs’ photos were originally uploaded in Illinois, they were uploaded to the third-party Flickr platform, and the dataset was developed by IBM’s researchers, also third parties. Amazon and Microsoft may have received the dataset at a later date, but plaintiffs could not identify evidence that any of the defendants’ personnel downloaded, reviewed, or evaluated the dataset in Illinois. Rather, the evidence showed that any downloading, review, or evaluation took place in the locations where each company’s respective employees or contractors are based (Washington and Georgia for Amazon; Washington and New York for Microsoft).
In Microsoft’s case, discovery also revealed that “encrypted chunks” of bits of data from the Diversity in Faces dataset may have been stored on a server in Chicago, Illinois, as well as on servers in Texas, Washington, and/or California. Judge Robart held, however, that this possibility was immaterial, because the relevant section of BIPA regulates only the collection of data, not the post-collection encrypted storage of data in an Illinois data center. Given that plaintiffs’ only remaining claim alleged a violation of BIPA Section 15(b), which requires notice and consent before obtaining biometric data, and did not allege that Microsoft unlawfully stored or profited from biometric data in Illinois, the possibility that Microsoft stored data in Illinois was irrelevant.
Judge Robart also held that, because neither Amazon nor Microsoft used the Diversity in Faces dataset in its business, neither could have unjustly retained a benefit to plaintiffs’ detriment, and he therefore dismissed plaintiffs’ unjust enrichment claims.