The road to hell is paved with good intentions. While the proverb may be a stretch for now, the latest lawsuit by the American Civil Liberties Union of Illinois (ACLU) against Clearview AI certainly shows that good intentions, when acted upon, may have unintended consequences. Technology utilized in the name of public protection—whether from global pandemics or criminal activity—can have disastrous effects when it comes to civil liberties and privacy.
The ACLU filed a lawsuit against Clearview AI based on violations of Illinois residents’ privacy rights. Clearview AI is a technology company that scrapes images from the internet, primarily from social media platforms, to create a searchable database of individuals’ face prints. The company claimed that it sold access to its searchable database to hundreds of police departments and federal agencies in order to protect children and aid victims of crimes. However, a recent data breach showed that Clearview AI also sold or provided access to its searchable database to the retail chains Walmart and Macy’s, the NBA, Equinox, and many other non-law enforcement entities.
The ACLU strategically filed its lawsuit in Illinois because Illinois is one of the few states with a biometric privacy law. As Taft’s Privacy and Data Security Insight previously discussed, the Illinois Supreme Court in January 2019 cleared the way for plaintiffs to seek liquidated damages and injunctive relief based upon technical violations of the Illinois Biometric Information Privacy Act (BIPA). Under BIPA, a party who collects “biometric information” must comply with specific statutory requirements. The Illinois Supreme Court held that an individual is an “aggrieved” party when the injury alleged is a violation of a private entity’s requirement to provide specific notice and obtain informed consent prior to collecting biometric information. This holding is very beneficial for the ACLU’s lawsuit against Clearview AI.
The ACLU’s Complaint requests, among other things, that Clearview AI: 1) destroy all biometric identifiers in its possession; 2) inform in writing and obtain written consent from all persons before capturing any biometric identifiers; and 3) establish a publicly available written policy setting out Clearview AI’s retention schedule and guidelines for permanently destroying biometric identifiers. Moreover, statutory damages under BIPA may be $1,000 per negligent violation or $5,000 per intentional or reckless violation. Given that Clearview AI has allegedly collected over 3 billion biometric identifiers and face prints, BIPA opens the floodgates for significant damages in the ACLU’s class action.
Clearview AI has already pushed back against numerous cease-and-desist letters from technology giants including Twitter, Google, YouTube, Venmo, and LinkedIn, and it is likely to continue arguing that it has a right to access and index publicly available information when defending against the ACLU’s lawsuit. The ACLU’s class action will be an important case to follow as Congress and state legislatures continue to draft laws designed to protect individual privacy interests.
In the battle to balance the public good with privacy, the ACLU’s lawsuit is the latest challenge to the use of facial recognition technology. On June 8, 2020, IBM announced that it would withdraw from the facial recognition market due to concerns about racial profiling and violations of human rights and freedoms. Two days later, Amazon issued a one-year moratorium on police use of its Rekognition facial recognition technology, citing the lack of regulations governing the ethical use of such technology. These companies appear to recognize that law and regulation are slow to catch up to rapid advances in technology, and they are waiting to see how such technology can be used legally and ethically. Indeed, while the technology has proven to have many benefits, its unintended consequences are ringing the bell of caution both in practice and in law.
Should you have any questions or issues regarding your or your business’s use of individual data and compliance with privacy laws, please contact a member of Taft’s Privacy and Data Security Practice.