- Lawsuit was filed in Illinois on Thursday by the ACLU and Chicago-based allies
- Clearview AI accused of 'extraordinary and unprecedented violation' of privacy
- Clearview offers facial recognition technology with a 3-billion-picture database
- The Delaware-registered company says its tool is designed to help solve crimes
- They insist access limited to law enforcement and 'select security professionals'
- A New York Times exposé of the company in January fueled rising concern
- Clearview's lawyer Tor Ekeland described ACLU's suit as 'absurd'
Facial recognition firm Clearview AI has been sued by the American Civil Liberties Union over accusations of 'an extraordinary and unprecedented violation' of privacy.
The ACLU filed the case in Illinois, with the backing of a consortium of Chicago-based rights groups.
Illinois was the first state in the U.S. to regulate the collection of biometric data, with the introduction in 2008 of the Biometric Information Privacy Act (BIPA).
BIPA requires companies that collect, capture, or obtain an Illinois resident's biometric identifier — such as a fingerprint, faceprint, or iris scan — to first notify that individual and obtain their written consent.
The ACLU said its lawsuit was 'the first to force any face recognition surveillance company to answer directly to groups representing survivors of domestic violence and sexual assault, undocumented immigrants, and other vulnerable communities uniquely harmed by face recognition surveillance.'
In the court documents, filed in Cook County, Illinois, on Thursday, the ACLU claims that the facial recognition technology provided by Clearview puts vulnerable people at risk.
'Given the immutability of our biometric information and the difficulty of completely hiding our faces in public, face recognition poses severe risks to our security and privacy,' they claim.
'The capture and storage of faceprints leaves people vulnerable to data breaches and identity theft.
'It can also lead to unwanted tracking and invasive surveillance by making it possible to instantaneously identify everyone at a protest or political rally, a house of worship, a domestic violence shelter, an Alcoholics Anonymous meeting, and more.
'And, because the common link is an individual's face, a faceprint can also be used to aggregate countless additional facts about them, gathered from social media and professional profiles, photos posted by others, and government IDs.'
Nathan Freed Wessler, senior staff attorney with the ACLU's Speech, Privacy, and Technology Project, described Clearview's technology as 'menacing'.
He said it could be used to track people at political rallies, protests, and religious gatherings, among other uses.
The coalition is asking a judge to order Clearview to delete the images, and to notify 'all persons' in writing and obtain their written consent before capturing their biometric identifiers.
Tor Ekeland, an attorney for the company, described the lawsuit as 'absurd' and a violation of the First Amendment, which protects freedom of speech, religion, assembly and protest.
'Clearview AI is a search engine that uses only publicly available images accessible on the internet,' he said.
'It is absurd that the ACLU wants to censor which search engines people can use to access public information on the internet. The First Amendment forbids this.'
Clearview AI was founded in 2016 by Hoan Ton-That, a 31-year-old Australian tech entrepreneur and one-time model.
Ton-That co-founded the company with Richard Schwartz, an aide to Rudy Giuliani when he was mayor of New York.
It is backed financially by Peter Thiel, a venture capitalist who co-founded PayPal and was an early investor in Facebook.
Ton-That describes his company as 'creating the next generation of image search technology', and in January the New York Times reported that Clearview AI had assembled a database of three billion images of Americans, culled from social media sites.
The paper published an exposé of the company, in which Ton-That described how he had come up with a 'state-of-the-art neural net' to convert all the images into mathematical formulas, or vectors, based on facial geometry - taking measurements such as how far apart a person's eyes are.
Clearview created a directory of the images, so that when a user uploads a photo of a face into Clearview's system, it converts the face into a vector.
The app then shows all the scraped photos stored in that vector's 'neighborhood', along with the links to the sites from which those images came.
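The matching step the Times describes - faces reduced to vectors, then compared by proximity - can be sketched in a few lines. This is an illustrative nearest-neighbour search only, not Clearview's actual code: the three-dimensional vectors, example URLs and similarity threshold are all invented for the example (real faceprints run to hundreds of dimensions).

```python
import math

def cosine_similarity(a, b):
    """Similarity between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical index: faceprint vectors keyed by the source page they
# were scraped from (all data here is invented for illustration).
index = {
    "https://example.com/photo1": [0.12, 0.85, 0.40],
    "https://example.com/photo2": [0.90, 0.10, 0.30],
    "https://example.com/photo3": [0.11, 0.80, 0.45],
}

def neighbours(query, index, threshold=0.99):
    """Return (url, similarity) pairs in the query vector's 'neighborhood',
    most similar first."""
    hits = [(url, cosine_similarity(query, vec)) for url, vec in index.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: -h[1])

# A query faceprint close to photos 1 and 3, but not photo 2.
matches = neighbours([0.12, 0.84, 0.41], index)
```

Here `matches` would contain the links for photo1 and photo3, mirroring how the app reports both the stored photos and the sites they came from.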
Amid the backlash from the January article, Clearview insisted that it had created a valuable policing tool, which it said was not available to the public.
'Clearview exists to help law enforcement agencies solve the toughest cases, and our technology comes with strict guidelines and safeguards to ensure investigators use it for its intended purpose only,' the company said.
Clearview insisted the app had 'built-in safeguards to ensure these trained professionals only use it for its intended purpose'.
However, in February BuzzFeed reported that Clearview's technology was being used by private companies including Macy's, Walmart, Best Buy and the NBA, and even a sovereign wealth fund in the United Arab Emirates.
The New Jersey attorney general has banned state law enforcement from using Clearview's system, and in March the Vermont attorney general sued.
A hearing has been set for the Illinois case for September.