Coinbase Used Facial Recognition Tech From Controversial Startup Clearview AI

Published: March 5th, 2020

Information obtained by US news site BuzzFeed has shown that more than two thousand companies are using or trialling Clearview’s software, with Coinbase featuring prominently on the list.

Critics have raised privacy concerns over the company's technology, which scrapes data from sites like Facebook and uses it without permission.

Given how much emphasis is placed on privacy in blockchain applications, seeing Coinbase’s name on the list came as a surprise for many in the crypto community.

BuzzFeed reviewed Clearview company documents and revealed that more than 2,200 companies, state agencies, and individuals have used or tested Clearview AI's search function. Facebook, Google, YouTube, and Twitter have all taken the company to task over data taken from their platforms, with court action either pending or underway.

Who uses Clearview’s tech?

Clearview's search tool uses data and images pulled from public sources like social media but without end-user permission. The company says its mission is to help law enforcement better investigate crime and track down suspects.

Unsurprisingly, law enforcement agencies and police forces feature prominently on the BuzzFeed list, but many other private organisations have used it, from the NBA to Eventbrite. More than 45 banks have also tested the tool, including Wells Fargo and Bank of America.

Clearview's client list also includes the US Attorney's Office for the Southern District of New York, retail chain Macy's, and now Coinbase.

Coinbase told BuzzFeed that the company had trialled Clearview's technology because it had 'unique requirements related to security and compliance,' but had not used the software with customer data.

In a statement, Coinbase said: ‘We tested Clearview AI to see if it could strengthen our efforts to investigate fraud, protect employees, and defend against physical threats to our offices. We haven't made any long-term decisions about our use of Clearview AI.’

Clearview AI: a brief but controversial history

Before it found itself under the media glare, Clearview AI had mainly operated behind the scenes, with roughly USD 7 million in funding, according to startup investment tracker Pitchbook.

Controversy pushed it into the public eye in January of this year when the New York Times published an investigative piece on Clearview AI's business model – with extensive focus on the privacy risks it posed.

Because of the company’s ‘free trial’ policy, potential clients can conduct test searches of up to 3 billion photos in Clearview’s database. Even if a company hasn’t become a paying customer, individual employees might still be testing the software.

According to BuzzFeed's report, several organisations the news site contacted didn't know that employees were using the facial recognition software.

The New York Police Department initially denied using Clearview, even though logs showed that some 30 NYPD officers had conducted more than 11,000 searches through the software. In Canada, the Ontario Provincial Police (OPP) also denied using the software, only to reverse itself after discovering that several OPP officers in multiple divisions had been testing it.

Raising the ire of social media

Twitter took legal action against Clearview AI in January, saying the company had violated its platform policy. According to Twitter's developer agreement, third parties can’t use information pulled from Twitter content or display, distribute, or share it with any state entity for surveillance.

Google and Facebook have both since issued cease-and-desist orders demanding that Clearview AI stop gathering images from their platforms.

Facial recognition tech has angered privacy advocates, who call it an invasive technology. Other critics have made the point that facial recognition hasn't yet proven accurate – particularly in the law enforcement settings Clearview AI is targeting.

The tech has been shown to struggle with faces that aren't male and Caucasian, making it susceptible to accusations of in-built racial and gender bias. And some facial recognition tools used by police haven’t been reviewed or approved by independent experts.

The technology that powers Clearview's algorithms remains a secret. Clearview's small pool of early investors includes controversial investor Peter Thiel, who sits on the board of Facebook and co-founded Palantir, a data analytics company that is a favourite of law enforcement.

Since the BuzzFeed report, Apple has released a statement saying it too had cut off developer access to the Clearview AI iOS app, saying that it violates the developer agreement.

The company was using a software distribution tool from Apple that enables developers to share their apps within their own companies. Clearview used it as the primary means of distributing its iPhone facial recognition app to its more than 2,200 customers – violating the terms of its Apple developer agreement.

Adding to Clearview’s woes, earlier this month, the company confirmed that its full client list had been stolen by someone with "unauthorised access." The company said it hadn’t been hacked, rather that a "flaw" had provided unauthorised access to its client list – a rationale unlikely to comfort privacy-sensitive crypto enthusiasts or Coinbase customers.

Having law enforcement agencies upload sensitive photos to the servers of a company with a weak record of protecting its data doesn't inspire confidence.

In defence of Clearview AI – sort of

In an update to its original investigative report, the New York Times said that there was, in fact, a strong use case for Clearview’s technology: finding victims of child abuse.

Investigators told the newspaper that Clearview AI has helped them identify victims featured in child abuse videos and photos, leading them to names or locations they might not have been able to find otherwise.

One ex-chief of police told the Times that running images of 21 victims of a single offender through the software helped investigators identify 14 minors.

Following the original Times story, New Jersey barred its police from using Clearview's technology. Canada's privacy agencies are also investigating Clearview to determine whether its technology violates the country's privacy laws.
