New York-Based Facial-Recognition Technology Ruled Illegal in Canada
Clearview AI’s controversial practice of scraping the Internet for billions of images of people has been ruled illegal in Canada. A joint investigation by privacy authorities, led by the Office of the Privacy Commissioner of Canada, concluded that the practice amounts to mass surveillance and infringes on the privacy rights of Canadians. The investigation also found that Clearview had collected highly sensitive biometric information without people’s knowledge or consent, and then used and disclosed this personal information for purposes that would not have been appropriate even if people had consented.
Clearview, which was founded in 2017 by Hoan Ton-That, an Australian entrepreneur, collects what are known as “faceprints,” unique biometric identifiers similar to someone’s fingerprint or DNA profile, from photos people post online. To date, Clearview has amassed a database of billions of faceprints, which it sells to its clients. It also provides access to a smartphone app that allows clients to upload a photo of an unknown person and instantly receive a set of matching photos.
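Conceptually, a faceprint can be thought of as a numerical representation (an embedding vector) computed from a face image, and matching a probe photo against a database amounts to a nearest-neighbor search over those vectors. The Python sketch below is a minimal, hypothetical illustration of that idea using cosine similarity; the random vectors, function names, and similarity threshold are assumptions for demonstration only and do not describe Clearview’s actual system.

```python
import numpy as np

# Hypothetical illustration: a "faceprint" modeled as a fixed-length embedding
# vector; matching is a nearest-neighbor search ranked by cosine similarity.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(query: np.ndarray, database: dict[str, np.ndarray],
                 threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return (photo_id, similarity) pairs above the threshold, best first."""
    scores = [(photo_id, cosine_similarity(query, emb))
              for photo_id, emb in database.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)

# Toy example with random vectors standing in for real face embeddings.
rng = np.random.default_rng(0)
database = {f"photo_{i}": rng.normal(size=128) for i in range(1000)}
query = database["photo_42"] + rng.normal(scale=0.05, size=128)  # noisy copy
print(find_matches(query, database)[:3])
```

In a real system the embeddings would come from a trained face-recognition model, and a database of billions of faceprints would be searched with an approximate nearest-neighbor index rather than the linear scan shown here.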
One of the main arguments Ton-That has made in his company’s defense in published reports is that the significant benefit of using its technology for law enforcement and national security outweighs individuals’ privacy concerns, and that Clearview is not to blame if law enforcement misuses its technology. Clearview made similar arguments in its defense before the Canadian authorities, contending that no consent was required because the photos were publicly available on websites.
The decision in Canada will likely lend heft to other legal challenges, not only to Clearview’s technology but to facial recognition in general. Last May the American Civil Liberties Union sued Clearview for privacy violations in Illinois, a case that is ongoing. Lawmakers in the United States have even proposed a nationwide ban on facial recognition.