Legal Matters

Artificial intelligence, social media, and the law

BY NANA ADJEI-POKU

Earlier this year, you may have heard that numerous police services admitted their officers had been using the services of a company called Clearview AI, unbeknownst to the chiefs of those police services.

Well, let me provide a brief summary of the company's history, the services it has "quietly" been offering and the potential impact on our individual privacy rights as citizens.

In 2017, business partners Hoan Ton-That and Richard Schwartz launched a facial recognition program called Smartcheckr, though they weren't yet sure who their target market would be. Shortly after, they renamed it Clearview AI, and this is the program they marketed to law enforcement agencies in the United States.

One of the first U.S. law enforcement agencies to use the program was the Indiana State Police, which solved one of its cases within 20 minutes of adopting it. The case involved a shooting that was recorded on a witness's cell phone. After a quick upload of the video to the program, the software matched the shooter's face to a photo posted elsewhere on social media and, BOOM, the police had a match; the perpetrator was arrested and charged.

Clearview AI currently has a database of approximately three billion images scraped from Facebook, YouTube, Instagram, Twitter, Google and countless other sites. The program allows users to search an image and returns links with details such as where the photo appeared, where the subject resides and whom the subject is in a relationship with, among other very personal details.

Clearview AI's algorithms do not require perfect photos of subjects staring straight into the camera. An individual can be wearing glasses, or have something partially obstructing their face, and can still be searched and potentially matched.

Although the program appears to return a successful match about 75 per cent of the time, it has a flaw: its results can produce false matches, particularly for visible minorities. No detailed explanation has been offered as to why this happens, but I suspect it is something we will learn sooner or later.

Recently, Facebook, Google and YouTube have sent cease-and-desist letters to Clearview AI, claiming it did not obtain the consent of the users whose photos it collected and searched.

In February 2020, the Toronto Police Service, the OPP, the RCMP, the Ottawa Police, the Durham Police and the London Police Service all admitted to using this app. What was surprising was that most of the chiefs of these police services were not aware it was being used. Former Ontario privacy commissioner Ann Cavoukian stated that the London Police Service "actively concealed information and denied that its officers had used Clearview AI." CBC News had filed a Freedom of Information request back in August 2019 to confirm that the London Police Service had been using the app. In February 2020, the service responded that there were no records. Then, eight days later, it came back with a response indicating that some of its officers "may have" used the program. Why the secrecy?

On February 14, 2020, Privacy Commissioner Brian Beamish released a statement advising that his office be consulted if a company or law enforcement agency shows interest in using the software. I expect there will be legal ramifications if it is found that images that were not publicly available were collected and stored in the database, which would be a direct violation of Canada's privacy laws.

As a community we have to question:

  • How were these various police services able to use this "controversial and unregulated surveillance tool," as critics are calling it, without their respective chiefs' knowledge?
  • How long have they actually been using this program?
  • Where is the oversight in our police services if things like this are happening under the radar?
  • If this program had not been exposed, would we have been made aware and if so, when?
  • Why did the London Police Service deny and then, eight days later, admit that it had used the app?

There are so many questions surrounding the secrecy and use of this program. It is my hope that the Privacy Commissioner's update will provide us with more information. It will be interesting to see where this leads now that it is in the public eye. You may want to check the privacy settings on your social media accounts. But then again, it may be too late: you may already be one of the three billion images in the database.

1 Comment

  1. Ken Heath

    March 31, 2020 at 8:55 am

    Very interesting read. This facial recognition has been used for quite some time now…no? I have clients who have had multiple driver's licences in the system, only to have them merged into the one legitimate licence. Their outstanding fines were accumulated under the legitimate licence. One client had three…and all were merged.
