
Headline, July 18 2020/ STUDENTS : ''' '' FACIAL RECOGNITION FEARS '' '''






WRONGFULLY ACCUSED BY AN ALGORITHM. A facial recognition tool led to a man's arrest for a crime he didn't commit. The case combines flawed technology with poor police work.

ON A THURSDAY AFTERNOON in January, Robert Julian-Borchak Williams was in his office at an automotive supply company when he got a call from the Detroit Police Department telling him to come to the station to be arrested. He thought at first that it was a prank.

Two officers got out and handcuffed Mr. Williams on his front lawn, in front of his wife and two young daughters. His wife, Melissa, asked where he was being taken. ''Google it,'' she recalls an officer saying.

''When is the last time you went to a Shinola store?'' one of the detectives asked, in Mr. Williams's recollection.

Mr. Williams knew he had not committed the crime in question. What he could not have known is that his case may be the first known account of an American being wrongfully arrested based on a flawed match from a facial recognition algorithm, according to experts on technology and law.

A FAULTY SYSTEM
A nationwide debate is raging about racism in law enforcement. Across the United States, millions are protesting not just the actions of individual officers, but also the bias in the systems used to surveil communities and identify people for prosecution.

Recent studies by M.I.T. and the National Institute of Standards and Technology, or NIST, have found that while the technology works relatively well on white men, the results are less accurate for other demographics, in part because of a lack of diversity in the images used to develop the underlying databases.

Then, in June, Amazon, Microsoft and IBM announced they would stop or pause their facial recognition offerings for law enforcement. The gestures were largely symbolic, given that the companies are not big players in the industry.

The technology police departments use is supplied by companies that aren't household names, such as Vigilant Solutions, Cognitec, NEC, Rank One Computing and Clearview AI.

Clare Garvie, a lawyer at Georgetown University's Center on Privacy and Technology, has written about problems with the government's use of facial recognition. She argues that low-quality search images - such as a still image from a grainy surveillance video - should be banned, and that the systems currently in use should be tested rigorously for accuracy and bias.

''There are mediocre algorithms and there are good ones, and law enforcement should only buy the good ones,'' Ms. Garvie said.

About Mr. Williams's experience in Michigan, she added : ''I strongly suspect that this is not the first case to misidentify someone to arrest for a crime they didn't commit. This is the first time we know about it.''

In Michigan, the DataWorks software used by the state police incorporates components developed by the Japanese tech giant NEC and by Rank One Computing, based in Colorado, according to Todd Pastorini, a general manager at DataWorks, and a state police spokeswoman.

''We've tested a lot of garbage out there,'' Mr. Pastorini said. These checks, he added, are not ''scientific'' - DataWorks does not formally measure the systems' accuracy or bias.

In 2019, algorithms from both companies were included in a federal study of over 100 facial recognition systems that found they were biased, falsely identifying African-American and Asian faces 10 to 100 times more often than Caucasian faces.

John Wise, a spokesman for NEC, said : ''A match using facial recognition alone is not a means for positive identification.''

Mr. Williams's lawyer, Victoria Burton-Harris, said that her client is ''lucky,'' despite what he went through.

''He is alive,'' Ms. Burton-Harris said. ''He is a very large man. My experience has been, as a defense attorney, when officers interact with very large men, very large black men, they immediately act out of fear. They don't know how to deescalate a situation.''

''IT WAS HUMILIATING'' : Mr. Williams and his wife have not talked to their neighbors about what happened. They wonder whether they need to put their daughters into therapy. Mr. Williams's boss advised him not to tell anyone at work.

''My mother didn't know about it. It's not something I'm proud of,'' Mr. Williams said. ''It's humiliating.''

He has since figured out what he was doing the evening the shoplifting occurred. He was driving home from work, and had posted a video to his private Instagram because a song he loved came on:

1983's ''We Are One'' by Maze and Frankie Beverly. The lyrics go :
I can't understand
Why we treat each other in this way
Taking up time
With the silly, silly games we play.

He had an alibi, had the Detroit police checked for one. Facial recognition is supposed to be only a clue in a case, not a smoking gun, technology providers say.

The Honor and Serving of the Latest Global Operational Research on Technology, Errors, and Sufferings, continues. The World Students Society thanks author Kashmir Hill and Aaron Krolik.

With respectful dedication to the Students, Professors and Teachers of the World. See Ya all prepare and register for Great Global Elections on The World Students Society : wssciw.blogspot.com and Twitter - !E-WOW! - The Ecosystem 2011 

''' Lives - Lever  '''

Good Night and God Bless

SAM Daily Times - the Voice of the Voiceless
