
Headline Feb 22, 2018/ ''' ALGORITHMIC ACCOUNTABILITY '''


''' ALGORITHMIC ACCOUNTABILITY '''




SO, IS ARTIFICIAL INTELLIGENCE PICKING up some very *real biases* from the real world? To understand that -

Ms. Buolamwini, the researcher, joined the M.I.T. Media Lab, and soon ran into the missing-face problem again. Only when she put on a white mask did the software recognize her as a face.

So she turned her attention to fighting the bias built into digital technology. Now 28 and a doctoral student, after studying as a Rhodes scholar and a Fulbright fellow, she is an advocate in the new field of:

*''Algorithmic Accountability,'' which seeks to make automated decisions more transparent, explainable and fair*.

Her short TED Talk on coded bias has been viewed more than 940,000 times, and she founded the Algorithmic Justice League, a project to raise awareness of the issue.

In her newly published paper, which will be presented at a conference this month, Ms. Buolamwini studied the performance of three leading face recognition systems - by Microsoft, IBM and Megvii of China - by classifying how well they could guess the gender of people with different skin tones.

These companies were selected because they offered gender classification features in their facial analysis software - and their code was publicly available for testing.

She found them all wanting.

To test the commercial systems, Ms. Buolamwini built a data set of 1,270 faces, using faces of lawmakers from countries with a high percentage of women in office.

The sources included three African nations with predominantly dark-skinned populations, and three Nordic countries with mainly light-skinned residents.

The African and Nordic faces were scored according to a six-point labeling system used by dermatologists to classify skin types.

The medical classifications were determined to be more objective and precise than race.

Then each company's software was tested on the curated data, crafted for gender balance and a range of skin tones.

The results varied somewhat. Microsoft's error rate for darker-skinned women was 21 percent, while IBM's and Megvii's rates were nearly 35 percent. They had error rates of 1 percent for light-skinned males.
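To make the audit methodology concrete, here is a minimal Python sketch of how such per-group error rates can be computed. It is not Ms. Buolamwini's actual code: the classify_gender function is a hypothetical stand-in for a commercial facial-analysis API, and the field names are assumptions.

    from collections import defaultdict

    def classify_gender(image_path):
        # Hypothetical stand-in for a call to a commercial
        # facial-analysis API; returns its predicted gender label.
        raise NotImplementedError

    def error_rates_by_group(dataset):
        # dataset: list of dicts with "image", "gender" (ground truth)
        # and "skin_type" (1-6 on the dermatologists' six-point scale).
        # Types 4-6 are grouped as darker, 1-3 as lighter.
        errors, totals = defaultdict(int), defaultdict(int)
        for sample in dataset:
            shade = "darker" if sample["skin_type"] >= 4 else "lighter"
            group = (shade, sample["gender"])
            totals[group] += 1
            if classify_gender(sample["image"]) != sample["gender"]:
                errors[group] += 1
        # Error rate = misclassified faces / total faces in each group.
        return {group: errors[group] / totals[group] for group in totals}

Run over a balanced set like the 1,270-face one described above, this yields one error rate per (skin tone, gender) group for each vendor's system, which is how figures such as 21 percent for darker-skinned women emerge.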

Ms. Buolamwini shared the research results with each of the companies. IBM said in a statement to her that the company had already improved its facial analysis software and was ''deeply committed'' to ''unbiased'' and ''transparent'' services.

This month, the company said, it will roll out an improved service with a nearly 10-fold increase in accuracy on darker-skinned women.

Microsoft said that it had ''already taken steps to improve the accuracy of our facial recognition technology'' and that it was investing in research to ''recognize, understand and remove bias.''

Ms. Buolamwini's co-author on her paper is Timnit Gebru, who described her role as an adviser. Ms. Gebru is a scientist at Microsoft Research, working on its Fairness, Accountability, Transparency and Ethics in A.I. group.

Megvii, whose Face++ software is widely used for identification in online payment and ride-sharing services in China, did not reply to several requests for comment, Ms. Buolamwini said.

Ms. Buolamwini is releasing her data set for others to use and build upon.

Ms. Buolamwini is taking further steps in the technical community and beyond. She is working with the Institute of Electrical and Electronics Engineers, a large professional organization in computing, to set up a group to create standards for accountability and transparency in facial analysis software.

She meets regularly with other academics, public policy groups and philanthropies that are concerned about the impact of artificial intelligence.

Darren Walker, president of the Ford Foundation, said that the new technology could be a ''platform for opportunity,'' but that it would not happen if it replicated and amplified bias and discrimination.

''There is a battle going on for fairness, inclusion and justice in the digital world,'' Mr. Walker said.

Part of the challenge, scientists say, is that there is so little diversity within the A.I. community.

''We'd have a lot more introspection and accountability in the field of A.I. if we had more people like Joy,'' said Cathy O'Neil, a data scientist and author of ''Weapons of Math Destruction.''

Technology, Ms. Buolamwini said, should be more attuned to the people who use it and the people it's used on.

''You can't have ethical A.I. that's not inclusive,'' she said. ''And whoever is creating the technology is setting the standards.''

With respectful dedication to the Future, Students, Professors and Teachers of the world. See Ya all on !WOW! - the World Students Society and Twitter !WOW! - the Ecosystem 2011:

''' Implicit Biases '''

Good Night and God Bless

SAM Daily Times - the Voice of the Voiceless
