2/20/2018

Headline Feb 21, 2018/ ''' FACIAL *RECOGNITION* FACULTY






FACIAL RECOGNITION TECHNOLOGY is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph.

*When the person in the photo is a white man, the software is right 99 percent of the time.

But the darker the skin, the more errors arise - up to nearly 35 percent for images of darker-skinned women, according to a new study that breaks fresh ground by measuring how the technology works on people of different races and genders.

These disparate results, calculated by Joy Buolamwini, a researcher at the Media Lab of the Massachusetts Institute of Technology, show how some of the biases of the real world can seep into artificial intelligence, the computer systems that inform facial recognition.

In artificial intelligence, data rules. A.I. software is only as smart as the data used to train it. If there are many more white men than black women in the system, it will be worse at identifying the black women.

One widely used facial recognition data set was estimated to be more than 75 percent male and more than 80 percent white, according to another research study.
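The data-imbalance mechanism described above can be shown in miniature. The toy sketch below is purely illustrative - it is not the study's methodology, and all groups, numbers and the ''classifier'' are invented for the example. A trivial model that memorizes the most common label for each feature value, trained on a set that is 90 percent group ''A,'' ends up matching A's pattern exactly and misclassifying every member of the underrepresented group ''B'':

```python
# Illustrative sketch (not the study's actual method): how training-set
# imbalance skews accuracy toward the majority group.
from collections import Counter, defaultdict
import random

random.seed(0)

# Toy data: each example is (group, feature, label). Group B's
# feature-to-label mapping is the opposite of group A's.
def make_example(group):
    feature = random.randint(0, 4)
    label = feature % 2 if group == "A" else (feature + 1) % 2
    return group, feature, label

# Imbalanced training set: 900 examples from A, only 100 from B.
train = [make_example("A") for _ in range(900)] + \
        [make_example("B") for _ in range(100)]

# "Training": for each feature value, take the majority label overall.
votes = defaultdict(Counter)
for _, feature, label in train:
    votes[feature][label] += 1
model = {f: c.most_common(1)[0][0] for f, c in votes.items()}

# Group A outvotes group B on every feature value, so the model learns
# A's mapping - and therefore gets every B example wrong.
def accuracy(group, n=200):
    examples = [make_example(group) for _ in range(n)]
    return sum(model[f] == y for _, f, y in examples) / n

print(accuracy("A"))  # 1.0
print(accuracy("B"))  # 0.0
```

The overall accuracy of this toy model still looks good - 90 percent on a test set drawn like the training set - which is how such gaps can hide behind a single headline number.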

The new study also raises broader questions of fairness and accountability in artificial intelligence at a time when investment in and adoption of the technology is racing ahead.

Today, facial recognition software is being deployed by companies in various ways, including to help target product pitches based on social media profile pictures.

But companies are also experimenting with face identification and other A.I. technology as an ingredient in automated decisions with higher stakes like hiring and lending.

Researchers at Georgetown Law School in Washington estimated that 117 million American adults are in face recognition networks used by law enforcement, and that African Americans were most likely to be singled out, because they were disproportionately represented in mug-shot databases.

''This is the right time to be addressing how these A.I. systems work and where they fail - to make them socially accountable,'' said Suresh Venkatasubramanian, a professor of computer science at the University of Utah.

Until now, there was only anecdotal evidence of computer vision miscues, occasionally in ways that suggested discrimination.

In 2015, for example, Google had to apologize after its image recognition photo app initially labeled African Americans as ''gorillas''.

Sorelle Friedler, a computer scientist at Haverford College in Pennsylvania and a reviewing editor on Ms. Buolamwini's research paper, said she and other experts had long suspected that facial recognition software performed differently on different populations.

''But this is the first work I'm aware of that shows that emphatically,'' Ms. Friedler said.

Ms. Buolamwini, a young African-American computer scientist, experienced the bias of facial recognition firsthand. When she was an undergraduate at the Georgia Institute of Technology, programs would work well on her white friends, she said, but would not recognize her face at all.

She figured it was a flaw that would surely be fixed before long.

But a few years later, after joining the M.I.T. Media Lab, she ran into the missing-face problem again. Only when she put on a white mask did the software recognize her as a face.

By then, face recognition software was increasingly moving out of the lab and into the mainstream.

''O.K., then this is serious,'' she recalled deciding then. ''Time to do something.''

So she turned her attention to fighting the bias built into digital technology. Now 28, and a doctoral student after studying as a Rhodes scholar and a Fulbright fellow, she is an advocate in the new field of ''algorithmic accountability,'' which seeks to make automated decisions more transparent, explainable and fair.

The Honor and Serving of the latest ''Operational Research'' on Science, Technology and Development continues.

With respectful dedication to the Scientists, Students, Professors and Teachers of the world. See Ya all on !WOW! -The World Students Society and Twitter- !E-WOW! -the Ecosystem 2011:


''' Racism - Classism '''

Good Night and God Bless

SAM Daily Times - the Voice of the Voiceless
