I AM A SCHOLAR and researcher at the University of California, Berkeley, working at the intersection of technology, law and discrimination.
I recently performed an empirical experiment involving commercial artificial intelligence (A.I.) image generation tools. In structured testing across 25 widely used paid A.I. image-generation and headshot platforms, I found a strikingly consistent pattern.
When photographs of hijab-wearing women were uploaded, 22 of the 25 platforms removed the hijab entirely and replaced it with A.I.-generated hair.
The remaining three produced inconsistent results, sometimes retaining a distorted or partial version of the head covering. The pattern appeared across multiple services, suggesting a systemic design issue rather than an isolated glitch.
The harms caused by A.I. systems are already well documented, from Amazon's hiring algorithm discriminating against women to Grok generating harmful images that triggered international criticism.
But one dimension has received far less attention: the erasure of religious identity.
In these tests, the hijab was not mis-rendered or inaccurately drawn. It was removed altogether.
These findings raise concerns that go beyond technical errors. The users did not request the removal of the religious head covering, and during image processing there was no option to retain the hijab or to consent to its removal.
The alteration occurred automatically.
The consistency of the outputs also suggests possible exclusion or underrepresentation in the training data used to build these systems.
For Muslim women who wear it, the hijab is a matter of dignity and religious identity, not simply a stylistic feature.
When identity markers are silently erased, the issue is not only bias but also structural exclusion. When systems trained on limited data are deployed worldwide, the consequences extend far beyond a single image.
What obligations do A.I. companies have when their products systematically reshape how certain groups appear?
And what remedies are available when these systems are built in one jurisdiction and used worldwide?
The World Students Society thanks Mahwish Moazzam, Berkeley, USA.