“AI, Ain’t I a Woman?”: The Gendered Lens of Facial Recognition Technology

Christie Dougherty

Most facial recognition software cannot correctly identify dark-skinned women. This was one of the most spine-chilling revelations at the Northeastern University Center for Law, Innovation and Creativity’s recent conference, “About Face: The Changing Landscape of Facial Recognition,” which highlighted the social and legal implications of surveillance technology. On Friday, May 10, 2019, MIT researcher Joy Buolamwini gave a moving presentation on a major problem in facial recognition technology: its gendered and racial lenses. AI, Ain’t I A Woman?, Algorithmic Justice League Project, https://www.notflawless.ai/#2 (last visited June 14, 2019). Her research revealed this critical flaw in current software.

In a video less than four minutes long, titled “AI, Ain’t I a Woman,” Buolamwini presented her research on facial recognition technology, set to a powerful spoken-word poem. Inspired by Sojourner Truth’s “Ain’t I a Woman” speech, delivered in 1851 to the Women’s Rights Convention in Akron, Ohio, the video depicts Buolamwini’s findings. Sojourner Truth: Ain’t I A Woman?, National Park Service (last visited June 7, 2019). In her research, Buolamwini ran more than 1,200 images through several large tech companies’ facial recognition software, including IBM Watson, Google, Microsoft, Face++, and Amazon. Such software often cannot correctly identify dark-skinned people because dark complexions are underrepresented in the datasets used to train and test the technology. Buolamwini’s dataset, by contrast, was built to better represent dark-skinned people. Her video shows her running images of famous black women, including Sojourner Truth, Shirley Chisholm, Michelle Obama, and Oprah Winfrey, through the software. The programs identified the women as men and frequently could not determine a gender at all. Sometimes, the software also identified the women as wearing hairpieces.
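For readers curious about the mechanics, an audit of this kind is, at bottom, a simple loop: feed each labeled image to a classifier and tally the outcome by subgroup. The Python sketch below is illustrative only, not Buolamwini’s actual pipeline; classify_gender is a hypothetical stand-in for whichever commercial API is under test, and the benchmark file format is an assumption.

    import csv
    from collections import Counter

    def classify_gender(image_path):
        # Hypothetical stand-in for a commercial face-analysis call.
        # A real audit would invoke the vendor's API here and map its
        # response to "male", "female", or None (no gender returned).
        return None

    def audit(benchmark_csv):
        # Assumed CSV columns: image_path, true_gender, skin_type.
        results = Counter()
        with open(benchmark_csv, newline="") as f:
            for row in csv.DictReader(f):
                predicted = classify_gender(row["image_path"])
                outcome = "correct" if predicted == row["true_gender"] else "error"
                # Tally per intersectional subgroup (skin type x gender),
                # mirroring the Gender Shades breakdown.
                results[(row["skin_type"], row["true_gender"], outcome)] += 1
        return results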

Buolamwini’s spoken word poem asks, “Face by face the answers seem uncertain/ Young and old, proud icons are dismissed/ Can machines ever see my queens as I view them?/ Can machines ever see our grandmothers as we knew them?”

Buolamwini’s research found a 0.8% error rate for light-skinned men across the software she tested, as opposed to a 34.7% error rate for dark-skinned women. Joy Buolamwini & Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, 81 Proceedings of Machine Learning Research 1 (2018). While IBM treated her research as a learning opportunity, Amazon publicly challenged it in a blog post only a day after she published her findings. AI researchers slam Amazon for selling ‘biased’ facial recognition tech to cops, N.Y. Post (Apr. 4, 2019). The blog post reads, “The answer to anxieties over new technology is not to run ‘tests’ inconsistent with how the service is designed to be used, and to amplify the test’s false and misleading conclusions through the news media.” Id. A group of 78 concerned artificial intelligence (“AI”) researchers, including Yoshua Bengio, winner of the Turing Award (the tech industry’s equivalent of the Nobel Prize), came to Buolamwini’s defense and called on Amazon to stop selling its facial recognition software to law enforcement. On Recent Research Auditing Commercial Facial Analysis Technology, Medium (Mar. 25, 2019).
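Those headline figures are straightforward arithmetic over the subgroup tallies: each rate is simply errors divided by total images for that subgroup. A minimal sketch, assuming the Counter produced by the hypothetical audit above:

    def error_rates(results):
        # Per-subgroup error rate = errors / (errors + correct).
        rates = {}
        for skin, gender in {(s, g) for (s, g, _) in results}:
            errors = results[(skin, gender, "error")]
            total = errors + results[(skin, gender, "correct")]
            rates[(skin, gender)] = errors / total if total else 0.0
        return rates

    # On the Gender Shades benchmark, the worst-performing vendor's rates
    # would be roughly 0.347 for ("darker", "female") and 0.008 for
    # ("lighter", "male").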

In recent years, Americans have become more accepting of technological exceptionalism: the idea that new technology should be pushed into the marketplace before its repercussions are understood. Regulation of new technologies is kept limited out of fear of stifling innovation. But this restraint comes at a cost. Buolamwini speaks to it in the last stanza of her poem: “We laugh celebrating the successes/ Of our sisters with Serena smiles/ No label is worthy of our beauty.”