
Meet The Scientist Protecting Women Of Color From The Wrong Side of AI

In 2023, computer scientist and artist Dr. Joy Buolamwini was named one of Time’s "100 Most Influential People in AI," and for good reason: the prejudice that’s often baked into this technology has victimized women and people of color.


At 34, computer scientist and poet Dr. Joy Buolamwini has already made her mark as a pioneer in the rapidly developing field of artificial intelligence.

She has advised President Biden and Big Tech on the benefits and dangers of AI, was named one of Time’s "100 Most Influential People in AI," has worked on documentaries about the subject, and recently released a book about her personal journey in the space: “Unmasking AI: My Mission to Protect What Is Human in a World of Machines.”


Her research as an AI scientist came into focus during her time as a graduate student at MIT: addressing the shortcomings of machine learning, the building blocks of AI systems.

At the time, Dr. Buolamwini was working on face detection technology for an art installation she was building. She noticed the software had trouble detecting her face because of her skin color. It wasn’t until she placed a white mask over her face that the program finally started to work properly.

“It was this experience of literally coding in whiteface that made me think: ‘Is there something more here?’” she explained. “I put on my scientist hat and started to conduct experiments that showed there was actually bias, with these systems working better on some faces than others.”
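The article doesn’t name the detector her installation used, but the failure mode she describes (a detector returning no face at all) is easy to picture with any off-the-shelf pipeline. Below is a purely illustrative sketch using OpenCV’s bundled Haar-cascade model; the image path is a placeholder, not a file from the article.

```python
# Purely illustrative sketch: the article does not say which detector the
# installation used. OpenCV's stock Haar cascade is one common off-the-shelf
# face detector; "selfie.jpg" is a placeholder path.
import cv2

# Load OpenCV's bundled frontal-face Haar-cascade model.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("selfie.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# detectMultiScale returns one bounding box per face it finds; an empty
# result means the detector simply does not "see" a face in the frame,
# which is the failure Dr. Buolamwini hit until she wore a white mask.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"faces detected: {len(faces)}")
```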

Dr. Buolamwini, who founded the Algorithmic Justice League (AJL) in 2016 to study the social impact and potential harms of AI, shared lessons from her new book, which explores the intersection of the technology’s development and the dangers of bias in its algorithmic systems.

Below is the conversation, which has been edited for brevity and clarity:

Know Your Value: In the book, you write about a moment when you ran photos of Black women you admire through an AI system and they were misclassified. How did that shape the significance of your work?

Dr. Buolamwini: I admired people like [former first lady] Michelle Obama, Oprah Winfrey, Serena Williams — women who have a lot of photos of themselves out there — and I started running their iconic images through these systems. Sometimes their faces weren’t seen, and sometimes they were labeled as male or given other kinds of descriptions.

I remember one photo of Michelle Obama as a child being described as “toupee.” Looking at these women that I admire and hold in such esteem either being misclassified or not even seen by machines really made me pause. What does it mean for technology from some of the most powerful companies in the world to fail on the faces of people I admire so much? And sometimes the failure wasn’t that they weren’t recognized, but that they were misclassified. It reminds me of this notion of the exclusion overhead: How much do you have to change yourself to get systems that weren’t made with you in mind to somehow work for you?
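The experiments she alludes to here come down to a simple discipline: score a classifier separately for each demographic group instead of reporting one aggregate number. A minimal sketch of that idea, with made-up records standing in for real audit data:

```python
from collections import defaultdict

# Made-up (prediction, ground truth, subgroup) records for illustration;
# these are not figures from the actual audits.
records = [
    ("female", "female", "lighter-skinned women"),
    ("female", "female", "lighter-skinned women"),
    ("male",   "female", "darker-skinned women"),
    ("female", "female", "darker-skinned women"),
    ("male",   "male",   "lighter-skinned men"),
    ("male",   "male",   "darker-skinned men"),
]

# Tally correct predictions per subgroup rather than overall.
hits, totals = defaultdict(int), defaultdict(int)
for predicted, actual, group in records:
    totals[group] += 1
    hits[group] += int(predicted == actual)

# A strong aggregate accuracy can hide a subgroup that fails badly;
# disaggregating the scores is what makes the gap visible.
for group, n in sorted(totals.items()):
    print(f"{group}: {hits[group] / n:.0%} accuracy over {n} samples")
```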

Know Your Value: Could you explain the touch points on a day-to-day basis where [AI systems] can generate these consequences?

Dr. Buolamwini: The work I do looks at the different ways computers analyze faces for a particular purpose. You have government agencies that have actually adopted that type of facial recognition for access to government services, such as the IRS.

When you log in, you might be asked to scan your face. If there are failures, either somebody could get into your account and commit fraud, or you can’t even access your own information. So that’s one way [AI systems] can enter people’s lives.

Schools are actually using facial recognition and facial detection on everything from class attendance to e-proctoring, which became particularly popular during the pandemic as there was more remote learning.

And then there’s the law enforcement use of this technology. I think of a woman named Porcha Woodruff. She was eight months pregnant when she was falsely arrested due to a facial recognition misidentification. I think that part is really important, because you could have nothing to do with a crime that has occurred, but your image can still be picked up. She reported having contractions while she was in the holding cell, and after she was released, Woodruff had to be rushed to the emergency room.

So those are different ways in which computers can analyze faces. If there are biases and discrimination, you can end up on the wrong side of the algorithm.

Know Your Value: Where does the responsibility lie in making these systems safer and less biased?

Dr. Buolamwini: We need legislation at the federal level, because legislation then puts in the guardrails for the tech companies. Currently, you have some tech companies that have done a little bit of self-regulation. But all of the U.S. companies that we audited have stopped selling facial recognition to law enforcement following that work.

We also need to think about AI governance globally. I do think that all of our stories matter. When you share your experience with AI or your questions about it, you encourage other people to share their stories.

Know Your Value: Why is the data set not representative of more diverse individuals?

Dr. Buolamwini: You would think companies that are trying to get everybody to use their systems would create systems that include more people. But when we looked into the processes of making these systems, oftentimes it came down to convenience.

For example, to have a face data set of images where you might not have to think about copyright as much, some people will collect faces of public figures, and in particular they’ll collect faces of elected officials. So, who tends to hold power around the world? If you have a data set that’s made of faces of elected officials, you’re going to see the shadow of the patriarchy reflected there. We saw that in many data sets that were collected that way.

Some of those convenient ways of collecting data also reflect the structural inequalities within society. So, you actually have to be intentional to be inclusive. Otherwise, the status quo is going to reflect existing inequality.
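One lightweight way to catch that kind of skew before it hardens into a model is to tally a data set’s composition up front. A minimal sketch with invented labels; the categories and counts are assumptions for illustration, not data from any audit:

```python
from collections import Counter

# Invented per-image metadata for illustration: a convenience sample of
# elected officials would skew heavily toward one gender.
dataset_labels = [
    ("male", "Europe"), ("male", "Europe"), ("male", "Africa"),
    ("male", "Asia"), ("female", "Europe"), ("male", "Americas"),
]

# Counting composition before training makes the skew explicit, which is
# the first step toward being intentional about inclusion.
total = len(dataset_labels)
for gender, count in Counter(g for g, _ in dataset_labels).items():
    print(f"{gender}: {count}/{total} faces ({count / total:.0%})")
```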

Know Your Value: You are both an artist and a scientist. Have you always been confident in both worlds?

Dr. Buolamwini: I feel both worlds were nurtured from a very young age, so I had the confidence to explore them. But as I started down my path of studying computer science, and then wanting to be seen as a credible researcher, I was nervous about sharing the poetic aspects of who I am and infusing them into the work I do.

So, I first put out the research — the research that was showing gender bias, racial bias and more. When that was well-received, I felt like, ‘OK, now that the research is establishing what I’m saying, let me put the poetic piece on top of it.’

I was so surprised actually at how well-received it was ... because at the end of the day it’s storytelling and it’s about humanizing what we’re talking about.

Know Your Value: What is your advice, for women of color especially, on embracing their dualities?

Dr. Buolamwini: I would say, start with something small. Do an experiment. For me, that experiment was exploring the ‘AI, Ain’t I a Woman?’ poem, and seeing how that was received. Then, [I tried] the documentary. It wasn’t everything all at once.

It’s also important to have that peer support group that you can share some of these things with … I wanted to focus on issues of bias and sexism and racism, and I was warned by my colleagues that it might pigeonhole my career. These were people I respect, very well-meaning people. But it was talking to other women in science, in computer science, who encouraged me to do the research anyway.

I do think sometimes you have to understand that not everybody sees your vision. That’s why you’re the visionary.

Source: Pierre-Bravo, Daniela, “Meet The Scientist Protecting Women Of Color From The Wrong Side of AI”, Know Your Value - Business Culture, MSNBC, https://shorturl.at/cwzGO