
What You Need to Know About Racial and Gender Bias in Artificial Intelligence

“Growing up in Kansas, consuming Seventeen magazine, pre-Instagram, I saw that I didn’t match the single standard, right? And as I built NUDEST, I found that it goes beyond skin tone. Any way that people feel like they don’t match the tall skinny blonde model—and there are even tall skinny blonde people who still feel like they don’t match that beauty standard—affects them. It’s important for us to see people who look like us. And also people who don’t look like us so that we can change our biases on what beautiful means,” says Atima Lui, our badass founder and CEO at NUDEST.

At NUDEST we build skin tone matching technology for humans of every color. You know, just generally trying to make machines and software less racist, because experiences like Atima’s happen far too often.

On Thursday, July 12th, we held a panel discussion at The Wing with some amazing industry experts about the cultural bias we’ve seen in artificial intelligence. It started with biases we had already noticed in popular facial recognition technology, like the kind Snapchat and the iPhone X use. We wanted to dig deeper into why this happens, why it matters, and how we can create artificial intelligence (AI) that doesn’t carry the same racial and gender biases.

We invited Julia Hirschberg, the Percy K. and Vida L. W. Hudson Professor of Computer Science at Columbia University; Erin Carpenter, the founder and CEO of Nude Barre, a women’s and girls’ line of skin tone intimates and hosiery; and Amanda Jones, the Global Marketing Director for Becca Cosmetics, to join our panel: three women who are killing it as members of underrepresented groups in STEM, fashion, and beauty.

This is what we learned:

1.  Machine learning algorithms that are trained on biased data end up replicating cultural bias

A lot of the problems we see with AI, which has come to subsume many areas of technology that didn’t previously think of themselves as artificial intelligence, have to do with the data the machine learning algorithms are trained on. AI trained via deep learning takes in millions and millions of images, words, videos, etc. in order to teach itself. However, if the data itself is biased, the algorithm will replicate that same bias, which can be frustrating.
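To make this concrete, here is a minimal sketch, in Python, of how a skewed training set produces a model that works well for the majority group and poorly for everyone else. The data, the groups, and the features are all synthetic; this is our illustration, not any production system.

```python
# Toy demonstration: train a classifier on data that is 95% "group A" and
# 5% "group B", then evaluate on balanced test sets for each group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Two features per example; the true decision rule differs by group,
    # standing in for the systematic differences a fair model must learn.
    X = rng.normal(shift, 1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# Heavily imbalanced training data: group B is barely represented.
Xa, ya = make_group(1900, shift=0.0)  # group A: 95% of the training set
Xb, yb = make_group(100, shift=2.0)   # group B:  5% of the training set
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# The model inherits the skew: high accuracy for A, near chance for B.
for name, shift in [("group A", 0.0), ("group B", 2.0)]:
    Xt, yt = make_group(1000, shift)
    print(f"{name} accuracy: {model.score(Xt, yt):.2f}")
```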

Some examples that Professor Hirschberg provided during her presentation include:

  • Automatic speech recognition has historically worked better for men than women and children.

  • Machine translation translates gender-neutral pronouns in the context of “doctor” as “he” and “nurse” as “she” (the embedding probe sketched after this list shows how such associations get baked in).

  • Facial recognition programs either cannot recognize darker-skinned faces or misclassify them as gorillas.
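The machine translation example traces back to statistical associations baked into models of language. As a small, hedged probe (the pretrained model and the exact outputs will vary), a few lines of Python with off-the-shelf word embeddings are enough to surface occupational stereotypes:

```python
# Probing gender associations in pretrained word embeddings, in the spirit
# of Bolukbasi et al. (2016). The analogy "man : doctor :: woman : ?" often
# surfaces stereotyped answers such as "nurse"; results vary by model.
import gensim.downloader as api

vecs = api.load("glove-wiki-gigaword-100")  # downloads pretrained GloVe vectors

for job in ["doctor", "programmer"]:
    analogy = vecs.most_similar(positive=["woman", job], negative=["man"], topn=3)
    print(f"man : {job} :: woman : {[word for word, _ in analogy]}")
```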

2.  Explainable AI is working to identify what features of training data lead to decisions, biased or not

There are ways that we can start combating bias in AI. The first step is recognizing one’s own biases.

“I think you have to learn your own bias before you do anything about other people’s,” Hirschberg pointed out during her presentation. She noted Harvard’s “Project Implicit,” a test that measures your own hidden biases, as a way to regularly check in on and possibly correct the biases you may hold.

Explainable AI tries to identify which features of the training data lead to a given decision, biased or not. Once the bias in the training data is identified, it can be corrected, either by adding data with an opposite bias or by building hybrid systems that compensate for the identified bias.
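As a minimal sketch of the “add data with an opposite bias” fix (our illustration, not a technique the panel prescribed in code), one simple version is oversampling the underrepresented group before retraining:

```python
# Naive rebalancing by oversampling: duplicate examples from the minority
# group until both groups contribute equally to training. Illustrative only;
# real fixes often involve collecting new data or careful reweighting.
import numpy as np

def rebalance(X, y, is_minority):
    """X, y: training data; is_minority: boolean mask marking the
    underrepresented group. Returns a group-balanced training set."""
    rng = np.random.default_rng(0)
    minority = np.flatnonzero(is_minority)
    majority = np.flatnonzero(~is_minority)
    extra = rng.choice(minority, size=len(majority) - len(minority), replace=True)
    idx = np.concatenate([majority, minority, extra])
    return X[idx], y[idx]
```

In the toy model from earlier, retraining on rebalanced data narrows the accuracy gap between the two groups, partly by trading away some majority-group accuracy; that trade-off is one reason hybrid systems that model groups separately are also on the table.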

3.  Could AI take us from “Made for all” to “Made for you”?

Just as AI replicates cultural bias, the beauty and fashion industries continue to perpetuate these biases. These industries suffer from a lack of diversity, but with brands like Nude Barre and Becca Cosmetics advocating for more diversity and inclusion, we can start paving the way in these fields, and corrected AI can help.

Carpenter’s brand, Nude Barre, features 12 different shades, which, to the often white buyers at big-box stores, equates to too much work and too much product on the floor. The thinking is that the very lightest and the very darkest shades won’t sell quite as much as the shades in the middle, so why bother? E-commerce takes away some of that fight, but it also leaves customers a little anxious, because they’re not used to having options and don’t know where to start. This is why the NUDEMETER is such a valuable tool.

“So the awesome thing about the technology that NUDEST provides, that is also on our website and that we utilize with our customers, is that it allows them to shop with confidence. It’s increasing our cart size, it’s allowing women to feel like it’s telling them the right shade to buy,” Carpenter said during the event.
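We can’t publish the NUDEMETER’s internals here, but the core idea behind any shade matcher can be sketched as a nearest-neighbor lookup over a brand’s palette. Everything below (the palette values, the plain RGB distance) is made up for illustration; a real matcher would compare colors in a perceptual space such as CIELAB:

```python
# Toy shade matching (not the actual NUDEMETER): find the closest of 12
# product shades to a measured skin tone. The palette is hypothetical,
# interpolated from a light RGB triple to a deep one.
import numpy as np

PALETTE = np.linspace([246, 220, 196], [74, 48, 32], num=12)

def nearest_shade(skin_rgb):
    """Return the 1-indexed palette shade closest to skin_rgb."""
    dists = np.linalg.norm(PALETTE - np.asarray(skin_rgb, dtype=float), axis=1)
    return int(np.argmin(dists)) + 1

print(nearest_shade((180, 130, 100)))  # e.g. a mid-range shade
```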

For Becca Cosmetics, Jones labels the brand’s revamp and progression as “Inclusivity 2.0,” an internal term her team uses for taking inclusion and diversity to the next level.

“In the perfect world where this technology exists… let’s say I have 40 shades of foundation (which we do), I would have a piece of content, an ad, and I would shoot it 40 different times for each shade. And when you get on your Facebook feed you would get an ad for your [specific] shade. That is personalization. That is Inclusivity 2.0. That is really talking to her. At that point you’re not feeding her content or information that is not relevant to her, and at that point she starts trusting that you know what she wants specifically, not everyone else. That is the evolution that we are embarking upon,” Jones concluded.
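The plumbing behind that vision can be sketched in a few lines (all names here are hypothetical; no ad platform specifics are implied): once a customer’s shade is matched, serving the right creative is a lookup.

```python
# Hypothetical mapping from a matched foundation shade (1-40) to the ad
# creative shot for that shade, per Jones's "one ad per shade" idea.
AD_CREATIVES = {shade: f"foundation_ad_shade_{shade:02d}.mp4"
                for shade in range(1, 41)}

def creative_for(user_shade: int) -> str:
    # Pick the ad variant that matches the customer's own shade.
    return AD_CREATIVES[user_shade]

print(creative_for(23))  # -> foundation_ad_shade_23.mp4
```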

We are so grateful to these empowering women who spoke with us on Thursday. They gave insight into topics we hold dear to our hearts. We would also like to thank the guests and the team at The Wing for believing in our cause.

AI has become a very broad field and encompasses many different types of technology. With this technology we can change entire industries by building more consciously, and that’s just what we plan to do!