NUDEST


Snapchat Filters Don't Work On My Face: Here's Why

Combating Racial Bias in Artificial Intelligence

Last summer one of my best friends from college came to visit me in New York. As soon as she walked through my front door I screamed with delight, gave her a hug, and insisted that she join me in my living room so we could catch up.

Halfway through the conversation we realized we absolutely had to commemorate our reunion with a story on Snapchat. She opened the app on her phone, switched the camera to selfie mode, and selected a bunny filter -- you know, the one that gives you a sweet nose, round eyes, and big ears?

My friend’s bunny-fied face stared back at us through her phone, while mine remained unchanged. Snapchat wasn’t applying the filter to my face because it couldn’t see that I was even there. Why?

Snapchat filters can’t find my face because I have dark skin.

Even in bright lighting, where the whites of my eyes and my teeth contrast against my dark skin, Snapchat concludes that no human is present.

Check out this video I just took with two women at my co-working space. Their faces are adorably cute-ified by a filter, while mine is just… me! So why is this happening?


Artificial intelligence is only as smart as the data you use to train it.

If the people developing facial recognition artificial intelligence are using mostly pictures of fair-skinned individuals to train their algorithms, the technology will struggle to identify the faces of darker-skinned people.
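You can see this effect in a deliberately oversimplified sketch. The toy "detector" below is not how Snapchat or any real product works -- every number, feature, and function here is hypothetical -- but it shows how a model trained on a 95%-light-skinned dataset can end up missing dark-skinned faces entirely:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sketch: each "selfie" is boiled down to just two numbers,
# (skin reflectance, face-likeness). Real detectors use far richer
# features, but the training-data effect is the same.
def sample_faces(n, tone):
    base = {"light": 0.8, "dark": 0.3}[tone]
    return np.column_stack([
        rng.normal(base, 0.05, n),   # reflectance differs by skin tone
        rng.normal(0.9, 0.05, n),    # face-likeness is high for everyone
    ])

def train_detector(train_faces):
    # Toy detector: learn the "average" training face and accept anything
    # close to it, with the radius set from the training faces themselves.
    centroid = train_faces.mean(axis=0)
    dists = np.linalg.norm(train_faces - centroid, axis=1)
    radius = np.percentile(dists, 90)
    return lambda x: np.linalg.norm(x - centroid, axis=1) <= radius

# Biased training set: 95% light-skinned faces, only 5% dark-skinned.
biased = np.vstack([sample_faces(950, "light"), sample_faces(50, "dark")])
detect = train_detector(biased)

for tone in ("light", "dark"):
    rate = detect(sample_faces(2000, tone)).mean()
    print(f"{tone}-skinned detection rate: {rate:.0%}")
# The detector finds nearly all light-skinned faces and almost none of
# the dark-skinned ones -- purely because of the training mix.
```

The fix in this toy world is the same one argued for here: make the training set representative, and the learned "average face" stops excluding anyone.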

In other words, if developers are racially biased, the technology they produce is likely to be racially biased, too.

This is what I suspect to be Snapchat’s issue, and Snapchat is not the only major tech company with a racial bias problem.

Just last year, a Chinese iPhone X user reported that her colleague was able to unlock her phone twice with Face ID because the phone could not distinguish between their two Asian faces. Apple maintains that Face ID is more secure than its previous technology, Touch ID. Perhaps if Apple had included more images of Asian women in its training datasets, Face ID would have actually worked for this Chinese user.

The exclusion of people of color in facial recognition technologies is problematic not just from a functionality standpoint, but from a social justice standpoint as well.

When major tech companies release products that capture popular culture like Apple’s Face ID or Snapchat’s filters, certain populations are excluded from using them because of the color of their skin or shape of their face. And it hurts. It hurts because we’re shut out. It hurts because we’re reminded that we’re an afterthought. It hurts because it encourages society to see whiteness as the standard and everyone else as an unworthy exception.

At NUDEST, whiteness is not our standard; diversity is.

We’re solving the racism problem in artificial intelligence by training our skin tone matching technologies on images that represent the full range of diversity in human skin. Unlike the development teams of most tech companies, the developers leading this effort at NUDEST are predominantly people of color and women.

Help us continue to build racially inclusive artificial intelligence by submitting your selfies.

We’ve collected thousands of scans of real human skin in order to accurately match everyone to products in a shade of “nude” that works for them -- but we need to collect even more.


Submit unedited selfies taken in plenty of natural light for a chance to win a $1,000 Visa Gift Card. The more selfies you submit, the greater your chance of winning. Feel free to submit the form with fresh images of yourself as many times as you’d like. Thank you for helping us diversify artificial intelligence!


Artwork by Susannah Price