Electric Eel

NUDEST throws shades for every skin color

Issue 029
June 19, 2019


“Legibility creates targets as well as safety zones,” wrote the academic Caren Kaplan. She was writing about warfare, but it’s a line I think of when I’m asked about representation in the beauty industry.

People are typically concerned with who brands aspire to attract as consumers, and how that aspiration is reflected in the products they sell. I remind them that being seen doesn’t ensure you are going to be understood, or even respected. That a brand acknowledges that people of color buy its products does not mean that brand knows, or cares to know, how to tailor those products to them. Look at old photos of Naomi Campbell out on the town in the 1980s. For decades, not even the greatest supermodel in the world could get a foundation that matched her skin color. In fact, Campbell only starred in her first beauty campaign earlier this year, for NARS’ Spring 2019 line.

Brands want your money, but they hold harmful biases informed by their (mis)understanding of their consumers’ needs and preferences. When it comes to shade-matching, cosmetic brands claim to offer wide-ranging shade palettes, but they usually don’t get the pigmentation right for people of color. Brands like Fenty and Make Up For Ever, which both offer forty shades in their foundation lines, are helping to change this. But the reality is that most brands have a lot of catching up to do — if they bother at all.

One company that caters to would-be consumers across the color spectrum is NUDEST, a shade-matching app led by Atima Lui, a black woman, and run by a staff that is impressively diverse (especially compared to the homogenous employee demographics typical of tech companies). NUDEST provides a database of an incredibly wide range of skin colors so brands can improve their shade-matching efforts across product categories. The company is one of the rare examples of data-collecting technology being used for good, rather than for community surveillance or data harvesting, practices that have historically and disproportionately harmed people of color.

I met Atima when she hosted a panel called “Racial and Gender Bias in AI” at The Wing in New York City last year. She invited academics and beauty/fashion industry professionals to participate. A condensed and excerpted version of our talk is below. —Arabelle Sicardi (@arabellesicardi), beauty writer and activist

Arabelle Sicardi: What led you to start this company?
Atima Lui: I grew up in Topeka, Kansas. The only people who looked like me were my immediate family. My dad is a Sudanese refugee and my mother is a very militant black activist. My mother did a good job helping me understand where I came from, but at the same time it was painfully obvious I was different from everybody else, and whenever I tried something on my skin, I was reminded that I was ugly. I eventually realized my purpose is to improve the confidence of dark-skinned women and girls. When I realized that the traditional shade of nude doesn’t match 84% of the global population—not just dark-skinned people, but the total majority of people—I realized I had a problem I could try to solve.

The decision to start this company, and how I recruited, came down to starting with what was available to me and who I was connected to. My brother is a computer scientist—he’s my CTO. He wrote the first version of the technology and we recruited together. Our AI engineer is from Egypt—he’s a Muslim father living in Canada—so he has a connection to the need for representation and the expertise to pull it off.

AS: How long did it take for you to come up with each color category?
AL: For our initial launch in 2016, people could submit a photo of their hand with an area they’d like to match. We save the information, but not the photo, and that’s how we grew our first collection of skin tones: over a thousand samples. Once we began focusing on the technology, we started selling it to brands, which put our tech on their websites. Any time someone scans their skin using the technology, we see that information and tweak the data accordingly.

It’s more of an art than a science. Color is infinite, so looking at the spectrum digitally meant deciding which dimensions of the data we wanted to put into colorspace, and finding the averages within those dimensions. We decided to go with fifty-four shades because that took into account how the customer currently understands the nude space. That led us to nine encompassing categories. It took us until the end of 2017—a full two years.
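Lui’s description, choosing which dimensions of the data go into colorspace and then averaging within them, is essentially a clustering problem. A minimal sketch of that idea, using a hand-rolled k-means over made-up RGB samples; the technique, the shade count, and the sample values here are illustrative assumptions, not NUDEST’s actual pipeline:

```python
# Hypothetical sketch: derive a small set of representative "nude" shades
# from sampled skin tones by clustering in a colorspace. The samples and
# the choice of k-means are illustrative assumptions only.
import random

def kmeans(points, k, iters=20, seed=0):
    """Cluster 3-D color points into k groups; return the k centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each sample to its nearest centroid (squared distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        for j, cluster in enumerate(clusters):
            if cluster:  # recompute centroid as the per-channel average
                centroids[j] = tuple(sum(ch) / len(cluster)
                                     for ch in zip(*cluster))
    return centroids

# Made-up (R, G, B) skin-tone samples spanning light to deep.
samples = [(241, 194, 167), (235, 180, 152), (198, 134, 94),
           (190, 126, 88), (141, 85, 61), (134, 80, 57),
           (87, 51, 38), (80, 46, 34)]
shades = kmeans(samples, k=4)  # 4 representative shades for this tiny set
```

A real system would likely cluster in a perceptual colorspace such as CIELAB rather than raw RGB, so that numeric distance between two points better tracks how different the two skin tones actually look.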

AS: What would you say your long-term mission is: who do you want using your technology? What kind of companies would use it, if you could see it anywhere?
AL: I want NUDEST to be the Pantone of skin tones, a standard used across design. Skin tone is represented across all kinds of products, from electronics to fashion to creating characters of color, so I want to be the medium through which all of this happens.

AS: A lot of tech companies working on data collection have either chosen to work with the police or taken a stand by refusing to release customer data. People who don’t know the details of your service might be afraid to use it because they’re worried about how the data is stored and whether it will be sold in the future in ways they didn’t consent to.
AL: Our strategy is to use the technology for a skin tone model similar to a design report. If we could analyze data to see patterns in people who are charged with crimes, we’d analyze it per charge to call out the colorism in the prison industrial complex. But we’re not putting it in the hands of police to find dark-skinned people. Our strategy is to not store the images of people scanned with our tech. Customers absolutely need the right to reach out to companies and tell them to delete their data. We work with brands, and the brands have the responsibility to have a relationship with their customers.

Our politics are gently liberal. We’re about elevating visibility for the marginalized. One of our founders grew up in New York, and the ideas here are more progressive than, say, where I grew up in Kansas, where saying “trans women are women too,” might be considered radical. But it’s still true. If you’re against inclusion, you probably won’t like our product. You should surround yourself with people different from you across all sorts of dimensions. It is how we should be running countries and companies—from a space of inclusion.