Welcome to INSIGHT by Balance Now

Digital Colorism: How Good AI Leads to Better Representation

Image generated with ChatGPT's DALL-E (Protected Use)

Have you ever used AI to complete a project? Perhaps you wanted the perfect apple pie recipe from ChatGPT or needed help working through a tricky algebra equation. From Apple's Siri to OpenAI's ChatGPT, generative AI has become a frequent tool in the life of the average American. A recent survey conducted by Harvard University economist David Deming and his colleagues found that, among those aged 18-49, almost 40% have used generative AI programs. In the August 2024 iteration of that survey, 24% of respondents had used generative AI at least once within the previous week.

From recipes for baked goods to solving mathematical equations, AI has a wide range of applications in your daily life. One of those applications is AI image generation, where users produce images based on their own likenesses. But what happens when the image produced does not reflect the "real you"? This is the primary issue explored in my conversation with Digital Colorism advocate and Stanford "LEAD" Executive Leadership program graduate Christelle Mombo-Zigah.

Christelle Mombo-Zigah via LinkedIn

What is Digital Colorism? 

Just as AI image generators have emerged in the digital space, so has Digital Colorism. Christelle's contribution is conceptualizing Digital Colorism as a framework for evaluating the biases of these generators, examining "AI headshots and AI portrait generators through the lens of historical and societal implicit biases." The discriminatory factors she analyzes in these AI systems include skin tone preference (colorism), hair texture representation (texturism), facial features (featurism), age representation (ageism), and body size representation.

The motivation for developing this framework comes from Christelle's experience as a dark-skinned Black woman. In 2021, while working at her current full-time employer, Cisco Systems, she found that her "hair, the hair texture, was not [properly] represented while I was using virtual tools."

She goes on to describe how, after contacting the CTO at the time, "I shared some examples of videos of me wearing braids, so hair falling [down], or me wearing an Afro with hair defying gravity, and the lack of visibility of my hairstyle in an Afro through virtual tools. I was appearing [bald]; my hair was not represented." The problem persisted in other areas, so she worked with data scientists to correct the issue. During that time, she also joined the Responsible AI Committee at Cisco to advocate for inclusiveness and intersectionality.

Why care about Digital Colorism? 

Image generated with ChatGPT's DALL-E (Protected Use)

Despite increased awareness, the issue persists in these machine-learning models. Christelle recalls companies reaching out to her to assess their beauty algorithms and image-processing biases, which exhibit the same pervasive problems with skin tone, body proportions, and hair texture. She also describes the mental health impact of these AI-generated images: "We are constantly, as dark skin users, sent the message that our skin tone is not good enough and needs to [be lightened.]" The implication, she explains, is a false image and a loss of authenticity. The images these models produce should represent how you present yourself to the outside world. Yet these models are under-trained on Black people and on differentiated features such as afro-textured hair, darker skin tones, and a fuller range of body proportions, facial features, and ages.

What Can You Do About Digital Colorism? 

Christelle expresses optimism about increasing representation and inclusion for all identities. "I want to make sure that we are going to exist in the future digital and virtual world, and that our identity is going to be [as] relevant, respected, and celebrated as any other identity." She emphasizes that each of us has a part to play in combating biases and prejudice.

Whether you are a data scientist, project manager, or product designer, or simply talking at the family dinner table, you can bring awareness to the need for representation. To those who aren't immediately affected by digital colorism or systemic biases, Christelle says, "I want them to become change agents as well. This is not something that we can do alone as a black community or dark-skinned communities… This is something that everybody needs to be involved with."

Are you interested in learning more about Diversity, Equity, and Inclusion? Read our piece on the state of DEI in 2024.

Ian Rowe

Ian Rowe is a content writer and political scientist based in Fort Lauderdale. He is a political science master's student whose research centers on underrepresented ethnic and religious communities. With a focus on political and racial advocacy, he hopes that his academic work leads to substantive changes in policy direction. He enjoys reading, traveling, and watching movies in his free time.
