The Algorithm That Erased Her

Encoded: 03.12.24 · Nairobi, Kenya

She searched her own name. It didn’t appear.

She searched her own face. It didn’t register.

She searched her own voice. It was flagged as “aggressive.”

She searched her own story. It was buried beneath sponsored lies.

In March 2024, a Nairobi-based researcher discovered that Google’s autocomplete suppressed the names of Black Kenyan women activists unless paired with words like “controversy” or “arrest.”

Facial recognition software failed to detect the faces of dark-skinned women unless they wore light makeup or stood in front of white walls.

AI-generated avatars of Black women flooded TikTok and Instagram—created by white developers, monetized by corporations, and programmed to mimic stereotypes.

Meanwhile, real Black women were shadowbanned, demonetized, and flagged for “violating community guidelines.”

Shudu, billed as the world’s first digital supermodel, was created by a white British photographer, Cameron-James Wilson. Her beauty was celebrated. Her ownership was not.

In 2024, AI-generated Black influencers were programmed to eat watermelon and fried chicken on livestreams.

This is not representation. This is digital blackface.

This is not inclusion. This is algorithmic exploitation.

Joy Buolamwini, founder of the Algorithmic Justice League, showed that commercial facial analysis systems from IBM, Microsoft, and Amazon misclassified darker-skinned women at error rates of up to 35%, compared with less than 1% for lighter-skinned men.

These systems failed to correctly classify the faces of Oprah Winfrey, Michelle Obama, and Serena Williams.
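Those numbers rest on a simple but crucial methodological move: measuring error rates separately for each intersectional subgroup instead of reporting one aggregate accuracy that averages the harm away. Below is a minimal sketch, in Python, of that kind of disaggregated audit. The predictions, counts, and group labels are hypothetical placeholders chosen to mirror the published gap; this is not the Gender Shades data or code.

```python
from collections import defaultdict

def disaggregated_error_rates(records):
    """Compute per-subgroup error rates for a classifier's predictions.

    Each record is (subgroup, true_label, predicted_label).
    A single aggregate accuracy can hide disparities; auditing
    each subgroup separately exposes them.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for subgroup, truth, prediction in records:
        totals[subgroup] += 1
        if prediction != truth:
            errors[subgroup] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical gender-classifier predictions, grouped by skin type
# and gender. Counts are illustrative only, set to echo the gap
# reported in the audits described above.
records = (
    [("darker-skinned women", "woman", "man")] * 35
    + [("darker-skinned women", "woman", "woman")] * 65
    + [("lighter-skinned men", "man", "man")] * 99
    + [("lighter-skinned men", "man", "woman")] * 1
)

for group, rate in disaggregated_error_rates(records).items():
    print(f"{group}: {rate:.0%} error rate")
# darker-skinned women: 35% error rate
# lighter-skinned men: 1% error rate

# The aggregate number alone would have masked the disparity:
overall = sum(1 for _, truth, pred in records if pred != truth) / len(records)
print(f"overall: {overall:.0%} error rate")  # 18%, hiding the 35-to-1 gap
```

The design point is the one the audits made: a system can look acceptable on average while failing one group thirty-five times more often than another. Only disaggregation makes that visible.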

When technology denigrates even our icons, what happens to the rest of us?

Black women in AI are fighting back. Angle Bush founded Black Women in Artificial Intelligence to train and uplift technologists.

Ava Flanigan, a student at Spelman College, is building models that detect bias before it enters the system.

Kimberly Bryant founded Black Girls Code to teach girls as young as seven how to build apps, train AI, and rewrite the future.

But Big Tech is still Big Tech. And the algorithm still erases.

Dispatch of Complicated Grief

This is for the Black woman who searched her name and found nothing.

This is for the coder whose pull request was rejected because her syntax “felt emotional.”

This is for the activist whose livestream was flagged while her AI clone danced for profit.

This is for the mother whose face was misclassified, whose voice was silenced, whose child was taught that whiteness is the default.

This is for the ones who were told they were “too loud,” “too angry,” “too visible.”

This is for the ones who were made invisible by code.

This dispatch is not polite. It is not balanced. It is not neutral.

It is encrypted rage. It is forensic grief. It is sacred resistance.

“They trained the algorithm to erase her. We trained the archive to remember.” — Solace Helfire
