Late last year, as a US government shutdown cut off the Snap benefits that low-income families rely on for groceries, videos on social media cast the fallout in frantic scenes. “Imma keep it real with you,” a Black woman said in a viral TikTok post, “I get over $2,500 a month in stamps. I sell ’em, $2,000 worth, for about $1,200-$1,500 cash.” Another Black woman ranted about taxpayers’ responsibility to her seven children with seven men, and yet another melted down after her food stamps were rejected at a corn-dog counter.
Visible watermarks stamped some videos as AI-generated – apparently, too faintly for the racist commentators and hustlers more than happy to believe the frenzy was real. “You got people treating it like a side hustle, selling the stamps, abusing the system,” the conservative commentator Amir Odom whinged. Fox News reported on the Snap deepfakes as if they were authentic, before issuing a correction. Newsmax anchor Rob Schmitt claimed people were using Snap “to get their nails done, to get their weaves and hair”. (Lost in the outrage was a basic fact: white Americans make up 37% of Snap’s 42 million beneficiaries.)
The fake videos are mere shards in the widening mosaic of digital blackface, a pattern that’s spiked in the past two years as generative AI video tools have become widely accessible. “There’s been a massive acceleration,” says Safiya Umoja Noble, a UCLA gender studies professor and author of Algorithms of Oppression, which focuses on digital biases against Black women in particular. “The digital blackface videos are really pulling from the same racist and sexist stereotypes and tropes that have been used for centuries.” The net effect is a patina of Blackness stripped of cultural obligation or stewardship – minstrelsy in a nutshell.
Coined in a 2006 academic paper, the term digital blackface describes a form of Black cultural commodification repurposed for non-Black expression online. Examples run the gamut: posts in African American Vernacular English, the use of darker-skinned emojis, reaction memes featuring Beyoncé, Katt Williams and other exemplars of Black cool.
“The early research that was done on digital blackface started with white gamers using bitmojis of a different race and changing their vernacular to represent themselves,” says Mia Moody, a Baylor University journalism professor whose forthcoming book, Blackface Memes, links the role of Black users in starting and spreading online trends to subsequent digital blackface. “That’s part of the cultural appropriation, gaining the cultural capital. Maybe you’re a nerdy white guy, but if you use this cool avatar of a Black guy with dreadlocks, people will give you respect. You’re interesting all of a sudden.”
During memeology’s expansion into short-form video, Black expression has increasingly been divorced from authorship, context or consequence. Internet culture scholars say some non-white online creators use AI-generated avatars modeled on familiar Black faces – the beauty influencer, the culture podcaster, the man-on-the-street interviewer; they slip into feeds alongside real Black content creators. Large language models scour digital spaces that gained cachet from Black speech and humor, absorbing their tone and slang. Hume AI is one of many firms that offer synthetic voices for podcasts and audiobooks such as “Black woman with subtle Louisiana accent”, or “middle-aged African American man with a tone of hard-earned wisdom”. In most cases, creators whose speech is scraped from YouTube, podcasts and social media receive no compensation, and may never know their personalities shaped these models.
The Snap reaction clips, however, were a notable escalation in the mainstreaming of digital blackface – less blending in, more weapons-grade stereotyping. Many of those videos were created with OpenAI’s text-to-video app Sora. As Sora’s popularity surged in 2025, users exploited its hyperrealism to sully Martin Luther King Jr’s image, sparking ethical debate around “synthetic resurrection”. Deepfakes showed him shoplifting, wrestling Malcolm X, and swearing through his I Have a Dream speech. Conservative influencers swamped feeds with AI-generated embraces between King and Charlie Kirk, conflating their clashing legacies and cultural martyrdom. Bernice King, MLK’s daughter and the director of his Atlanta-based non-profit, criticized the slopaganda as “foolishness”.
Inevitably the Trump White House has gotten into the act. In January the official White House X account posted a doctored photo of Minnesota activist Nekima Levy Armstrong, darkened and weeping, in the wake of her arrest at a non-violent anti-ICE demonstration. Earlier this month an image portraying the Obamas as apes was circulated via Trump’s own Truth Social account.
Blackface abides at the underbelly of American mass media even as it evolves at breakneck pace. Its roots trace to the minstrel revues of the early 19th century; white performers smeared grease paint made from charred corks on their faces and plastered on oversized white lips to caricature Black features, and performed exaggerated routines of Black laziness, buffoonery and hypersexuality. Thomas D Rice, a Manhattan playwright, shot to fame in the 1830s playing a loping trickster named Jim Crow – a name that quickly became shorthand for the forced racial segregation policies in the American south that endured until the 1964 Civil Rights Act.
In their heyday, minstrel shows were the dominant form of American entertainment – reflected in newspaper cartoons and the enormously popular Amos ‘n’ Andy radio shows. After the civil war, Black performers were largely forced into adopting minstrel elements, at the expense of their personhood once more, just to gain any footing on stage. “The objectives were, first, to make money to help educate our younger ones, and second, to try to break down the ill feeling that existed toward the colored people,” explained Tom Fletcher, a vaudeville minstrel and actor for almost 70 years who died in 1954.
Even as minstrelsy faded from the spotlight by the early 20th century, its toxic residue lingered in American culture – from the shuffling crows of Disney’s Dumbo, to Ted Danson’s infamous 1993 blackface roast of Whoopi Goldberg, to the annual parade of white Halloween revelers in racial masquerade. A decade ago, when the internet was still a black box of sorts, researchers such as Noble and MIT’s Joy Buolamwini were sounding the alarm about the inherent racial biases in the coding of algorithms related to medical treatment, loan applications, hiring decisions and facial recognition. Now it’s out in the open, smearing wider and deeper than any burnt cork routine ever could.
Tech firms have made some effort to stem the tide of digital blackface. Bowing to public backlash from the King family and other prominent estates, OpenAI, Google and the AI image generator Midjourney disallowed deepfakes of King and other American icons. In January 2025, Meta deleted two of its own AI blackface characters – a retiree called Grandpa Brian and Liv, described as a “proud Black queer momma” and “truth-teller” – after allegations about their non-diverse development team fueled a tempest of criticism. Instagram, TikTok and other platforms have made some attempts to scrub viral digital blackface videos, with tepid results. Last summer, efforts to replicate Bigfoot Baddie – the AI avatar of a Black woman as a human-yeti hybrid in pink wigs, acrylic nails and hair bonnets, created by Google’s Veo AI – erupted into a full-blown social media craze, with some users even hustling how-to courses. The avatar is still on socials.
Black in AI and the Distributed AI Research Institute (Dair) are among the handful of affinity groups that have pushed for diversity and community input in AI model-building to address programming bias. The AI Now Institute and Partnership on AI have highlighted the risks of AI systems learning from marginalized communities’ data and noted that tech companies could provide mechanisms such as data opt-outs to help limit harmful or exploitative uses. But widespread adoption has been glacial.
“YouTube alone has something like 400 hours of content per minute being uploaded,” Noble says. “With AI generation, these tech firms cannot manage what’s coming through their systems. So they don’t. Or they do what’s absolutely imperative to the US government. But if you have an authoritarian regime in power, they can use your systems to facilitate propaganda.”
Although the precise impact of AI-generated digital blackface is difficult to quantify, its use by the Trump administration highlights its potential as a powerful tool of official disinformation. The Obama Truth Social entry revived a slur that has festered for years in darker online corners, and one that rhymes with Trump’s sustained efforts to denigrate the former first family. (Trump disclaimed direct responsibility and refused to apologize for that post, which was taken down.) Meanwhile, the White House’s doctored image of Armstrong, altered from an actual photo taken by the Department of Homeland Security and published on their official Twitter account, scanned as a psyop by a government working closely with tech firms to track activists and other perceived enemies of the state.
Beyond laundering bigotry as news, digital blackface exposes Black users to a level of personalized abuse and harassment that harks back to minstrelsy’s heyday, when racists were fully empowered to express their bigotry unbidden. Then as now, it seems there is little that can be done to curb the vitriol. “We are living in a United States with an open, no-holds-barred, anti-civil-rights, anti-immigrant, anti-Black, anti-LGBTQ, anti-poor-policy agenda,” Noble says. “Finding the material to support this position is just a matter of the state bending reality to fit its imperatives. And that’s easily done when every tech company lines up behind the White House.”
Even so, Moody remains hopeful that the current fascination with digital blackface will soon be as outdated and uninviting as the analog variant. She has seen this play before, after all. “Right now people are just experimenting with AI technology and having a ball seeing what they can get away with,” she says. “Once we get beyond that, then we’re going to see less of it. They’ll move on to something else. Or they’ll be up for a job, and it’ll be embarrassing. Just look at the history.”