Editorial Note: Opinions are the writer’s own and not those of AfroTech.

Artificial intelligence (AI) has quickly and deeply infiltrated our daily lives, moving beyond smart speakers that tell us the weather to taking orders at local drive-thru restaurants. AI has become a part of our day-to-day routines at a speed many of us could not have predicted.

Now that the Writers Guild of America (WGA) strike is well underway, alongside a wave of concerning headlines about generative AI adoption outpacing our ethics around it, there should also be concern about how AI will continue to disrupt our media landscape — and representation in particular. Indeed, enacting parameters around the use of AI is among the WGA's strike demands, including regulating the technology to prevent it from being used as source material, or to write or rewrite literary material, among other tasks it could complete instead of actual writers.

As Misa Makwakwa Masokameng outlined for Afropunk, a ChatGPT-generated scene featuring an African American mother and child discussing Rosa Parks isn't quite it at first pass, but with the right tweaks and human expertise it could become a moving piece.

“AI-generated works affect all writers,” Masokameng wrote. “However, for the Black writers already sidelined by structural and institutional racism, the implications of AI widening the racial divide in the writers’ room are grave.”

In fact, for a Nielsen Diversity, Equity and Inclusion research project, we looked at the impact of having Black women in the writers' room on the most representative TV dramas for Black women. It wasn't just about visual representation on screen for these successful programs: the most representative TV dramas featuring Black women on screen also averaged 15% Black women writers in their program credits. There was also a clear difference in how these inclusive programs presented Black women — with themes like justice, power and glamor — compared with the competition and rivalry themes through which Black women were most often seen on TV.

History is one thing, but could AI really tell the stories of today's civil rights abuses and struggles? Can it tell those stories with nuance and care? Eventually, the answer is probably yes.

But the economics of "just get AI to do it" will have an outsized impact on Black talent, particularly writers. Research from the Think Tank for Inclusion & Equity shows that writers from historically excluded groups are most likely to hold the lowest-level staff writing positions. The WGA puts the share of Black writers in the industry at 15.5%, but according to Nielsen data, Black people make up about 21% of casts in TV programming.

That means the ratio of non-Black storytellers, journalists and commentators telling Black stories is already skewed. AI could take away even more of the precious few opportunities for Black people to learn, advance and succeed in the industry. And when the cost-saving cuts to staff come around, Black talent won't even be in the room to choose whether to stay, strike or be let go.

So, who would be left to tweak and train the model? Will there be enough empowered representation in the room to recognize stereotypical tropes, let alone steer AI away from them? Executives already temper how far writers can go with their content, especially when it addresses or combats racism or social injustice. AI may learn to stop pitching those scenes, characters and storylines altogether, among other crucial missteps that could affect diversity and representation. That's particularly jarring when Nielsen data already shows that Black audiences are the most likely to seek out diverse content, yet the majority of Black audiences don't feel represented enough in what they view.

The media industry should take a hard look at the implications — both good and bad — that AI has, not just on the future, but the very real impact it’s having today.

Charlene Polite Corley is vice president of diverse insights and partnerships at Nielsen. As a part of the Diversity, Equity, & Inclusion Team, Charlene’s research and thought leadership are used to amplify the power and influence of Black audiences, showcase media trends among historically excluded communities, and support both the social and business case that representation matters.