March 28, 2024, 19:36

How TikTok’s hate speech detection tool set off a debate about racial bias on the app

“This is why I’m pissed the fuck off. We’re tired,” said popular Black influencer Ziggi Tyler in a recent viral video on TikTok. “Anything Black-related is inappropriate content,” he continued later in the video.

Tyler was expressing his frustration with TikTok about a discovery he made while editing his bio in the app’s Creator Marketplace, which connects popular account holders with brands who pay them to promote products or services. Tyler noticed that when he typed phrases about Black content in his Marketplace creator bio, such as “Black Lives Matter” or “Black success,” the app flagged his content as “inappropriate.” But when he typed in phrases like “white supremacy” or “white success,” he received no such warning.

For Tyler and many of his followers, the incident seemed to fit a larger pattern of how Black content is moderated on social media. They saw it as evidence of what they believe is the app’s racial bias against Black people; some urged their followers to leave the app, while others tagged TikTok’s corporate account and demanded answers. Tyler’s original video about the incident has received over 1.2 million views and more than 25,000 comments; his follow-up video has received nearly 1 million more views.

“I’m not going to sit here and let that happen,” Tyler, a 23-year-old recent college graduate from Chicago, told Recode. “Especially on a platform that makes all these pages saying things like, ‘We support you, it’s Black history month in February.’”

A spokesperson for TikTok told Recode that the issue was an error in its hate speech detection systems, one the company is actively working to resolve, and that it is not indicative of racial bias. TikTok’s policies do not restrict posting about Black Lives Matter, the spokesperson said.

In this instance, TikTok told Recode that the app is mistakenly flagging phrases like “Black Lives Matter” because its hate speech detector is triggered by the combination of the words “Black” and “audience,” since “audience” contains the letters “die.”

“Our TikTok Creator Marketplace protections, which flag phrases typically associated with hate speech, were erroneously set to flag phrases without respect to word order,” a company spokesperson said in a statement. “We recognize and apologize for how frustrating this was to experience, and our team is working quickly to fix this significant error. To be clear, Black Lives Matter does not violate our policies and currently has over 27B views on our platform.” TikTok says it has reached out to Tyler directly and that he hasn’t responded.
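To illustrate the kind of failure TikTok described, here is a minimal sketch of a filter that matches flagged word pairs as raw substrings, ignoring word order and word boundaries. The pair list and logic are hypothetical, inferred only from the company’s public statement, and are not taken from TikTok’s actual system:

```python
# Hypothetical illustration of the bug TikTok described: a filter that checks
# flagged word pairs as raw substrings, ignoring word order and word
# boundaries. The pair below is an assumed example, not TikTok's actual list.

FLAGGED_PAIRS = [
    ("black", "die"),  # assumed pair; "die" also matches inside longer words
]

def is_flagged(text: str) -> bool:
    """Return True if both terms of any flagged pair appear anywhere in the
    text as substrings, regardless of order or word boundaries."""
    lowered = text.lower()
    return any(a in lowered and b in lowered for a, b in FLAGGED_PAIRS)

print(is_flagged("content for my Black audience"))  # True: "die" sits inside "audience"
print(is_flagged("content for my white audience"))  # False: no "black" substring
```

A filter that matched whole words instead of raw substrings, and respected word order, would not trip on “audience” this way.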

But Tyler said he didn’t find TikTok’s explanation to Recode adequate, and that he felt the company should have caught the problem in its hate speech detection system sooner.

“Regardless of what the algorithm is and how it picked up, somebody had to program that algorithm,” Tyler told Recode. “And if [the problem] is the algorithm, and the marketplace has been available since [2020], why wasn’t this a conversation you had with your team, knowing there have been racial controversies?” he asked.

Tyler isn’t alone in his frustration. He’s one of many Black creators who have been protesting TikTok recently because they say they are unrecognized and underserved. Many of these Black TikTokers are participating in what they’re calling the “#BlackTikTok Strike,” in which they are refusing to create original dances to a hit song, in protest of Black artists on the app not being properly credited for the viral dances they first choreograph and that other creators then imitate.

These issues also connect to another criticism that’s been leveled at TikTok, Instagram, YouTube, and other social media platforms over the years: that their algorithms, which recommend and filter the posts everyone sees, often have inherent racial and gender biases.

In 2019, for example, a study showed that leading AI models for detecting hate speech are 1.5 times more likely to flag tweets written by African Americans as “offensive” than other tweets.

Findings like that have fostered an ongoing debate about the merits and potential harms that come with relying on algorithms — particularly developing AI models — to automatically detect and moderate social media posts.

Major social media companies like TikTok, Google, Facebook, and Twitter, though they acknowledge that these algorithmic models can be flawed, are still making them a key part of their rapidly expanding hate speech detection systems. They say they need a less labor-intensive way to keep up with the ever-expanding volume of content on the internet.

Tyler’s TikTok video also highlights the tensions surrounding these apps’ lack of transparency about how they police content. In June 2020, during Black Lives Matter protests across the US, some activists accused TikTok of censoring certain popular #BlackLivesMatter posts, which for a time the app showed as having zero views even when they had billions. TikTok denied this and said it was a technical glitch that also affected other hashtags. And in late 2019, according to Forbes, TikTok executives reportedly discussed tamping down political discussion on the app to avoid controversy.

A spokesperson for TikTok acknowledged larger frustrations about Black representation on the platform. The spokesperson said that earlier this month the company launched an official @BlackTikTok account to help foster the Black TikTok community, and that its teams are committed to developing recommendation systems that reflect inclusivity and diversity.

But for Tyler, the company has a lot more work to do. “This instance is just the tip of the iceberg and underneath the water level you have all of these issues,” said Tyler.


Source: vox.com
