How TikTok’s hate speech detection tool started a debate about racial prejudice in the app


“That’s why I’m pissed. We’re tired,” said popular Black influencer Ziggi Tyler in a recent viral video on TikTok. “Anything Black-related is inappropriate content,” he continued later in the video.

Tyler was expressing his frustration with TikTok over a discovery he made while editing his bio in the app’s Creator Marketplace, which connects popular account holders with brands that pay them to promote products or services. Tyler noticed that when he typed phrases about Black content into his Marketplace creator bio, such as “Black Lives Matter” or “Black success,” the app flagged his content as “inappropriate.” But when he typed in phrases like “white supremacy” or “white success,” he received no such warning.

For Tyler and several of his followers, the incident seemed to fit a larger pattern of how Black content is moderated on social media. They said it was proof of what they believe is the app’s racial bias against Black people, and some urged their followers to leave the app, while others tagged TikTok’s corporate account and demanded answers. Tyler’s original video about the incident received more than 1.2 million views and more than 25,000 comments; his follow-up video received nearly another 1 million views.

“I’m not going to sit here and let that happen,” Tyler, a 23-year-old recent college graduate from Chicago, told Recode. “Especially on a platform that makes all these posts saying things like, ‘We support you, it’s Black History Month in February.’”

A TikTok spokesperson told Recode that the problem was a bug in its hate speech detection systems, which the company is actively working to resolve, and that it is not indicative of racial bias. According to the spokesperson, TikTok’s policies do not restrict posting about Black Lives Matter.

In this case, TikTok told Recode that the app erroneously flagged phrases like “Black Lives Matter” because its hate speech detector is triggered by certain combinations of words, in this case involving the words “Black” and “audience,” because “audience” contains the word “die” in it.

“Our TikTok Creator Marketplace protections, which flag phrases typically associated with hate speech, were erroneously set to flag phrases without respect to word order,” a company spokesperson said in a statement. “We recognize and apologize for how frustrating this experience is, and our team is working quickly to fix this significant error. To be clear, Black Lives Matter does not violate our policies and currently has over 27B views on our platform.” TikTok says it reached out to Tyler directly, and that he did not respond.
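The two failure modes described above, matching a pair of trigger words without respect to their order and matching them as substrings inside longer words, can be illustrated with a minimal sketch. The flagged pair and both functions here are hypothetical, for illustration only, and are not TikTok’s actual rules or code:

```python
import re

# Hypothetical flagged word pair for illustration only -- not TikTok's
# actual rule set.
FLAGGED_PAIRS = [("black", "die")]

def naive_flag(text: str) -> bool:
    """Flag if both terms of any pair occur anywhere in the text,
    in any order, even as substrings of longer words."""
    lowered = text.lower()
    return any(a in lowered and b in lowered for a, b in FLAGGED_PAIRS)

def word_boundary_flag(text: str) -> bool:
    """Match whole words only, so a term hiding inside a longer word
    (e.g. "die" inside "audience") no longer triggers the flag."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return any(a in words and b in words for a, b in FLAGGED_PAIRS)

print(naive_flag("Black audience"))          # True: a false positive
print(word_boundary_flag("Black audience"))  # False
```

Even the whole-word version is still crude: a production moderation system would also need to consider word order, negation, and context, which is exactly why bag-of-keywords matching produces errors like the one Tyler encountered.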

But Tyler said he didn’t find TikTok’s explanation to Recode adequate, and that he thought the company should have caught the problem in its hate speech detection system in the first place.

“Regardless of what the algorithm is and how it picked this up, somebody had to program that algorithm,” Tyler told Recode. “And if [the problem] is the algorithm, and the marketplace has been available since [2020], why wasn’t this a conversation you had with your team, knowing there have been racial controversies?” he asked.


Tyler isn’t alone in his frustration. He’s just one of many Black creators who have been protesting TikTok recently because, they say, they are underrecognized and undervalued. Many of these Black TikTokers are participating in what they’re calling the “#BlackTikTok Strike,” in which they refuse to invent original dances for a hit song, because they are furious that Black artists on the app have not been properly credited for the viral dances they first choreographed and that other creators imitate.

These complaints also connect to a criticism that has been leveled at TikTok, Instagram, YouTube, and other social media platforms over the years: that their algorithms, which recommend and filter the posts everyone sees, often carry inherent racial and gender biases.

In 2019, for example, a study found that leading AI models for detecting hate speech were 1.5 times more likely to flag tweets written by African Americans as “offensive” than other tweets.

Findings like these have fueled an ongoing debate about the merits and potential harms of relying on algorithms, particularly still-developing AI models, to automatically detect and moderate posts on social media.

Major social media companies such as TikTok, Google, Facebook, and Twitter acknowledge that these algorithmic models can be flawed, but they still make them a key part of their rapidly expanding hate speech detection systems. They say they need a less labor-intensive way to keep pace with the ever-increasing volume of content on the internet.

Tyler’s TikTok video also highlights tensions around these apps’ lack of transparency about how they moderate content. In June 2020, during the Black Lives Matter protests across the US, some activists accused TikTok of censoring certain popular #BlackLivesMatter posts, which for a time the app appeared to show as having zero views even when they had billions. TikTok denied this, saying it was a technical glitch that also affected other hashtags. And in late 2019, TikTok executives reportedly discussed limiting political discussion on the app to avoid political controversy, according to Forbes.

A TikTok spokesperson acknowledged broader frustrations over Black representation on TikTok and said that earlier this month the company launched an official @BlackTikTok account to help promote the Black TikTok community on the platform, and that, more generally, its teams are committed to developing recommendation systems that reflect inclusivity and diversity.

But for Tyler, the company still has a lot more work to do. “This example is just the tip of the iceberg, and below the water level you have all these problems,” Tyler said.
