YouTube blocks chess channel after mistaking ‘black v white’ discussion for racism

YouTube’s AI moderation system committed an embarrassing gaffe when it blocked the world’s most popular chess channel on the platform, apparently mistaking a discussion about black and white chess pieces for racist content.

As reported by The Independent (UK), Croatian chess player Antonio Radic was dismayed to find his YouTube channel blocked during a show with Grandmaster Hikaru Nakamura. With over one million subscribers, Radic was understandably concerned. And confused. What on earth could have compelled YouTube to blacklist his channel?

He received no explanation from the video platform.

Radic’s channel was restored 24 hours later. He suspects that the account may have been blocked because he referred to the chess game as “Black against White”.

YouTube relies on both human reviewers and AI algorithms to moderate content, which means the AI system can make mistakes when it has not been trained to interpret context correctly.

“If they rely on artificial intelligence to detect racist language, this kind of accident can happen,” said Ashiqur KhudaBukhsh, a project scientist at Carnegie Mellon University’s Language Technologies Institute.

KhudaBukhsh tested this theory by using a state-of-the-art speech classifier to screen 680,000 comments gathered from five popular chess-focused YouTube channels.


After manually reviewing a sample of 1,000 comments that the classifier had flagged as hate speech, he found that 82 per cent of them had been wrongly categorized, apparently because the comments used words like “black”, “white”, “attack” and “threat”.
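To see how easily a context-blind filter can trip over ordinary chess vocabulary, consider the deliberately naive keyword-based sketch below. This is a hypothetical illustration only; it is not the classifier KhudaBukhsh used, nor YouTube’s actual moderation system.

```python
# Hypothetical, deliberately naive keyword-based "hate speech" filter.
# Illustration of context-blind moderation only; NOT the classifier
# from the CMU study or YouTube's real system.

FLAGGED_TERMS = {"black", "white", "attack", "threat"}

def naive_flag(comment: str) -> bool:
    """Flag a comment if it contains any watchlist term, ignoring context."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return not FLAGGED_TERMS.isdisjoint(words)

chess_comments = [
    "White's attack on the kingside is winning",
    "Black defends the threat with Nf6",
    "Great game, thanks for the analysis!",
]

for comment in chess_comments:
    print(f"{naive_flag(comment)!s:>5}  {comment}")

# The first two perfectly innocent chess comments are flagged, and only
# the third passes - the same failure mode, writ small, behind the
# 82 per cent false-positive rate described above.
```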

YouTube, Facebook, and Twitter warned last year that videos and other content might be erroneously removed for policy violations, as the companies relied more heavily on automated takedown software during the coronavirus pandemic.

In a blog post, Google said that to reduce the need for people to come into offices, YouTube and other business divisions were temporarily relying more on artificial intelligence and automated tools to find problematic content.

Well, clearly, YouTube’s AI algorithms still have quite a long way to go.
