YouTube’s harassment ban isn't enough

The Google-owned video streaming giant might be clamping down on insults based on race, gender and sexual orientation, but it won’t change a thing.

by Alim Kheraj
16 December 2019, 2:02pm

Photo Getty Images

On 11 December 2019, YouTube announced an update to its harassment policy. The Google-owned video streaming platform said in a blog post that it would “no longer allow content that maliciously insults someone based on protected attributes such as their race, gender expression, or sexual orientation”. This extends beyond specific threats or abuse to include “veiled or implied threats” and harassment that occurs over multiple videos, even if individual clips don’t violate the company’s hate speech policy or terms of use.

The decision comes after a year of controversy for YouTube. Earlier in the year, after being accused of being too lenient on alt-right content, it announced that it had banned a number of accounts spreading supremacist and fundamentalist views, as well as a number of conspiracy videos.

The company also faced backlash after journalist and content creator Carlos Maza accused it of inaction over targeted harassment he had experienced from right-wing “comedian” Steven Crowder. In May this year, Maza laid out YouTube’s failings in a viral Twitter thread in which he shared clips of Crowder attacking his sexuality and ethnicity. He detailed how he had been doxxed by Crowder’s fans and how Crowder was selling homophobic merchandise targeting him, and shared his frustration that, despite multiple complaints, the platform had chosen to do nothing.

In an interview with BuzzFeed’s AM/DM morning show, Maza accused YouTube of only caring about engagement. “Steven Crowder is not the problem. Alex Jones isn’t the problem. These individual actors are not the problem,” he said. “They are symptoms and the product of YouTube’s design, which is meant to reward the most inflammatory, bigoted, and engaging performers.”

Despite the coverage that Maza's thread garnered, as well as his clear receipts, YouTube said that Crowder’s videos, while “hurtful”, hadn’t violated its harassment and bullying policies.

Nevertheless, YouTube did quietly begin to make some changes. While the platform said it would not remove any of Crowder’s videos or ban his channel, it did revoke his ability to make advertising revenue from his uploads, saying that “a pattern of egregious actions has harmed the broader community and is against our YouTube Partner Program policies”. Crowder, it added, could re-monetise his content if he addressed the issues raised, including removing the links to offensive merchandise.

While all this was going on, YouTube was also coming under fire from LGBTQ creators, who accused the platform of unnecessarily censoring queer content. A number of vloggers decided to sue YouTube, which they said had broken its own policies, alleging that they had lost views and earnings when their videos were flagged and censored. Similarly, another group of creators tested the company’s algorithm and found that videos with LGBTQ titles or keywords were often automatically flagged as inappropriate and therefore demonetised.

Upon hearing about YouTube’s most recent policy changes, Maza, who ended up leaving the company he worked for after the abuse he suffered, said on Twitter that he was “sceptical” about the changes. “‘Malicious insults’ were already prohibited under YouTube's anti-hate and anti-harassment policies,” he wrote, adding, “YouTube rolls out policies like this to distract reporters from the real story: YouTube's non-enforcement.”

Maza reiterated that because right-wing YouTubers like Crowder and PewDiePie bring engagement and viewers to the site, there is no real incentive for the platform to change. Likewise, demonetising certain creators for offensive material isn’t necessarily a deterrent, as creators can martyr themselves while bringing in revenue from merch sales and direct donations via platforms like Patreon.

Ultimately, what YouTube cares about is how many users are visiting the site. Indeed, in a Washington Post article that Maza shared, it was reported that moderators were often told to hold back from demonetising videos that broke the platform’s terms of use if the creator had a high profile and brought in a lot of views. Moderators also complained of “constantly shifting policies and a widespread perception of arbitrary standards when it came to offensive content”.

It’s hard to see these latest policy amendments as anything but smoke and mirrors. As BuzzFeed News argued, it’s unlikely that videos of President Trump using racial slurs or offensive targeted language towards minorities are going to be banned.

What's needed is a systematic overhaul of the platform’s algorithm, as well as proof that YouTube is willing to cut some of its biggest users for violating its policies, while ensuring that minorities and marginalised groups aren’t persecuted by both the platform and its users. But it goes deeper, too. YouTube, it seems, has yet to realise that the problem isn’t just targeted harassment but also the wider dissemination of racist, anti-LGBTQ and sexist content. As long as it continues to support, justify or ignore that content’s existence, it is not doing enough.