YouTube is taking another step to curb hateful and violent speech on its site.
The video streaming company said it will now take down videos that lob insults at people based on race, gender expression, sexual orientation or other "protected attributes." The Google-owned company will also prohibit veiled threats of violence, taking a step further into moderating what people can say on the videos they create and upload.
YouTube has drawn criticism from politicians, viewers and video creators for the material it allows on, and bans from, the site. Like other platforms that let people upload their own material, such as Facebook and Twitter, it has been accused of fostering hate speech and extremism and of creating spaces where harassment can linger online.
YouTube has been reviewing its policies and guidelines for about two years, Matt Halprin, the company's vice president of trust and safety, said in an interview. He said the company tries to strike a balance between allowing freedom of expression and keeping hateful speech to a minimum.
YouTube has long prohibited outright threats of violence. In June, it updated its hate speech policies to ban videos espousing white supremacist and neo-Nazi viewpoints.
But the company also received significant pushback that month after it allowed a video from conservative commentator Steven Crowder to remain on the site. In the video, Crowder used homophobic slurs aimed at Vox reporter Carlos Maza.
Maza publicly criticized YouTube for its decision, but the company said the video didn't violate its anti-harassment policies.
That changes today: YouTube confirmed that Crowder's videos about Maza now violate its new policies and will be removed.
YouTube also announced it would take action against channels that have been found to repeatedly harass people in videos. In many cases, it will "demonetize" the channel, YouTube said, by turning off any ad revenue those videos would normally generate for their creators.
YouTube has already demonetized Crowder's channel.
The new anti-harassment and violence policies also apply to public officials, though videos will remain on the site if they are part of news stories, documentaries or other educational material.
YouTube is also rolling out a comment review tool for video creators that will, by default, hold back comments the company's algorithms flag as potentially inappropriate until the creator has reviewed them. Creators can turn off that setting if they want.