YouTube is adding more human moderators and expanding its use of machine learning in an attempt to curb its child exploitation problem, the company’s CEO Susan Wojcicki said in a blog post on Monday evening.

The company plans to increase the number of content moderators and other employees addressing rule-violating content to more than 10,000 in 2018, in order to help screen videos and train the platform’s machine learning algorithms to spot and remove problematic children’s content. Sources familiar with YouTube’s workforce numbers say this represents a 25% increase over current staffing.

In the last two weeks, YouTube has removed hundreds of thousands of videos featuring children in disturbing and possibly exploitative situations, including being duct-taped to walls, mock-abducted, and even forced into washing machines. The company said it will employ the same approach it used this summer as it worked to eradicate violent extremist content from the platform.

Though it’s unclear whether machine learning can adequately catch and limit disturbing children’s content, much of which is creepy in ways that may be difficult for a moderation algorithm to discern, Wojcicki touted the effectiveness of pairing the company’s machine learning with human moderators in its fight against violent extremism.

According to YouTube, machine learning has helped it remove more than 150,000 videos for violent extremism since June, an effort the company says “would have taken 180,000 people working 40 hours a week.” It also claimed its algorithms were getting increasingly better…
