YouTube to counter extremist content with 10,000 staff

"We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether", she said.

The human reviewers will be responsible for removing videos that violate the site's terms and conditions, and the team will also help train computers to identify troublesome videos, YouTube CEO Susan Wojcicki confirmed in a blog post.

In November, confectionery maker Mondelez (NASDAQ:MDLZ), Lidl, Mars and other consumer goods producers joined a boycott after The Times newspaper found YouTube was showing clips of scantily clad children alongside ads from major brands. By training its algorithms on other types of videos, such as the questionable uploads that targeted children, the platform will be able to take them down much faster than it currently can. Some 250 advertisers earlier this year also said they would boycott YouTube because of extremist videos that promoted hate and violence.

YouTube plans to hire more people to scour its website for violent videos after reports about disturbing footage on its app for children surfaced last month, prompting a slew of advertisers to abandon the site.

In the highly contentious realm of advertising, Wojcicki says the company will apply stricter criteria, conduct more manual curation, and expand its ad-reviewer team to ensure that campaigns don't appear next to offending videos. "Equally, we want to give creators confidence that their revenue won't be hurt by the actions of bad actors," she wrote. In the next year, YouTube plans to have more than 10,000 reviewers on staff.

YouTube says its machine-learning algorithms help take down 70 percent of violent extremist content within eight hours of upload. Since June, moderators have manually reviewed about 2 million videos for violent extremist content, along with training machine-learning systems to identify similar objectionable content.

She said that adding more people to identify inappropriate content will provide more training data for, and potentially improve, its machine-learning software.

"We will be talking to creators over the next few weeks to hone this new approach", the YouTube CEO said.