YouTube has announced that it will block users from commenting on most videos that feature minors, responding to reports that pedophiles had used comments to find, track and exploit children.

Under YouTube’s new policy, users will no longer be able to comment on videos that prominently feature kids younger than 13. YouTube said it also intends to disable comments on videos featuring children between 13 and 18 if the content risks attracting predatory behavior.

YouTube, which is part of Google, said it would identify minors in videos using software. The new rules will take several months to implement, it added.

“We recognize that comments are a core part of the YouTube experience and how you connect with and grow your audience,” the company said in a blog post. “At the same time, the important steps we’re sharing today are critical for keeping young people safe.”

YouTube’s move comes two weeks after a video blogger documented how the site had enabled what he called a “soft-core pedophile ring.” In many cases, apparent pedophiles took advantage of YouTube’s comments system, posting time stamps so that others could skip ahead to moments when kids were in compromising positions. Users who viewed videos of minors were also served additional videos featuring children through YouTube’s recommendation engine.

Even in pledging to take new steps to limit those comments, however, YouTube faced fresh criticism from members of Congress and others who believe YouTube hasn’t done enough to deal with the array of troubling, abusive content that long has flourished on the site.


“This is a good step, but YouTube still needs to be much more proactive about going after inappropriate videos and predators on its platform,” Sen. Ron Wyden, D-Ore., said in a statement.

YouTube has long struggled to monitor and remove problematic content from its huge platform, on which users upload 400 hours of content every minute. In recent years, it has faced controversies over militant extremist content, hateful conspiratorial videos and violent, sexually suggestive clips that were reaching children.

A coalition of consumer and privacy groups filed a complaint last year with the Federal Trade Commission alleging that YouTube also is violating the nation’s child privacy law by collecting data on children younger than 13.

YouTube’s parent company, Google, is a global leader in artificial intelligence and machine learning, but YouTube has had limited success in using these technologies to keep troubling content off its platform.

Experts in these technologies say they work well with some readily identifiable images, such as those of nude people or known terrorist symbols. But the technologies struggle with subtle or ambiguous situations, where the human capacity for understanding the underlying context of images and words is crucial.

YouTube initially responded to the reports of the pedophile ring last week by removing tens of millions of comments on videos involving minors, along with more than 400 channels.

The revelations quickly triggered a sharp backlash among major brands that advertise on YouTube, including Nestlé and Disney, which suspended their ad spending on the site.


On Thursday, YouTube said it had accelerated its work on new software that could spot and remove predatory comments more effectively, adding that it had terminated additional channels that put children at risk. The company said it would grant an exception to its new comment ban for a “small number of channels that actively moderate their comments and take additional steps to protect children.”

Since the video blogger documented how pedophiles shared time stamps of sexually suggestive moments, a parent in Florida found that a clip explaining how to commit suicide had been spliced into children’s videos on YouTube and YouTube Kids, an app specifically designed for children.

In YouTube’s Community Guidelines Enforcement Report in December, the company reported removing 7.8 million videos and 1.6 million channels in the third quarter of 2018, with 13 percent of the channel removals being for inappropriate adult content or for endangering child safety in some way. 

YouTube also said that 81 percent of videos that ended up being removed were first detected by automated systems, and that Google overall planned to have 10,000 people working on content moderation by the end of 2018.

Those investments haven’t stopped the run of controversies for the company, which some critics contend has simply become too large to manage safely and effectively — especially when it comes to protecting children.

Deleting comments on some videos is a striking retreat for YouTube, which alone among Google properties has succeeded in building what amounts to a social media platform through its comments, bringing together like-minded users on a mass scale — something Google+ and other experiments failed to do. YouTube also managed to generate extra revenue with “Super Chat,” comments that users pay to display prominently alongside live-streamed videos.

But comments have proven to be at least as troublesome to YouTube as its videos. In the December report, YouTube said it had removed 224 million comments in just three months last year.


-Washington Post