YouTube will temporarily increase automated content moderation

Christine Fisher
Contributing Writer

YouTube will rely more on machine learning and less on human reviewers during the coronavirus outbreak. Normally, algorithms detect potentially harmful content and send it to human reviewers for assessment. But these are not normal times, and in an effort to reduce the need for employees and contractors to come into an office, YouTube will allow its automated system to remove some content without human review.

YouTube acknowledges that this could lead to increased video removals, including of some videos that do not violate its policies. YouTube already has a touchy relationship with creators, so removing valid videos might not go over well. YouTube won't issue strikes on the removed content unless it is clearly in violation, and creators can appeal takedowns.

YouTube warns, though, that "our workforce precautions will also result in delayed appeal reviews." This may become more common in the weeks ahead. Earlier today, Google told developers that Play Store app reviews may also be delayed.

YouTube says it will also be more cautious about what content gets promoted, including livestreams. Because the situation is changing so rapidly, the policies will likely change, too. The platform initially demonetized all videos that mentioned the coronavirus, but it later agreed to enable ads on a limited number of channels that discuss the outbreak. Meanwhile, the UK government has enlisted influencers to help battle coronavirus misinformation.

"We recognize this may be a disruption for users and creators, but know this is the right thing to do for the people who work to keep YouTube safe and for the broader community," YouTube wrote on its Creator Blog. "We appreciate everyone's patience as we take these steps during this challenging time."