YouTube is warning creators of coronavirus-related moderation changes that could impact their livelihood.
On Monday, the company announced changes to policy enforcement stemming from its response to the COVID-19 pandemic.
According to YouTube, as it prioritizes employee health, the platform will, in the short term, rely less on human moderators to review content uploaded to the site. In their place, the company will lean more heavily on its automated review systems.
“Our Community Guidelines enforcement today is based on a combination of people and technology: Machine learning helps detect potentially harmful content and then sends it to human reviewers for assessment,” reads the statement. “As a result of the new measures we’re taking, we will temporarily start relying more on technology to help with some of the work normally done by reviewers. This means automated systems will start removing some content without human review, so we can continue to act quickly to remove violative content and protect our ecosystem, while we have workplace protections in place.”