YouTube reports progress in combating inappropriate content on its platform
YouTube says it is better than ever at enforcing its own content moderation guidelines: users of the platform are less and less likely to encounter inappropriate content, including scenes of violence, fraud, or incitement to hatred.
In the final months of 2020, for every 10,000 views on YouTube, up to 18 were of videos that violated company policy and were later removed. That is far fewer than the roughly 72 per 10,000 views recorded in Q4 2017, when YouTube started tracking this statistic. It is worth noting that while these numbers show how effectively YouTube deals with inappropriate content, the company itself decides which videos count as inappropriate. For example, the platform is quite comfortable with videos containing racist statements as long as they do not include a direct call to violence.
The share of views that land on inappropriate videos before they are deleted is called the Violative View Rate (VVR) in YouTube's quarterly Community Guidelines Enforcement Report. The company has published a chart showing how that figure has fallen since it began tracking it in late 2017. The sharp decline in 2018 came as YouTube started relying on machine learning to find inappropriate videos. The company says it aims to bring the figure as close to zero as possible, and the YouTube team claims that keeping users safe on the platform is its top priority.
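To make the metric concrete, the short sketch below converts the per-10,000 figures quoted above into a rate; the function name and counts are illustrative only and are not taken from YouTube's report.

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Views of rule-breaking videos, expressed per 10,000 total views."""
    return violative_views / total_views * 10_000

# Illustrative figures echoing the rates quoted in the article:
# roughly 18 violative views per 10,000 in late 2020 vs. 72 in Q4 2017.
print(violative_view_rate(18, 10_000))   # 18.0 per 10,000, i.e. about 0.18% of all views
print(violative_view_rate(72, 10_000))   # 72.0 per 10,000, i.e. about 0.72% of all views
```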