Approximately 500 hours of video are uploaded to YouTube every minute of every day. That makes moderating uploaded content a challenge—although one that it is in YouTube's interest to meet.
To that end, YouTube has begun sharing information about a metric it calls the Violative View Rate. This is a measure of the percentage of video views on YouTube that come from videos that violate YouTube's policies.
The Violative View Rate
Information about the Violative View Rate will be shared quarterly in YouTube's Community Guidelines Enforcement Report.
YouTube reportedly created its Violative View Rate measure in 2017, although it will now be more transparent about sharing the data. While it's going to be tough to totally clean up YouTube overnight, the hope is that sharing the Violative View Rate in this manner will show a steady decrease in these video views over time.
In a blog post, Jennifer O’Connor, YouTube’s director of Trust and Safety, observes that:
“The most recent [Violative View Rate] is at 0.16-0.18% which means that out of every 10,000 views on YouTube, 16-18 come from violative content. This is down by over 70% when compared to the same quarter of 2017, in large part thanks to our investments in machine learning.”
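The arithmetic behind that quote is straightforward to sketch. The function below is a hypothetical illustration of how a violative view rate works out as a percentage—YouTube's actual methodology involves sampling videos and sending them for review, which this sketch omits entirely:

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Return the percentage of views that came from violative videos.

    Hypothetical illustration only: YouTube's real VVR is estimated by
    sampling videos and having reviewers assess them, not by counting
    every view directly.
    """
    if total_views <= 0:
        raise ValueError("total_views must be positive")
    return 100.0 * violative_views / total_views

# 16 violative views out of every 10,000 views -> a VVR of 0.16%
print(violative_view_rate(16, 10_000))  # 0.16
```

At the quoted rate of 0.16-0.18%, roughly 16 to 18 of every 10,000 views land on content that violates the platform's rules.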
The report notes that YouTube has, to date, removed more than 83 million videos and 7 billion comments that violated its Community Guidelines. O’Connor observes that, using its AI-aided algorithms, the popular video platform is now able to detect 94% of rule-violating content through automatic flagging. Three-quarters of this content is removed before it racks up even 10 views.
The Violative View Rate isn’t the only metric YouTube uses to assess its success at removing violating content. It also tracks turnaround time—how quickly content is taken down. But, as O’Connor observes, this is not a perfect metric. She writes:
“For example, compare a violative video that got 100 views but stayed on our platform for more than 24 hours with content that reached thousands of views in the first few hours before removal. Which ultimately has more impact? We believe VVR is the best way for us to understand how harmful content impacts viewers and to identify where we need to make improvements.”
Making YouTube Work for Everyone
There’s still plenty of work to be done to ensure that platforms like YouTube are inclusive, non-harmful places for as many users as possible. The company continues to tweak its rules regarding what is, and isn’t, allowed.
Nonetheless, work like this shows that YouTube is doing its best to move things in the right direction.