With billions of daily views, YouTube is a platform people visit to stay informed, catch up on the latest news, and learn more about the topics they care about.
However, the video-streaming platform is often viewed as a key vector for spreading misinformation and potentially harmful content.
While YouTube has been addressing misinformation and harmful content on its platform under its 4Rs of Responsibility principles, the company on Thursday announced in a blog post a new three-pronged approach to tackling key misinformation challenges.
The first approach is to catch new misinformation before it goes viral. Long-standing conspiracy theories, such as 9/11 trutherism, moon-landing denial, and flat-earth claims, have been topics of conversation among users for years, but a completely new narrative can appear and gain traction quickly, before trusted sources have a chance to debunk it.
To address this, the company plans to deploy improved machine-learning classifiers to catch and flag misinformation before it spreads. It also plans to attach fact-check boxes to videos on niche topics that established media outlets may not cover.
“We’re looking to leverage an even more targeted mix of classifiers, keywords in additional languages, and information from regional analysts to identify narratives our main classifier doesn’t catch,” Neal Mohan, YouTube’s Chief Product Officer, wrote in a blog post.
Next, there is limiting the cross-platform sharing of misinformation. The biggest challenge here is borderline content, videos that brush up against YouTube’s policies without quite violating them, which can continue to spread once shared on other platforms and websites.
One possible way to address this, according to Mohan, is to disable the share button or break the link on such videos, meaning you could not embed or link to a borderline video on another site.
“We need to be careful to balance limiting the spread of potentially harmful misinformation, while allowing space for discussion of and education about sensitive and controversial topics,” Mohan wrote.
Another possible solution suggested by Mohan is to surface an “interstitial,” a speed bump of sorts, that makes viewers pause before they watch or share a borderline video that has been embedded or linked elsewhere.
YouTube is already using interstitials for age-restricted content and violent or graphic videos and considers it an important tool for giving viewers a choice in what they are about to watch.
Lastly, the company is planning to ramp up its misinformation efforts around the world by addressing misinformation in languages other than English. To that end, it is exploring further investments in partnerships with experts and non-governmental organizations around the world, and “working on ways to update models more often in order to catch hyperlocal misinformation, with capability to support local languages.”