Researchers say that YouTube’s efforts to crack down on videos purveying conspiracy theories haven’t been entirely in vain.
According to a new study from researchers at the University of California, Berkeley, the Google-owned platform has succeeded in reducing conspiracy recommendations.
The study analyzed more than 8 million video recommendations over the past 15 months, using an algorithm that rated each video – based on its title, description, and comments – on a scale of 0 to 1 according to the likelihood that it peddled conspiracies.
To help account for errors, the researchers counted only videos that scored 0.5 or higher on that scale.
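The filtering step described above amounts to a simple threshold over classifier scores. The sketch below illustrates the idea; the function name and sample scores are hypothetical stand-ins, not the researchers’ actual model or data.

```python
# A minimal sketch of the study's filtering step: a classifier assigns each
# video a conspiracy-likelihood score between 0 and 1, and only videos
# scoring 0.5 or higher are counted. Names and scores here are illustrative.

def is_counted_as_conspiracy(score: float, threshold: float = 0.5) -> bool:
    """Return True if a video's score clears the study's 0.5 cutoff."""
    return score >= threshold

# Hypothetical scores for four videos, for illustration only.
scores = [0.12, 0.55, 0.93, 0.49]
flagged = [s for s in scores if is_counted_as_conspiracy(s)]
# flagged contains only the scores at or above the cutoff: [0.55, 0.93]
```

Raising the threshold trades recall for precision, which is presumably why the researchers chose a conservative cutoff to limit false positives.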
Researchers found YouTube’s claim that it had lowered conspiracy video recommendations by 50 percent as of June last year to be ‘mostly consistent’ with their own analysis, though that number is constantly changing.
The study says conspiracy videos have crept back up since their low point in June and are now about 40 percent less common than they were before the crackdown.
In 2018, recommendations of conspiracy videos peaked, with about 10 percent of all suggested content being conspiratorial in nature.
Researchers say that the platform’s efforts have varied depending on category.
For instance, they say it has all but wiped out some categories of conspiracy videos, including flat-earth theories and claims that the US government orchestrated the September 11 terror attacks in New York City.
Other varieties of video have been allowed to remain on the platform, however, including those alleging that the Great Pyramids were built by aliens and others denying human-driven climate change.
According to the researchers, this disparity may reflect YouTube’s priorities, as the company works out which content qualifies as a threat to its users and the public and which should remain.
Though the efforts mark significant progress, researchers are quick to note that YouTube’s continued work doesn’t necessarily signify a victory over conspiracies on the platform.
In fact, they say that the threat of radicalization on YouTube remains as present as ever.
‘The overall reduction of conspiratorial recommendations is an encouraging trend. Nonetheless, this reduction does not make the problem of radicalization on YouTube obsolete nor fictional, as some have claimed,’ they write.
Though the study was among the most comprehensive analyses of conspiracy videos on YouTube, researchers were not able to draw a linkage between the decline and likelihood that one might be radicalized by watching the content.
As noted by the New York Times, researchers were only able to assess videos recommended on YouTube without being logged in, meaning that personalized recommendations – which the majority of users actually see – remain an unknown.
Until a study formulates a way to account for an individual user’s personalized recommendations, researchers say it will be impossible to judge the actual impact that conspiracy videos have on radicalization.