A new study finds that YouTube's effort to curb recommendations of conspiracy theories in its regular video feed is working.
Background information: Following criticism that it promoted conspiracy theories (miracle cures, flat-Earth claims, etc.), YouTube announced in January 2019 that it would take stricter action against such "borderline content."
Where We Are Now: Researchers from the University of California, Berkeley, and the Mozilla Foundation built a system to classify whether a video is "conspiratorial," then emulated YouTube's Watch-Next algorithm to track a year's worth of the recommendations it would actively promote. Marc Faddoul, Guillaume Chaslot, and Hany Farid found that the number of conspiracy theory videos being actively recommended has in fact declined.
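The paper's actual pipeline isn't reproduced here, but the core idea, scoring each Watch-Next recommendation with a classifier and tracking the flagged fraction over time, can be sketched. In the minimal sketch below, the record shape, the 0.5 threshold, and the sample scores are all illustrative assumptions, not the study's real data or code:

```python
from datetime import date
from typing import Iterable

# Hypothetical record of one Watch-Next recommendation: the day it was
# collected and the classifier's conspiracy score in [0, 1].
Recommendation = tuple[date, float]

def daily_conspiracy_rate(
    recs: Iterable[Recommendation],
    threshold: float = 0.5,  # assumed cutoff; the study's may differ
) -> dict[date, float]:
    """Fraction of each day's recommendations scored above the threshold."""
    totals: dict[date, int] = {}
    flagged: dict[date, int] = {}
    for day, score in recs:
        totals[day] = totals.get(day, 0) + 1
        if score >= threshold:
            flagged[day] = flagged.get(day, 0) + 1
    return {day: flagged.get(day, 0) / n for day, n in totals.items()}

# Made-up scores: a falling rate over time would mirror the study's finding.
sample = [
    (date(2019, 1, 1), 0.9), (date(2019, 1, 1), 0.2),
    (date(2019, 6, 1), 0.4), (date(2019, 6, 1), 0.1),
]
print(daily_conspiracy_rate(sample))
# {datetime.date(2019, 1, 1): 0.5, datetime.date(2019, 6, 1): 0.0}
```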
This isn't solved: While the researchers are cautiously optimistic, they note that radicalization through such videos remains a broader problem. "Those with a history of watching conspiracy content may still experience YouTube as a filter bubble," they wrote, "amplified by personalized recommendations and channel subscriptions."