Can Wikipedia-like Citations on YouTube Curb Misinformation?
The team also found that in one case a participant misinterpreted a YouTube information panel as an endorsement of the video by the Centers for Disease Control and Prevention. In fact, these panels are simply links to supplemental information that the site attaches to videos on “topics prone to misinformation.”
“The trouble is that a lot of YouTube videos, especially more educational ones, don’t offer a great way for people to prove they’re presenting good information,” said Emelia Hughes, a doctoral student at the University of Notre Dame who completed this research as a UW undergraduate student in the Information School. “I’ve stumbled across a couple of YouTubers who were coming up with their own ways to cite sources within videos. There’s also not a great way to fight bad information. People can report a whole video, but that’s a pretty extreme measure when someone makes one or two mistakes.”
The researchers designed Viblio so users can better understand a video’s content while avoiding misinterpretations like the one involving the information panel. To add a citation, users click a button on the extension. They can then add a link, select the timespan their citation references and add optional comments. They can also select the type of citation, which marks it with a colored dot in the timeline: “refutes the video clip’s claim” (red), “supports the video clip’s claim” (green) or “provides further explanation” (blue).
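To make that concrete, the information attached to each citation could be modeled roughly as follows. This is only an illustrative sketch, not code from the Viblio extension; all field and type names here are hypothetical.

```typescript
// Hypothetical sketch of a Viblio-style citation record; names are illustrative.
type CitationType = "refutes" | "supports" | "explains";

// Color shown as a dot on the video timeline for each citation type.
const TIMELINE_COLORS: Record<CitationType, string> = {
  refutes: "red",
  supports: "green",
  explains: "blue",
};

interface Citation {
  videoId: string;      // the video the citation is attached to
  sourceUrl: string;    // link to the supporting or refuting source
  startSeconds: number; // start of the timespan the citation references
  endSeconds: number;   // end of that timespan
  type: CitationType;   // refutes, supports, or provides further explanation
  comment?: string;     // optional note from the user who added it
}

// Example: a citation supporting a claim made between 1:05 and 1:30 of a video.
const example: Citation = {
  videoId: "abc123",
  sourceUrl: "https://www.cdc.gov/",
  startSeconds: 65,
  endSeconds: 90,
  type: "supports",
  comment: "Agency page covering the same statistic cited in the clip.",
};

console.log(`Show a ${TIMELINE_COLORS[example.type]} dot on the timeline.`);
```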
To test the system, the team had the study participants use Viblio for two weeks on a range of videos, including clips from Good Morning America, Fox News and ASAPScience. Participants could add citations as well as watch videos with other participants’ citations. For many, the added citations changed their opinion of certain videos’ credibility. But the participants also highlighted potential difficulties with deploying Viblio at a larger scale, such as the conflicts that arise in highly political videos or those on controversial topics that don’t fall into true-false binaries.
“What happens when people with different value systems add conflicting citations?” said co-author Tanu Mitra, a UW assistant professor in the Information School. “We of course have the issue of bad actors potentially adding misinformation and incorrect citations, but even when users are acting in good faith and simply hold conflicting opinions, whose citation should be prioritized? Or should we be showing both conflicting citations? These are big challenges at scale.”
The researchers highlight a few areas for further study, such as expanding Viblio to other video platforms like TikTok or Instagram; studying its usability at a greater scale to see whether users are motivated enough to continue adding citations; and exploring ways to create citations for videos that don’t get as much traffic and thus have fewer citations.
“Once we get past this initial question of how to add citations to videos, then the community vetting question remains very challenging,” Zhang said. “It can work. At X, Community Notes is working on ways to prevent people from ‘gaming’ voting by looking at whether someone always takes the same political side. And Wikipedia has standards for what should be considered a good citation. So it’s possible. It just takes resources.”
Stefan Milne is a public information officer at the University of Washington. This article was first posted to the University of Washington’s website.