• Social Media Makes It Difficult to Identify Real News

    There is a price to pay when you get your news and political information from the same place you find funny memes and cat pictures, new research suggests. The study found that people viewing a blend of news and entertainment on a social media site tended to pay less attention to the source of content they consumed – meaning they could easily mistake satire or fiction for real news.

  • Uncertainty about Facts Can Be Reported Without Damaging Public Trust in News: Study

    The numbers that drive headlines – those on Covid-19 infections, for example – contain significant levels of uncertainty: assumptions, limitations, extrapolations, and so on. Experts and journalists have long assumed that revealing the ‘noise’ inherent in data confuses audiences and undermines trust. A series of experiments – including one on the BBC News website – finds that using numerical ranges in news reports helps audiences grasp the uncertainty in statistics while maintaining trust in the data and its sources.

  • Combating the Coronavirus Infodemic: Is Social Media Doing Enough?

    The global coronavirus pandemic has also spawned an epidemic of online disinformation, ranging from false home remedies to state-sponsored influence campaigns. To stem the growing “infodemic,” social media platforms have moved quickly to quash disinformation on their platforms. Their response represents the strongest attempts to police disinformation to date, though actual results have been mixed.

  • Journalism Is an “Attack Surface” for Those Spreading Misinformation

    For all the benefits in the expansion of the media landscape, we’re still struggling with the spread of misinformation—and the damage is especially worrisome when it comes to information about science and health. “Believing things that aren’t true when it comes to health can be not just bad for us, but dangerous,” said one expert.

  • Faster Way to Replace Bad Data with Accurate Information

    Researchers have demonstrated a new model of how competing pieces of information spread in online social networks and the Internet of Things (IoT). The findings could be used to disseminate accurate information more quickly, displacing false information about anything from computer security to public health.
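
    The summary above does not spell out the model itself, so the following is only a minimal illustrative sketch of the general idea in Python, assuming a simple rule in which unreached users adopt whichever item reaches them first and accurate information can overwrite false information on contact; the network size, transmission probabilities, and seeding schedule are hypothetical, not the researchers’ parameters.

        # Minimal illustrative sketch (not the researchers' model): two competing items
        # of information -- one false, one accurate -- spread over a random network.
        # Unreached nodes adopt whichever item reaches them first; the accurate item is
        # seeded later but spreads more readily and can overwrite the false one on
        # contact. All parameters below are hypothetical.
        import random

        def simulate(n=2000, avg_degree=8, p_false=0.05, p_true=0.15,
                     correction_start=5, correction_seeds=10, steps=60, seed=42):
            rng = random.Random(seed)
            # Build a sparse Erdos-Renyi-style random graph as adjacency lists.
            p_edge = avg_degree / (n - 1)
            adj = [[] for _ in range(n)]
            for i in range(n):
                for j in range(i + 1, n):
                    if rng.random() < p_edge:
                        adj[i].append(j)
                        adj[j].append(i)

            state = ["unreached"] * n            # unreached / false / true
            state[rng.randrange(n)] = "false"    # the false item starts circulating first

            for t in range(steps):
                if t == correction_start:        # accurate information is seeded a few steps later
                    for node in rng.sample(range(n), correction_seeds):
                        state[node] = "true"
                new_state = state[:]
                for node in range(n):
                    if state[node] == "unreached":
                        for nb in adj[node]:
                            if state[nb] == "false" and rng.random() < p_false:
                                new_state[node] = "false"
                                break
                            if state[nb] == "true" and rng.random() < p_true:
                                new_state[node] = "true"
                                break
                    elif state[node] == "false":
                        # The accurate item can displace the false one on contact.
                        if any(state[nb] == "true" for nb in adj[node]) and rng.random() < p_true:
                            new_state[node] = "true"
                state = new_state

            return {s: state.count(s) for s in ("unreached", "false", "true")}

        if __name__ == "__main__":
            print(simulate())   # final counts of nodes holding each version

    In this toy setup the accurate item, despite starting later, typically ends up held by most nodes because it both spreads more readily and can overwrite the false one; quantifying how such advantages translate into faster displacement is the kind of question the actual model is built to answer.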

  • Why Does Russia Use Disinformation?

    There is much discussion about Russian disinformation in today’s popular discourse, but the conversation about why Russia uses disinformation usually does not get beyond general notions of Moscow wanting to “divide us” or “muddy the waters.” Kasey Stricklin writes that this is dangerous and incorrect thinking, because, in fact, “Russia has a number of strategic goals that it hopes to advance through its use of disinformation, including restoring Russia to great power status, preserving its sphere of influence, protecting the Putin regime and enhancing its military effectiveness.”

  • In Politics and Pandemics, Russian Trolls Use Fear, Anger to Drive Clicks

    Facebook users flipping through their feeds in the fall of 2016 faced a minefield of Russian-produced targeted advertisements pitting blacks against police, southern whites against immigrants, and gun owners against Obama supporters. The cheaply made ads were full of threatening, vulgar language, but according to a sweeping new analysis, they were remarkably effective, eliciting clickthrough rates as much as nine times higher than what is typical in digital advertising. The Kremlin-sponsored troll farms are still at it, already engaged in disinformation campaigns around COVID-19.

  • Experts: Russia Using Virus Crisis to Sow Discord in West

    Experts say that the Kremlin’s disinformation specialists are behind a disinformation campaign in Western media about the coronavirus, intended to fuel panic and discord among allies, deepen the crisis, exacerbate its consequences, and hamper the ability of Western democracies to respond to it effectively. The European Union has accused Moscow of pushing fake news online in English, Spanish, Italian, German and French, using “contradictory, confusing and malicious reports” to make it harder for the bloc to communicate its response to the COVID-19 pandemic.

  • Truth Decay in the Coronavirus Moment: Q&A with Jennifer Kavanagh

    The COVID-19 crisis “is the type of environment in which false and misleading information thrives and spreads quickly. People are vulnerable. People are afraid. People don’t know what to believe. Trust in basically every organization or position that we would turn to is pretty low. There’s higher trust in the medical community than in, say, media or government, but it’s still not all that high. The combination of low trust and high volume of information coming from people who are not experts—but purport to be experts—creates the perfect storm for the average person,” says Jennifer Kavanagh, author of Truth Decay.

  • The Catch to Putting Warning Labels on Fake News

    After the 2016 U.S. presidential election, Facebook began putting warning tags on news stories fact-checkers judged to be false. But there’s a catch: Tagging some stories as false makes readers more willing to believe other stories and share them with friends, even if those additional, untagged stories also turn out to be false.

  • Facebook, Twitter Remove Russia-Linked Fake Accounts Targeting Americans

    Social-media giants Facebook and Twitter say they have removed a number of Russia-linked fake accounts, operated out of Ghana and Nigeria, that targeted U.S. users. Facebook on 12 March said the accounts it removed were in the “early stages” of building an audience on behalf of individuals in Russia, posting on topics such as black history, celebrity gossip, and fashion.

  • Extremists Use Coronavirus to Advance Racist, Conspiratorial Agendas

    As the number of confirmed cases of coronavirus surges globally, extremists continue to use the virus to advance their bigotry and anti-Semitism, while also promoting conspiracy theories and even boogaloo (the white supremacist term for civil war). As usual, extremists are relying primarily on fringe social media platforms to disseminate their views, but as the virus spreads, it has gotten easier to find xenophobia, anti-Semitism and conspiracy theories on mainstream social media platforms.

  • Chinese and Russian State-Owned Media on the Coronavirus: United Against the West?

    Beginning in late January, when news emerged of a “novel coronavirus” spreading through China, Beijing’s propaganda apparatus shifted into overdrive. The epidemic has also been heavily covered in externally directed Russian state-backed media outlets, offering an opportunity to compare and contrast the approaches of both countries’ propaganda apparatuses.

  • Better Math to Help Stop Spread of False Rumors about COVID-19

    Think of all the false rumors about COVID-19 that went viral; it got so bad that the World Health Organization called it an “infodemic.” Whether it is a hoax or a viral conspiracy theory, information travels fast these days. Just how fast and far information moves depends on who shares it, and where, from discussions on social media to conversations with fellow commuters on your way to work. So, how can our interactions and their infrastructures affect the spread of rumors and information? That’s a question that researchers are beginning to answer with complex math models of social contagion, the concept that social behavior and ideas spread like a pathogen.
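
    The article describes these models only at a high level, so the following is a minimal Python sketch of the textbook “rumor spreads like a pathogen” analogy it alludes to: a simple SIR-style compartment model in which people move from unaware, to actively spreading, to no longer spreading. The spreading rate, stifling rate, and initial conditions are hypothetical, and this is not the researchers’ specific model.

        # Minimal sketch of the "rumor spreads like a pathogen" analogy: a simple
        # SIR-style compartment model, not the researchers' specific models.
        # s = share of people unaware of the rumor, i = share actively spreading it,
        # r = share who have stopped spreading it. beta (spreading rate) and gamma
        # (stifling rate) are hypothetical.
        def rumor_sir(beta=0.4, gamma=0.1, s0=0.999, i0=0.001, days=60, dt=0.1):
            s, i, r = s0, i0, 0.0
            record_every = max(1, int(round(1 / dt)))   # one snapshot per "day"
            snapshots = []
            for step in range(int(days / dt) + 1):
                if step % record_every == 0:
                    snapshots.append((int(round(step * dt)), s, i, r))
                new_spread = beta * s * i * dt     # unaware people hearing the rumor from spreaders
                new_stifled = gamma * i * dt       # spreaders losing interest and going quiet
                s -= new_spread
                i += new_spread - new_stifled
                r += new_stifled
            return snapshots

        if __name__ == "__main__":
            for day, s, i, r in rumor_sir():
                print(f"day {day:2d}  unaware={s:.3f}  spreading={i:.3f}  stopped={r:.3f}")

    Where this sketch treats everyone as a single well-mixed population, the models described in the article presumably layer in the “who shares it, and where” part, for example by running similar dynamics over different contact structures for social media and face-to-face settings.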

  • U.S. Accuses Russia of Spreading Fear, Panic on Coronavirus

    The United States is accusing Russia of opening up its entire disinformation playbook to prey on growing fears about the spread of the coronavirus. Moscow’s effort, underway for weeks, according to officials, includes the use of state-run media outlets, fake news websites and “swarms” of fake online personas to churn out fabricated information in at least five languages.