• Slow the Scroll: Users Less Vigilant About Misinformation on Mobile Phones

    Mobile phones pack a lot of information into pocket-sized devices, which is why users may want to slow down the next time they’re scrolling through social media or checking email on a mobile app. Habitual mobile phone users engage in less information processing and are more likely to fall for misinformation on their mobile phones than on personal computers, researchers find.

  • Truth Decay and National Security

    The line between fact and opinion in public discourse has been eroding, and with it the public’s ability to have arguments and find common ground based in fact. Two core drivers of Truth Decay are political polarization and the spread of misinformation—and these are particularly intertwined in the national security arena. Exposure to misinformation leads to increased polarization, and increased polarization decreases the impact of factual information. Individuals, institutions, and the nation as a whole are vulnerable to this vicious cycle.

  • Jan. 6 Was an Example of Networked Incitement − a Media and Disinformation Expert Explains the Danger of Political Violence Orchestrated Over Social Media

    The shocking events of Jan. 6, 2021, were an example of a new phenomenon: influential figures inciting large-scale political violence via social media, and insurgents communicating across multiple platforms to command and coordinate mobilized social movements in the moment of action. We call this phenomenon “networked incitement.” The use of social media for networked incitement foreshadows a dark future for democracies. Rulers could well come to power by manipulating mass social movements via social media, directing a movement’s members to serve as the leaders’ shock troops, online and off.

  • How Verified Accounts on X Thrive While Spreading Misinformation About the Israel-Hamas Conflict

    With the gutting of content moderation initiatives at X, accounts with blue checks, once a sign of authenticity, are disseminating debunked claims and gaining more followers. Community Notes, X’s fact-checking system, hasn’t scaled sufficiently. “The blue check is flipped now. Instead of a sign of authenticity, it’s a sign of suspicion, at least for those of us who study this enough,” said one expert.

  • Evaluating the Truthfulness of Fake News Through Online Searches Increases the Chances of Believing Misinformation

    Conventional wisdom suggests that searching online to evaluate the veracity of misinformation would reduce belief in it. But a new study by a team of researchers shows the opposite occurs: Searching to evaluate the truthfulness of false news articles actually increases the probability of believing misinformation.

  • Shadow Play: A Pro-China and Anti-U.S. Influence Operation Thrives on YouTube

    Experts have recently observed a coordinated inauthentic influence campaign originating on YouTube that’s promoting pro-China and anti-U.S. narratives in an apparent effort to shift English-speaking audiences’ views of those countries’ roles in international politics, the global economy and strategic technology competition.

  • The Cross-Platform Evasion Toolbox of Islamic State Supporters

    Extremists exploiting platforms for their own ends and learning along the way is a tale as old as the internet and one that has become even more pronounced in the era of ubiquitous access to social media. Moustafa Ayad writes that over the past three years, a set of exploitation and evasion tactics have become central for Islamic State supporters online, and they are only getting more elaborate.

  • Fact Check: AI Fakes in Israel's War Against Hamas

    Real or fake? Images generated by artificial intelligence have become a disinformation tool in the war between Israel and Hamas. DW’s fact-checking team shows you how to spot them.

  • It’s Not Just About Facts: Democrats and Republicans Have Sharply Different Attitudes About Removing Misinformation from Social Media

    Misinformation is a key global threat, but Democrats and Republicans disagree about how to address the problem. In particular, Democrats and Republicans diverge sharply on removing misinformation from social media.

  • Want to Prevent Misinformation? Present Data with an Interactive Visual

    Getting readers of a news story interested in numbers can be a challenge. But engaging readers with data can lead to better understanding and help prevent misinformation and misrepresentation in the news. New research explores a solution: using interactive data visualization to inform and engage readers.

  • Tackling Fake News

    Cutting-edge technologies gave the world fake news, but researchers are developing even newer technology to stop it. Their innovative system, the first of its kind, relies on a technology already famous for underpinning Bitcoin and other cryptocurrencies: blockchain.

  • Israel-Hamas Conflict: Fighting Misinformation Requires Better Tools

    Misinformation about the Israel-Hamas conflict is flooding social media, in particular Elon Musk’s platform X, where users have been sharing false and misleading claims about the assault. Researchers have investigated various interventions on social media, including accuracy prompts, fact-checking or debunking, crowdsourcing and labeling or warnings.

  • Training Students to Succeed in the “Fourth Industrial Revolution”

    Transformational changes are already underway in the manufacturing industry as technological advancements from the “fourth industrial revolution,” or Industry 4.0, such as artificial intelligence (AI) and smart devices, inspire a digital-first approach to engineering. University of Missouri researchers are using a $1 million grant to support the development of an Industry 4.0 lab, training engineering students for the future of digitization in manufacturing.

  • When Rumors Take Flight

    Misinformation pervades U.S. politics, and the persistent, unrelenting “Big Lie” campaign by Donald Trump and some of his allies about the outcome of the 2020 presidential election is the most pressing case in point. Trump’s lies and unfounded claims have gained wide traction among his followers. MIT professor Adam Berinsky’s new book examines the political misinformation that threatens the U.S. system of government.

  • Government Regulation Can Effectively Curb Social-Media Dangers

    Social media posts that promote terrorism and hate, spread medical misinformation, encourage dangerous challenges that put teens’ lives at risk, or glamorize suicide pose a significant threat to society. New EU rules require social media platforms to take down flagged posts within 24 hours – and modeling shows that’s fast enough to have a dramatic effect on the spread of harmful content.
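    Why takedown speed matters so much can be seen with a toy calculation. The sketch below is not the researchers’ model; it simply assumes a harmful post’s hourly views grow geometrically (the growth rate and initial audience are made-up parameters) until the post is removed, which is enough to show how sharply total exposure depends on the takedown delay.

    ```python
    def total_reach(takedown_hours: int, growth_per_hour: float = 1.3,
                    initial_viewers: int = 100) -> int:
        """Toy estimate of total views a harmful post accumulates before
        removal, assuming views multiply each hour (illustrative only;
        parameters are assumptions, not figures from the EU study)."""
        viewers = float(initial_viewers)
        total = 0.0
        for _ in range(takedown_hours):
            total += viewers              # views accumulated this hour
            viewers *= growth_per_hour    # audience grows geometrically
        return round(total)

    # Comparing removal within the EU's 24-hour window to a three-day delay:
    fast = total_reach(24)
    slow = total_reach(72)
    ```

    Under these assumed parameters, the three-day delay yields orders of magnitude more total exposure than the 24-hour takedown, which is the intuition behind the modeling result: against exponential spread, even modest speed-ups in removal cut reach dramatically.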