  • Fake videos | Video Fake News: Believed More, Shared More Than Text, Audio Versions

    People are more likely to believe fake news presented in video format than the same story presented in text or audio form, and they are also more willing to share these videos with people in their network.

  • Truth decay | Fact-Checking Works Across the Globe

    Researchers found that fact-checking aimed at reducing false beliefs worked with little variation across Argentina, Nigeria, South Africa, and the U.K., and that its positive effects were still detectable two weeks later.

  • Epidemic early warning | Model Predicts COVID-19 Outbreak Two Weeks Ahead of Time

    People’s social behavior, reflected in their mobility data, is providing scientists with a way to forecast the spread of COVID-19 nationwide at the county level. The data-driven deep learning model that FAU researchers developed has important implications for managing the current pandemic as well as future pandemics.
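
    The article does not detail the FAU model's architecture, so the following is only a rough, hypothetical sketch of the general idea (forecasting a county's case counts two weeks ahead from lagged case and mobility features), using synthetic data and a plain least-squares fit in place of the researchers' deep learning model.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic stand-in data for one county: a daily mobility index and daily case
      # counts that loosely follow mobility with a two-week delay (illustrative numbers only).
      days = 120
      mobility = 50 + 20 * np.sin(np.arange(days) / 14) + rng.normal(0, 2, days)
      cases = np.maximum(0, 100 + 0.8 * np.roll(mobility, 14) + rng.normal(0, 5, days))

      LAG, HORIZON = 14, 14  # use the last 14 days of data to predict 14 days ahead

      # Build (features, target) pairs: recent cases + recent mobility -> cases 14 days later.
      X, y = [], []
      for t in range(LAG, days - HORIZON):
          X.append(np.concatenate([cases[t - LAG:t], mobility[t - LAG:t]]))
          y.append(cases[t + HORIZON])
      X, y = np.asarray(X), np.asarray(y)

      # Fit an ordinary least-squares model as a stand-in for the deep network.
      coef, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

      # Forecast two weeks ahead from the most recent window of cases and mobility.
      latest = np.concatenate([cases[-LAG:], mobility[-LAG:], [1.0]])
      print("two-week-ahead case forecast:", round(latest @ coef, 1))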

  • Extremism | Social Media Platforms Do Little to Limit Online Anti-Semitic Content

    A new report shows how social media companies fail to act on anti-Jewish hate on their platforms. As a result of their failure to enforce their own rules, social media platforms like Facebook have become safe places to spread racism and propaganda against Jews.

  • BOOKSHELF: Health misinformation | New Book Helps Readers Spot Online Health Scams

    UBC’s Dr. Bernie Garrett, the author of a new book on health scams, misinformation, and disinformation, says that “Scam marketers are well-versed in modern advertising techniques and the psychology of persuasion. They know all the triggers that can help sell a product.” He adds that, during the COVID period, such scams “definitely have proliferated, and this has been aided by social media… Unfortunately, people can post misinformation on social media with no real consequences.”

  • Foreign disinformation | Combating Foreign Disinformation on Social Media

    How are other nations using disinformation on social media to advance their interests? What is the U.S. response to these campaigns, and how has it evolved? What does the Joint Force—and the U.S. Air Force in particular—need to be prepared to do in response?

  • Truth decay | On the Internet, Nobody Knows You’re a Dog – or a Fake Russian Twitter Account

    Legacy media outlets played an unwitting role in the growth of the four most successful fake Twitter accounts run by the Russian Internet Research Agency (IRA), which were created to spread disinformation during the 2016 U.S. presidential campaign.

  • Predicting turmoil | The Storywrangler: Exploring Social Media Messages for Signs of Coming Turmoil

    Scientists have invented an instrument to peer deeply into the billions and billions of posts made on Twitter since 2008, and have begun to uncover the vast galaxy of stories they contain, looking for patterns that could help predict political and financial turmoil.
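
    The Storywrangler itself is a large-scale n-gram instrument; purely as a toy illustration of the underlying idea (tracking how often a phrase appears each day and flagging a sudden surge), a sketch with invented posts might look like the following. The phrase, data, and threshold are all hypothetical.

      from collections import Counter
      import statistics

      # Toy stand-in for daily batches of posts (the real instrument processes billions of tweets).
      daily_posts = {
          "2021-07-01": ["markets calm today", "nice weather out there", "markets calm again"],
          "2021-07-02": ["markets calm", "election news roundup", "quiet day overall"],
          "2021-07-03": ["bank run rumors", "bank run panic grows", "bank run spreading fast"],
      }

      def ngram_counts(posts, n=2):
          """Count n-grams (bigrams by default) across one day's posts."""
          counts = Counter()
          for post in posts:
              words = post.lower().split()
              for i in range(len(words) - n + 1):
                  counts[" ".join(words[i:i + n])] += 1
          return counts

      # Build a per-day frequency series for one phrase and flag an unusual surge.
      phrase = "bank run"
      series = [ngram_counts(posts)[phrase] for posts in daily_posts.values()]
      baseline = statistics.mean(series[:-1])
      if series[-1] > 2 * max(baseline, 1):
          print(f"surge detected for '{phrase}': daily counts {series}")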

  • Health misinformation | Surgeon General Urges ‘Whole-of-Society’ Effort to Fight Health Misinformation

    By Molly Galvin

    “Misinformation is worse than an epidemic: It spreads at the speed of light throughout the globe, and can prove deadly when it reinforces misplaced personal bias against all trustworthy evidence,” said National Academy of Sciences President Marcia McNutt. “Research is helping us combat this ‘misinfodemic’ through understanding its origins and the aspects of human nature that make it so transmittable.”

  • ARGUMENT: China’s influence campaigns | Holding the Line: Chinese Cyber Influence Campaigns After the Pandemic

    While the American public became more aware of Chinese cyber influence campaigns during the 2020 COVID-19 outbreak, those campaigns did not start there – and they will not end there, either. Maggie Baughman writes that as the world’s attention returns to the origins of the global pandemic and recommits to its containment, the United States must prepare for inevitable shifts in the methods and goals of China’s cyber influence activities – “likely beyond what Western countries have previously experienced in dealing with China.”

  • Pandemic | Social Media Use One of Four Factors Related to Higher COVID-19 Spread Rates Early On

    Researchers showed that, in the early stages of the pandemic, there was a correlation between social media use and a higher rate of COVID spread. The researchers compared 58 countries and found that higher social media use was among the four factors driving a faster and broader spread. Accounting for pre-existing, intrinsic differences among countries and regions would help facilitate better management strategies going forward.
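
    The study's exact methodology is not described in the summary above; purely as an illustration of the kind of cross-country comparison involved (relating a country-level factor such as social media use to early spread), the following sketch computes a simple correlation over made-up numbers for a handful of hypothetical countries.

      import numpy as np

      # Hypothetical country-level data (not the study's 58 countries): social media use
      # as a share of the population, and an early spread rate (new cases per day per million).
      social_media_use  = np.array([45, 60, 72, 30, 85, 55, 40, 90], dtype=float)
      early_spread_rate = np.array([0.8, 1.1, 1.3, 0.6, 1.6, 1.0, 0.7, 1.7])

      # Pearson correlation across countries; a real analysis would also control for
      # pre-existing, intrinsic differences among countries and regions, as the researchers note.
      r = np.corrcoef(social_media_use, early_spread_rate)[0, 1]
      print(f"correlation between social media use and early spread rate: r = {r:.2f}")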

  • Truth decay | Developing Research Model to Fight Deepfakes

    Detecting “deepfakes,” manipulated images or videos in which a person’s likeness is replaced with someone else’s, presents a massive cybersecurity challenge: What could happen when deepfakes are created with malicious intent? Artificial intelligence experts are working on a new reverse-engineering research method to detect and attribute deepfakes.
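
    The reverse-engineering, model-attribution approach mentioned above is not spelled out here; as a much simpler and separate detection signal that is often cited in this area, the sketch below measures how much of an image's spectral energy sits at high spatial frequencies, since generative upsampling can leave periodic high-frequency artifacts. The images and the radius threshold are synthetic stand-ins, not the researchers' method.

      import numpy as np

      def high_freq_energy_ratio(image):
          """Fraction of 2-D spectral energy at high spatial frequencies.
          Upsampling artifacts in generated images often inflate this fraction."""
          spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
          h, w = spectrum.shape
          yy, xx = np.ogrid[:h, :w]
          radius = np.hypot(yy - h // 2, xx - w // 2)
          high = spectrum[radius > 0.4 * min(h, w)].sum()
          return high / spectrum.sum()

      # Synthetic stand-ins: a smooth "natural" image vs. one with a checkerboard artifact
      # of the kind that naive upsampling or deconvolution layers can introduce.
      rng = np.random.default_rng(0)
      natural = rng.normal(size=(64, 64)).cumsum(axis=0).cumsum(axis=1)  # smooth, low-frequency
      suspect = natural + 2.0 * np.tile([[0.0, 1.0], [1.0, 0.0]], (32, 32))

      print("natural image high-frequency share:", round(high_freq_energy_ratio(natural), 4))
      print("suspect image high-frequency share:", round(high_freq_energy_ratio(suspect), 4))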

  • China watch | China's Internet Trolls Go Global

    By Ryan Fedasiuk

    Chinese trolls are beginning to pose serious threats to economic security, political stability, and personal safety worldwide. The CCP-backed trolls have become more than a nuisance, and the magnitude and frequency of their attacks will likely continue to increase. Formulating an effective response will require understanding their size, tactics, and mission as the CCP widens the scope of its public opinion war to include foreign audiences.

  • Truth decay | Ghosts in the Machine: Malicious Bots Spread COVID Untruths

    By Mary Van Beusekom

    Malicious bots, or automated software that simulates human activity on social media platforms, are the primary drivers of COVID-19 misinformation, spreading myths and seeding public health distrust exponentially faster than human users could, suggests a new study.
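
    The study's detection methodology is not described above; as a deliberately crude illustration of two signals often associated with automated accounts (very high posting rates and heavy repetition of identical messages), the following heuristic uses made-up accounts and thresholds.

      from collections import Counter

      def looks_automated(posts, hours_active):
          """Crude heuristic: flag accounts that post extremely fast or repeat one message heavily.
          Thresholds are illustrative, not taken from the study."""
          rate = len(posts) / max(hours_active, 1)
          top_repeat = Counter(posts).most_common(1)[0][1] if posts else 0
          duplication = top_repeat / max(len(posts), 1)
          return rate > 10 or duplication > 0.5

      suspect_account = ["Vaccines contain microchips!"] * 40 + ["wake up people"] * 5
      human_account = ["walked the dog", "great game tonight", "trying a new recipe"]

      print("suspect flagged:", looks_automated(suspect_account, hours_active=2))   # True
      print("human flagged:  ", looks_automated(human_account, hours_active=24))    # False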

  • Truth decay | Overconfidence in Identifying False News Makes One More Susceptible to It

    A new study finds that individuals who falsely believe they are able to identify false news are more likely to fall victim to it. “Though Americans believe confusion caused by false news is extensive, relatively few indicate having seen or shared it,” said one researcher. “If people incorrectly see themselves as highly skilled at identifying false news, they may unwittingly be more likely to consume, believe and share it, especially if it conforms to their worldview.”
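
    The finding rests on comparing how well people think they can spot false headlines with how well they actually do; a minimal sketch of that comparison, with invented ratings, scores, and an arbitrary "overconfident" cutoff, might look like the following.

      # Invented data: each respondent's self-rated ability (1-5) and their actual score
      # on a headline-identification task (fraction of headlines correctly classified).
      respondents = [
          {"id": "A", "self_rating": 5, "actual_score": 0.55},
          {"id": "B", "self_rating": 2, "actual_score": 0.80},
          {"id": "C", "self_rating": 4, "actual_score": 0.45},
          {"id": "D", "self_rating": 3, "actual_score": 0.70},
      ]

      # Overconfidence gap: self-rating rescaled to 0-1, minus the measured score.
      for person in respondents:
          gap = (person["self_rating"] - 1) / 4 - person["actual_score"]
          label = "overconfident" if gap > 0.2 else "calibrated"
          print(person["id"], round(gap, 2), label)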