• Social Media and Vaccine Misinformation

    People who relied on social media for information were more likely to be misinformed about vaccines than those who relied on traditional media, according to a new study of vaccine knowledge and media use. The researchers found that up to 20 percent of respondents were at least somewhat misinformed about vaccines. Such a high level of misinformation is “worrying” because misinformation undermines vaccination rates, and high vaccination rates are required to maintain community immunity, the researchers said.

  • Russia Knows Just Who to Blame for the Coronavirus: America

    The coronavirus outbreak has been accompanied by an avalanche of conspiracy theories. “But in Russia the misinformation has been particularly pointed. Russia’s spin doctors have capitalized on the fear and confusion of the epidemic to point the blame at the United States,” Amy McKinnon writes. McKinnon notes that the Russian messaging fits a now well-established pattern in that it doesn’t look to persuade audiences of a single alternative truth, because “That would take effort, planning, and persuasion.” Rather, Kremlin propaganda specialists produce “a steady stream of underdeveloped, sometimes contradictory conspiracy theories intended to exhaust and confuse viewers, making them question the very notion of objective truth itself.”

  • White Supremacist Propaganda Distribution Hit All-Time High in 2019

    White supremacist propaganda distribution more than doubled in 2019 over the previous year, making 2019 the highest year on record for such activity in the United States. The data in a new report show a substantial increase in incidents both on and off campus. A total of 2,713 cases of literature distribution – an average of more than four per day – were reported nationwide, compared to 1,214 in 2018. The report also notes a nearly 160 percent increase in U.S. campus propaganda incidents during the fall semester.

  • Digital Authoritarianism: Finding Our Way Out of the Darkness

    From Chinese government surveillance in Hong Kong and Xinjiang to Russia’s sovereign internet law and concerns about foreign operatives hacking the 2020 elections, digital technologies are changing global politics — and the United States is not ready to compete, Naazneen Barma, Brent Durbin, and Andrea Kendall-Taylor write. The United States and like-minded countries must thus develop a new strategic framework to combat the rise of high-tech illiberalism, but “as a first step, U.S. government officials need to understand how authoritarian regimes are using these tools to control their populations and disrupt democratic societies around the world.”

  • Bioweapons, Secret Labs, and the CIA: Pro-Kremlin Actors Blame the U.S. for Coronavirus Outbreak

    The Russian (earlier, Soviet) practice of spreading disinformation about public health threats is nothing new. During the Cold War, for example, a Soviet disinformation campaign blamed the United States for creating the AIDS virus. While epidemiologists work to identify the exact source of the Wuhan 2019-nCoV outbreak, pro-Kremlin actors are already blaming the United States, alleging it has used bioweapons to spread the virus.

  • QAnon-ers’ Magic Cure for Coronavirus: Just Drink Bleach!

    QAnon, a fervently pro-Trump conspiracy theory that started with a series of online posts in October 2017 from an anonymous figure called “Q,” imagines a world where Donald Trump is engaged in a secret and noble war with a cabal of pedophile-cannibals in the Democratic Party, the finance industry, Hollywood, and the “deep state.” Will Sommer writes that as the global death toll from an alarming new coronavirus surged this week, promoters of the QAnon conspiracy theory were urging their fans to ward off the illness by purchasing and drinking dangerous bleach.

  • Is There a Targeted Troll Campaign Against Lisa Page? A Bot Sentinel Investigation

    “Homewrecker.” “Traitor.” “Tramp.” These are just some of the insults flung at Lisa Page—the former FBI lawyer whom President Trump has targeted for her text messages critical of him during the 2016 election—in the almost 4,000 responses to a tweet she posted on 18 January. Public figures often receive online abuse, after all. “But the replies to Page’s tweet stand out. They likely represent a targeted trollbot attack—one that nobody has reported on until now,” Christopher Bouzy, the founder and CEO of Bot Sentinel, writes. The troll attack on Page “looks a lot like the coordinated campaigns we witnessed during the 2016 election, when a swarm of accounts would suddenly begin tweeting the same toxic messaging. All this raises a question: Who is behind the apparent trollbot activity against Page?”

  • Artificial Intelligence and the Manufacturing of Reality

    The belief in conspiracy theories highlights the flaws humans carry with them in deciding what is or is not real. The internet and other technologies have made it easier to weaponize and exploit these flaws, beguiling more people faster and more compellingly than ever before. It is likely that artificial intelligence will be used to exploit the weaknesses inherent in human nature at a scale, speed, and level of effectiveness previously unseen. Adversaries like Russia could use these manipulations to subtly reshape how targets view the world around them, effectively manufacturing their reality. If even some of these predictions are accurate, all governance reliant on public opinion, mass perception, or citizen participation is at risk.

  • How Russia May Have Used Twitter to Seize Crimea

    Online discourse by users of social media can provide important clues about the political dispositions of communities. New research suggests it can even serve governments as a source of military intelligence for estimating the prospective casualties and costs of occupying foreign territory, and that real-time social media data may have played exactly this role for the Kremlin, and potentially for other governments.

  • “Like” at Your Own Risk

    New “Chameleon” Attack Can Secretly Modify Content on Facebook, Twitter, or LinkedIn: That video or picture of a cute dog, your favorite team, or a political candidate that you “liked” on social media can actually be altered in a cyberattack into something completely different, detrimental, and potentially criminal.

  • Researcher Tests “Vaccine” Against Hate

    Amid a spike in violent extremism around the world, a communications researcher is experimenting with a novel idea: whether people can be “inoculated” against hate with a little exposure to extremist propaganda, in the same manner vaccines enable human bodies to fight disease.

  • "Redirect Method": Countering Online Extremism

    In recent years, deadly white supremacist violence at houses of worship in Pittsburgh, Christchurch, and Poway demonstrated the clear line from violent hate speech and radicalization online to in-person violence. With perpetrators of violence taking inspiration from online forums, leveraging the anonymity and connectivity of the internet, and developing sophisticated strategies to spread their messages, the stakes couldn’t be higher in tackling online extremism. Researchers have developed the Redirect Method to counter white supremacist and jihadist activity online.

  • YouTube’s Algorithms Might Radicalize People – but the Real Problem Is We’ve No Idea How They Work

    Does YouTube create extremists? It’s hard to argue that YouTube doesn’t play a role in radicalization, Chico Camargo writes. “In fact, maximizing watchtime is the whole point of YouTube’s algorithms, and this encourages video creators to fight for attention in any way possible.” Society must insist on using algorithm auditing, even though it is a difficult and costly process. “But it’s important, because the alternative is worse. If algorithms go unchecked and unregulated, we could see a gradual creep of conspiracy theorists and extremists into our media, and our attention controlled by whoever can produce the most profitable content.”

  • Chinese Communist Party’s Media Influence Expands Worldwide

    Over the past decade, Chinese Communist Party (CCP) leaders have overseen a dramatic expansion in the regime’s ability to shape media content and narratives about China around the world, affecting every region and multiple languages, according to a new report. This trend has accelerated since 2017, with the emergence of new and more brazen tactics by Chinese diplomats, state-owned news outlets, and CCP proxies.

  • Combating the Latest Technological Threat to Democracy: A Comparison of Facebook and Twitter’s Deepfake Policies

    Twitter and Facebook have both recently announced policies for handling synthetic and manipulated media content on their platforms. A side-by-side comparison and analysis of the two policies highlights that Facebook focuses on a narrow, technical type of manipulation, while Twitter’s approach contemplates the broader context and impact of manipulated media.