• Declining trust in facts, institutions imposes real costs on U.S. society

    Americans’ reliance on facts to discuss public issues has declined significantly in the past two decades, leading to political paralysis and the collapse of civil discourse, according to a RAND report. This phenomenon, referred to as “Truth Decay,” is defined by increasing disagreement about facts, a blurring of the line between opinion and fact, an increase in the relative volume of opinion and personal experience over fact, and declining trust in formerly respected sources of factual information.

  • Responding to Truth Decay: Q&A with RAND’s Michael Rich and Jennifer Kavanagh

    Winston Churchill is reported to have said, “A lie gets halfway around the world before the truth can get its pants on.” Experts say it is worse now. With social media, false or misleading information is disseminated all over the world nearly instantaneously. Another thing that’s new about Truth Decay is the confluence of factors that are interacting in ways we do not fully understand yet. It is not clear that key drivers like our cognitive biases, polarization, changes in the information space, and the education system’s struggle to respond to this sort of challenge have ever coincided at such intense and extreme levels as they do now. Russian disinformation and hacking campaigns against the United States and other Western democracies are the most obvious examples of the amplification – and exploitation – of Truth Decay. Garry Kasparov, the chess master and Russian dissident, said about Russian disinformation efforts: “The point of modern propaganda isn’t only to misinform or push an agenda. It is to exhaust your critical thinking … to annihilate truth.”

  • Sputnik partner “required to register” under U.S. Foreign-Agent law

    State-supported Russian media outlet Sputnik says its U.S.-based partner company RIA Global LLC has been ordered to register as a foreign agent by the U.S. government. According to Sputnik, the Justice Department said that RIA Global produces content on behalf of state media company Rossiya Segodnya and thus on behalf of the Russian government.

  • Russian influence in Mexican and Colombian elections

    Russia’s ongoing effort to destroy faith in democracy is not only a problem for the United States and Europe. The Kremlin has set its sights on destabilizing next year’s Mexican and Colombian elections, and has been strengthening its instruments of political influence in both countries. In 2015, John Kelly, now White House Chief of Staff but then Commander of U.S. Southern Command, warned that under President Vladimir Putin, Russia is “using power projection in an attempt to erode U.S. leadership and challenge U.S. influence in the Western Hemisphere.”

  • Twitter, citizen science, and AI help improve flood data collection

    Urban flooding is difficult to monitor due to complexities in data collection and processing. This prevents detailed risk analysis, flood control, and the validation of numerical models. Researchers are combining Twitter, citizen science and cutting-edge artificial intelligence (AI) techniques to develop an early-warning system for flood-prone communities.
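    Such a pipeline typically begins by filtering the social stream for likely flood reports. A minimal sketch of that filtering step, using a simple keyword heuristic as a stand-in for the researchers’ trained AI models (the term list and thresholds here are purely illustrative):

```python
FLOOD_TERMS = {"flood", "flooding", "flooded", "inundated", "water level"}

def flood_report_score(tweet: str) -> float:
    """Score a tweet 0..1 by how many flood-related terms it mentions.

    A real system would use a trained text classifier; this keyword
    match only illustrates the filtering step.
    """
    text = tweet.lower()
    hits = sum(1 for term in FLOOD_TERMS if term in text)
    return min(hits / 2, 1.0)

def alert_level(tweets: list[str], threshold: float = 0.5) -> int:
    """Count incoming tweets that look like genuine flood reports."""
    return sum(1 for t in tweets if flood_report_score(t) >= threshold)
```

    In practice the keyword filter would be replaced by a trained classifier, and the surviving reports would be geolocated before being fed into a hydrological model for validation and early warning.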

  • Lawmakers from states targeted by Russian hackers urge action to protect U.S. elections

    Rep. John Sarbanes (D-Maryland), chair of the Democracy Reform Task Force, along with members of Congress from 18 of the 21 states targeted by Russian hackers in 2016, recently called on House Speaker Paul Ryan to take immediate action to protect state voting systems from cyberattacks and to bolster state election infrastructure.

  • The influence and risk of social and political "bots"

    The role and risks of bots, such as automated Twitter accounts, in influencing public opinion and political elections continue to provoke intense international debate and controversy. A new collection of articles focused on “Computational Propaganda and Political Big Data” examines how these bots work, approaches to better detect and control them, and how they may have impacted recent elections around the globe. The collection is published in a special issue of Big Data.

  • Spotting Russian bots trying to influence politics

    A team of researchers has isolated the characteristics of bots on Twitter through an examination of bot activity related to Russian political discussions. The team’s findings provide new insights into how Russian accounts influence online exchanges using bots, or automated social media accounts, and trolls, which aim to provoke or disrupt. “There is a great deal of interest in understanding how regimes and political actors use bots in order to influence politics,” explains one researcher. “Russia has been at the forefront of trying to shape the online conversation using tools like bots and trolls, so a first step to understanding what Russian bots are doing is to be able to identify them.”
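    The kinds of characteristics such studies isolate (posting volume, account age, retweet behavior) can be combined into a simple score. A minimal sketch, with features, thresholds, and weights that are purely illustrative and not taken from the study:

```python
from dataclasses import dataclass

@dataclass
class Account:
    tweets_per_day: float
    account_age_days: int
    retweet_fraction: float  # share of the account's posts that are retweets

def bot_likelihood(acct: Account) -> float:
    """Heuristic bot score in [0, 1]: hyperactive, newly created,
    retweet-heavy accounts score higher. Thresholds are illustrative."""
    score = 0.0
    if acct.tweets_per_day > 50:      # humans rarely sustain this volume
        score += 0.4
    if acct.account_age_days < 90:    # throwaway accounts are often young
        score += 0.3
    if acct.retweet_fraction > 0.9:   # amplification-only behavior
        score += 0.3
    return score
```

    Research-grade detectors replace hand-set thresholds with classifiers trained on labeled accounts, but the underlying behavioral features are of this kind.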

  • Social media trends can predict vaccine-scare tipping points

    Analyzing trends on Twitter and Google can help predict vaccine scares that can lead to disease outbreaks, according to a new study. Researchers examined Google searches and geocoded tweets with the help of artificial intelligence and a mathematical model. The resulting data enabled them to analyze public perceptions of the value of getting vaccinated and determine when a population was approaching a tipping point.
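    One established way to tell when a system is getting close to a tipping point is to watch for critical slowing down: near a transition, fluctuations in a signal such as vaccine sentiment become more sluggish, which shows up as rising lag-1 autocorrelation. A minimal sketch of that indicator, offered as an illustration rather than a reproduction of the study’s actual model:

```python
def lag1_autocorrelation(series):
    """Lag-1 autocorrelation of a time series (e.g., daily sentiment scores)."""
    n = len(series)
    mean = sum(series) / n
    num = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in series)
    return num / den if den else 0.0

def warning_signal(series, split=0.5):
    """Flag critical slowing down: the series is more autocorrelated in its
    later window than in its earlier one."""
    k = int(len(series) * split)
    return lag1_autocorrelation(series[k:]) > lag1_autocorrelation(series[:k])
```

    A jittery sentiment series that begins to drift smoothly would trip this flag, suggesting the population’s opinion dynamics are losing resilience.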

  • Why the president’s anti-Muslim tweets could increase tensions

    Last week, President Trump retweeted to his nearly 44 million followers a series of videos purporting to show Muslims assaulting people and destroying Christian statues. These videos, originally shared by an extremist anti-Muslim group in the U.K., were shown to be inaccurate and misleading. Our research may shed light on why President Trump shared anti-Muslim videos with his followers. As the White House press secretary said, his decision was a direct response to a perceived threat posed by Muslims. However, religious threat is not a one-way street. Attacking Muslims is unlikely to stop religious conflict; instead, it will increase religious tension by fostering a spiraling tit-for-tat of religious threat and prejudice that grows in severity over time. This type of cyclical process has long been documented by scholars. If people who feel discriminated against because of their religion retaliate by discriminating against other religions, religious intolerance is only going to snowball. If President Trump really wants to stop religious violence, social psychology suggests he should refrain from such attacks himself.

  • Russian-operated bots posted millions of social media posts, fake stories during Brexit referendum

    More than 156,000 Twitter accounts, operated by Russian government disinformation specialists, posted nearly 45,000 messages in support of the “Leave” campaign, urging British voters to vote for Brexit – that is, for Britain to leave the European Union. Researchers compared 28.6 million Russian tweets in support of Brexit to ~181.6 million Russian tweets in support of the Trump campaign, and found close similarity in tone and tactics in the Russian government’s U.K. and U.S. efforts. In both cases, the Russian accounts posted divisive, polarizing messages and fake stories aiming to raise fears about Muslims and immigrants. The goal was to sow discord; intensify rancor and animosity along racial, ethnic, and religious lines; and deepen political polarization — not only to help create a public climate more receptive to the populist, protectionist, nationalist, and anti-Muslim thrust of both Brexit and the Trump campaigns, but also to deepen societal and cultural fault lines and fractures in the United Kingdom and the United States, thus contributing to the weakening of both societies from within.
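    “Close similarity in tone and tactics” between two tweet corpora can be quantified in many ways; one crude proxy is the cosine similarity of their word-frequency vectors. A minimal sketch of that idea, not the researchers’ actual method:

```python
import math
from collections import Counter

def word_frequencies(corpus):
    """Bag-of-words counts for a list of tweet strings."""
    return Counter(word for tweet in corpus for word in tweet.lower().split())

def tone_similarity(corpus_a, corpus_b):
    """Cosine similarity (0..1) of the two corpora's word-frequency vectors."""
    ca, cb = word_frequencies(corpus_a), word_frequencies(corpus_b)
    dot = sum(ca[w] * cb[w] for w in ca)
    norm_a = math.sqrt(sum(v * v for v in ca.values()))
    norm_b = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```

    A fuller comparison would look beyond raw vocabulary to posting times, hashtags, shared links, and retweet networks.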

  • Anatomy of a fake news scandal

    On 1 December 2016, Alex Jones, the InfoWars host, a conspiracy-theory peddler, and a fervent Trump booster, was reporting that Hillary Clinton was sexually abusing children in satanic rituals in the basement of a Washington, D.C., pizza restaurant. How was this fake story fabricated and disseminated? “We found ordinary people, online activists, bots, foreign agents and domestic political operatives,” Reveal’s researchers say. “Many of them were associates of the Trump campaign. Others had ties with Russia. Working together – though often unwittingly – they flourished in a new ‘post-truth’ information ecosystem, a space where false claims are defended as absolute facts. What’s different about Pizzagate, says Samuel Woolley, a leading expert in computational propaganda, is it was ‘retweeted and picked up by some of the most powerful faces of American politics’.”

  • During crisis, exposure to social media’s conflicting information is linked to stress

    Exposure to high rates of conflicting information during an emergency is linked to increased levels of stress, and those who rely on text messages or social media reports from unofficial sources are more frequently exposed to rumors and experience greater distress, according to new research.

  • App-based citizen science experiment to help predict future pandemics

    There are flu outbreaks every year, but in the last 100 years, there have been four pandemics of a particularly deadly flu, including the Spanish Influenza outbreak which hit in 1918, killing up to 100 million people worldwide. Nearly a century later, a catastrophic flu pandemic still tops the U.K. government’s Risk Register of threats to the United Kingdom. A new app gives U.K. residents the chance to get involved in an ambitious science experiment that could save lives.

  • BullyBlocker app tackles ASU cyberbullying

    Researchers say that more than half of adolescents have been bullied online. Faculty and students at ASU’s New College of Interdisciplinary Arts and Sciences last month announced the public availability of BullyBlocker, a smartphone application that allows parents and victims of cyberbullying to monitor, predict and hopefully prevent incidents of online bullying.