  • Business & disasters: Corporations can benefit from altruism during a crisis

    Altruism – and social media – can help corporations cultivate trust with consumers on mobile devices during and after natural disasters, such as hurricanes. “Companies that engage in corporate social responsibility efforts during and after a disaster can build strong relationships with consumers,” says one researcher. “This is particularly true if companies are communicating their efforts through social media aimed at mobile device users – but only if their efforts appear altruistic.”

  • Truth decay: Fake news “vaccine”: online game may “inoculate” by simulating propaganda tactics

    A new experiment, just launched online, aims to help “inoculate” against disinformation by providing a small dose of perspective from a “fake news tycoon.” The game encourages players to stoke anger, mistrust, and fear in the public by manipulating digital news and social media within the simulation. Players build audiences for their fake news sites by publishing polarizing falsehoods, deploying Twitter bots, photoshopping evidence, and inciting conspiracy theories in the wake of public tragedy – all while maintaining a “credibility score” to remain as persuasive as possible. The psychological theory behind the research is called “inoculation”: “A biological vaccine administers a small dose of the disease to build immunity. Similarly, inoculation theory suggests that exposure to a weak or demystified version of an argument makes it easier to refute when confronted with more persuasive claims,” says a researcher.
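    The game’s internal rules are not spelled out above; purely as a hypothetical sketch of the followers-versus-credibility trade-off it describes, the snippet below invents a minimal scoring rule (the function, numbers, and behavior are illustrative assumptions, not the actual game’s mechanics).

      # Hypothetical sketch of a followers-vs-credibility trade-off.
      # Nothing here reflects the real game's internal scoring.
      def publish(followers: int, credibility: float, sensationalism: float):
          """More sensational posts attract more followers but erode credibility.

          sensationalism ranges from 0.0 (dry fact) to 1.0 (outrageous falsehood);
          follower gains shrink as credibility drops.
          """
          followers += int(1000 * sensationalism * (credibility / 100))
          credibility = max(credibility - 20 * sensationalism, 0.0)
          return followers, credibility

      followers, credibility = 0, 100.0
      for s in (0.9, 0.9, 0.2):  # two outrage posts, then a tamer one
          followers, credibility = publish(followers, credibility, s)
          print(followers, credibility)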

  • Considered opinion: Social media is helping Putin kill our democracy

    By John R. Schindler

    There are few more important issues confronting the West today than what to do about social media companies, which, thanks to their ubiquity, possess vast riches and daunting influence over our democracies. The Russians have been spreading lies for decades. Active Measures, including fake reports, forged documents, and dastardly conspiracies invented out of thin air, were created by the KGB to smear Western governments. Social media made Moscow’s clandestine work much easier and more profitable. Although the lies currently emanating from the Kremlin resemble Cold War Active Measures in overall form and content, they are now disseminated so quickly, and through so many fronts, trolls, and bots, that Western governments are severely challenged to even keep up with these weaponized lies, much less push back. For this, we have the Internet to thank. While none can deny the countless benefits of the online age, this is one of its most pernicious side effects. It’s time the West seriously addressed the problem, and quickly, since this Kremlin spy game isn’t going away.

  • Online hate: Using AI, machine learning to understand extent of online hate

    The Anti-Defamation League’s (ADL) Center for Technology and Society (CTS) announced preliminary results from an innovative project that uses artificial intelligence, machine learning, and social science to study what is and what isn’t hate speech online. The project’s goal is to help the tech industry better understand the growing amount of hate online. CTS has collaborated with the University of California, Berkeley’s D-Lab since April 2017 to develop the Online Hate Index. ADL and the D-Lab have created an algorithm that has begun to learn the difference between hate speech and non-hate speech. The project has completed its first phase, and its early findings are described in a report released today. In a very promising finding, ADL and the D-Lab found that the learning model correctly identified hate speech between 78 percent and 85 percent of the time.
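    The report does not disclose the model behind the Online Hate Index, so the sketch below is only a generic illustration of the kind of text classifier involved: a bag-of-words pipeline trained on labeled comments (the toy data and the scikit-learn pipeline are assumptions, not the project’s actual method).

      # Generic text-classification sketch; NOT the ADL/D-Lab Online Hate Index model.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Hypothetical labeled comments: 1 = hate speech, 0 = not hate speech.
      comments = [
          "go back where you came from",
          "great game last night",
          "those people are subhuman",
          "looking forward to the weekend",
      ]
      labels = [1, 0, 1, 0]

      # Word/bigram features feeding a linear classifier, a common baseline for
      # this kind of task; a real system would train on far more labeled data.
      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
      model.fit(comments, labels)
      print(model.predict(["you people do not belong here", "nice weather today"]))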

  • Truth decay: Trump supporters, extreme right “share widest range of junk news”: Study

    On Twitter, a network of Donald Trump supporters shares the widest range of “junk news,” while on Facebook it is a network of extreme far-right conservatives that does so, according to analysis by Oxford University. The Oxford researchers find that on Twitter, a network of Trump supporters shares the widest range of junk news and circulates more junk news than all other political audience groups combined. On Facebook, extreme hard-right pages – distinct from Republican pages – both share the widest range and circulate the largest volume of junk news compared with all the other audiences. Specifically, a group of “hard conservatives” circulates the widest range of junk news and accounts for the majority of junk news traffic in the sample. Junk news sources are defined as those deliberately publishing misleading, deceptive, or incorrect information purporting to be real news about politics, economics, or culture. This type of content can include various forms of extremist, sensationalist, and conspiratorial material, as well as masked commentary and fake news.

  • The Russia connection: British government’s new “anti-fake news” unit has been tried before – and it got out of hand

    By Dan Lomas

    The decision to set up a new National Security Communications Unit to counter the growth of “fake news” is not the first time the UK government has devoted resources to exploiting the defensive and offensive capabilities of information. A similar thing was tried in the Cold War era, with mixed results. Details of the new anti-fake news unit are vague, but it may mark a return to Britain’s Cold War past and the work of the Foreign Office’s Information Research Department (IRD), which was set up in 1948 to counter Soviet propaganda. This secretive government body worked with politicians, journalists, and foreign governments to counter Soviet lies through unattributable “grey” propaganda and confidential briefings on “Communist themes.” IRD eventually expanded beyond this narrow anti-Soviet remit to protect British interests where they were likely “to be the object of hostile threats.” IRD’s rapid expansion from an anti-communist unit to a body protecting Britain’s interests across the globe also shows how hard it is to manage information campaigns. Moreover, government penny-pinching on defense – a key issue in current debates – could leave the new unit unable to match the resources at the disposal of the Russian state. In short, the lessons of IRD show that information work is not a quick fix. The British government could learn a lot by revisiting the past.

  • Truth decay: Declining trust in facts, institutions imposes real costs on U.S. society

    Americans’ reliance on facts to discuss public issues has declined significantly in the past two decades, leading to political paralysis and the collapse of civil discourse, according to a RAND report. This phenomenon, referred to as “Truth Decay,” is defined by increasing disagreement about facts, a blurring of the line between opinion and fact, an increase in the relative volume of opinion and personal experience over fact, and declining trust in formerly respected sources of factual information.

  • Truth decay: Responding to Truth Decay: Q&A with RAND’s Michael Rich and Jennifer Kavanagh

    Winston Churchill is reported to have said, “A lie gets halfway around the world before the truth can get its pants on.” Experts say it is worse now. With social media, false or misleading information is disseminated all over the world nearly instantaneously. Another thing that’s new about Truth Decay is the confluence of factors that are interacting in ways we do not fully understand yet. It is not clear that key drivers like our cognitive biases, polarization, changes in the information space, and the education system’s struggle to respond to this sort of challenge have ever coincided at such intensive and extreme levels as they do now. Russian disinformation and hacking campaigns against the United States and other Western democracies are the most obvious examples of the amplification – and exploitation – of Truth Decay. Garry Kasparov, the chess master and Russian dissident, said about Russian disinformation efforts: “The point of modern propaganda isn’t only to misinform or push an agenda. It is to exhaust your critical thinking … to annihilate truth.”

  • The Russia connection: Sputnik partner “required to register” under U.S. Foreign-Agent law

    State-supported Russian media outlet Sputnik says its U.S.-based partner company RIA Global LLC has been ordered to register as a foreign agent by the U.S. government. According to Sputnik, the Justice Department said that RIA Global produces content on behalf of state media company Rossiya Segodnya and thus on behalf of the Russian government.

  • The Russia connection: Russian influence in Mexican and Colombian elections

    By David Salvo

    Russia’s ongoing effort to destroy faith in democracy is not only a problem for the United States and Europe. The Kremlin has set its sights on destabilizing next year’s Mexican and Colombian elections, and has been strengthening its instruments of political influence in both countries. In 2015, John Kelly – now White House Chief of Staff, then Commander of U.S. Southern Command – warned that under President Vladimir Putin, Russia is “using power projection in an attempt to erode U.S. leadership and challenge U.S. influence in the Western Hemisphere.”

  • Floods: Twitter, citizen science, and AI help improve flood data collection

    Urban flooding is difficult to monitor due to complexities in data collection and processing. This prevents detailed risk analysis, flood control, and the validation of numerical models. Researchers are combining Twitter, citizen science, and cutting-edge artificial intelligence (AI) techniques to develop an early-warning system for flood-prone communities.
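    The researchers’ system is not described in detail above; as a minimal sketch of the general idea only, the snippet below filters geotagged tweets for flood-related keywords and flags map cells where reports pile up (the keyword list, grid size, and threshold are assumptions made for illustration).

      # Minimal sketch of crowd-sourced flood detection from tweets; the keyword
      # list, grid size, and alert threshold are assumptions, not the study's values.
      from collections import Counter

      FLOOD_TERMS = {"flood", "flooded", "flooding", "water rising"}
      ALERT_THRESHOLD = 20   # assumed number of reports needed per grid cell
      GRID = 0.05            # assumed cell size in degrees (roughly 5 km)

      def cell(lat, lon):
          """Bucket a coordinate into a coarse grid cell."""
          return (round(lat / GRID), round(lon / GRID))

      def flood_alerts(tweets):
          """Count flood-related tweets per cell and return cells over the threshold."""
          counts = Counter()
          for t in tweets:
              text = t["text"].lower()
              if any(term in text for term in FLOOD_TERMS):
                  counts[cell(*t["coords"])] += 1
          return [c for c, n in counts.items() if n >= ALERT_THRESHOLD]

      # Example: 25 similar reports clustered around one location trigger an alert.
      reports = [{"text": "Street is flooding fast", "coords": (51.501, -0.12)}] * 25
      print(flood_alerts(reports))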

  • The Russia connection: Lawmakers from states targeted by Russian hackers urge action to protect U.S. elections

    The other day, Democracy Reform Task Force Chair Rep. John Sarbanes (D-Maryland), along with members of Congress from 18 of the 21 states targeted by Russian hackers in 2016, called on House Speaker Paul Ryan to take immediate action to protect state voting systems from cyberattacks and to bolster state election infrastructure.

  • Social and political bots: The influence and risk of social and political “bots”

    The role and risks of bots, such as automated Twitter accounts, in influencing public opinion and political elections continue to provoke intense international debate and controversy. A new collection of articles focused on “Computational Propaganda and Political Big Data” examines how these bots work, approaches to better detect and control them, and how they may have impacted recent elections around the globe. The collection is published in a special issue of Big Data.

  • The Russia connection: Spotting Russian bots trying to influence politics

    A team of researchers has isolated the characteristics of bots on Twitter through an examination of bot activity related to Russian political discussions. The team’s findings provide new insights into how Russian accounts influence online exchanges using bots, or automated social media accounts, and trolls, which aim to provoke or disrupt. “There is a great deal of interest in understanding how regimes and political actors use bots in order to influence politics,” explains one researcher. “Russia has been at the forefront of trying to shape the online conversation using tools like bots and trolls, so a first step to understanding what Russian bots are doing is to be able to identify them.”
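    The team’s actual feature set is not listed above; as a hedged illustration of what “isolating the characteristics of bots” can look like in practice, the sketch below scores an account on a few features commonly cited in bot-detection work (the features, weights, and threshold are assumptions, not the researchers’ published method).

      # Hedged illustration of feature-based bot scoring; features and weights are
      # assumptions, not the researchers' method.
      from dataclasses import dataclass

      @dataclass
      class Account:
          tweets_per_day: float
          followers: int
          following: int
          account_age_days: int
          default_profile_image: bool

      def bot_score(a: Account) -> float:
          """Crude heuristic score in [0, 1]; higher means more bot-like."""
          score = 0.0
          if a.tweets_per_day > 100:                            # hyperactive posting
              score += 0.35
          if a.following and a.followers / a.following < 0.1:   # follows far more than it is followed
              score += 0.25
          if a.account_age_days < 30:                           # very new account
              score += 0.20
          if a.default_profile_image:                           # no profile customization
              score += 0.20
          return min(score, 1.0)

      suspect = Account(tweets_per_day=240, followers=12, following=900,
                        account_age_days=10, default_profile_image=True)
      print(round(bot_score(suspect), 2))   # 1.0, flagged as likely automated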

  • Vaccination: Social media trends can predict vaccine-scare tipping points

    Analyzing trends on Twitter and Google can help predict vaccine scares that can lead to disease outbreaks, according to a new study. Researchers examined Google searches and geocoded tweets with the help of artificial intelligence and a mathematical model. The resulting data enabled them to analyze public perceptions of the value of getting vaccinated and to determine when a population was getting close to a tipping point.
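    The study’s coupled model of sentiment and vaccinating behavior is more involved than this summary; as a minimal sketch of one ingredient only, the snippet below computes a rolling variance over a sentiment time series, since rising variance is a classic early-warning signal that a system is approaching a tipping point (the window size and the toy data are assumptions).

      # Illustrative early-warning-signal sketch, not the study's actual model.
      import numpy as np

      def rolling_variance(series, window=30):
          """Variance over a sliding window; a sustained rise is one classic
          early-warning signal of an approaching tipping point."""
          series = np.asarray(series, dtype=float)
          return [float(np.var(series[i - window:i])) for i in range(window, len(series) + 1)]

      # Hypothetical daily anti-vaccine sentiment fraction that grows noisier over
      # time, as it might ahead of a vaccine scare.
      rng = np.random.default_rng(0)
      sentiment = [0.2 + rng.normal(0.0, 0.01 + 0.0006 * t) for t in range(120)]

      var = rolling_variance(sentiment, window=30)
      print(f"rolling variance, first vs. last window: {var[0]:.5f} -> {var[-1]:.5f}")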