Social networks | Homeland Security Newswire

  • Using AI, machine learning to understand extent of online hate

The Anti-Defamation League’s (ADL) Center for Technology and Society (CTS) announced preliminary results from an innovative project that uses artificial intelligence, machine learning, and social science to study what is and what isn’t hate speech online. The project’s goal is to help the tech industry better understand the growing amount of hate online. CTS has collaborated with the University of California, Berkeley’s D-Lab since April 2017 to develop the Online Hate Index. ADL and the D-Lab have created an algorithm that has begun to learn the difference between hate speech and non-hate speech. The project has completed its first phase, and its early findings are described in a report released today. In a very promising finding, ADL and the D-Lab found that the learning model reliably identified hate speech between 78 percent and 85 percent of the time.
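The report does not specify the model behind the Online Hate Index; as a generic illustration of the kind of supervised text classification described, here is a minimal Naive Bayes sketch in stdlib Python. The labels, training data, and smoothing parameter are invented for illustration, not taken from the ADL/D-Lab system.

```python
from collections import Counter, defaultdict
import math

def train(examples):
    """examples: iterable of (text, label) pairs.
    Returns per-label word counts and per-label document counts."""
    word_counts = defaultdict(Counter)
    doc_counts = Counter()
    for text, label in examples:
        word_counts[label].update(text.lower().split())
        doc_counts[label] += 1
    return word_counts, doc_counts

def classify(word_counts, doc_counts, text, alpha=1.0):
    """Multinomial Naive Bayes with Laplace smoothing; returns the most likely label."""
    vocab = {w for counter in word_counts.values() for w in counter}
    total_docs = sum(doc_counts.values())
    def score(label):
        s = math.log(doc_counts[label] / total_docs)   # log prior
        n = sum(word_counts[label].values())           # words seen for this label
        for w in text.lower().split():
            s += math.log((word_counts[label][w] + alpha) / (n + alpha * len(vocab)))
        return s
    return max(doc_counts, key=score)
```

A real system would need far richer features (context, obfuscated slurs, sarcasm) and human-labeled data at scale, which is exactly the part the ADL/D-Lab collaboration brings.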

  • Trump supporters, extreme right “share widest range of junk news”: Study

According to analysis by Oxford University, a network of Donald Trump supporters shares the widest range of “junk news” on Twitter, while on Facebook that distinction belongs to a network of extreme far-right conservatives. The Oxford researchers find that on Twitter, a network of Trump supporters shares the widest range of junk news and circulates more junk news than all other political audience groups combined. On Facebook, extreme hard-right pages – distinct from Republican pages – both share the widest range and circulate the largest volume of junk news compared with all the other audiences. Specifically, a group of “hard conservatives” circulates the widest range of junk news and accounts for the majority of junk news traffic in the sample. Junk news sources are defined as those that deliberately publish misleading, deceptive, or incorrect information purporting to be real news about politics, economics, or culture. This type of content can include various forms of extremist, sensationalist, and conspiratorial material, as well as masked commentary and fake news.

  • British government’s new “anti-fake news” unit has been tried before – and it got out of hand

The decision to set up a new National Security Communications Unit to counter the growth of “fake news” is not the first time the UK government has devoted resources to exploiting the defensive and offensive capabilities of information. A similar thing was tried in the Cold War era, with mixed results. Details of the new anti-fake news unit are vague, but it may mark a return to Britain’s Cold War past and the work of the Foreign Office’s Information Research Department (IRD), which was set up in 1948 to counter Soviet propaganda. This secretive government body worked with politicians, journalists, and foreign governments to counter Soviet lies through unattributable “grey” propaganda and confidential briefings on “Communist themes.” IRD eventually expanded beyond this narrow anti-Soviet remit to protect British interests wherever they were likely “to be the object of hostile threats.” That rapid expansion from anti-communist unit to global guardian of Britain’s interests also shows how hard it is to manage information campaigns. Moreover, government penny-pinching on defense – a key issue in current debates – could fail to match the resources at the disposal of the Russian state. In short, the lessons of IRD show that information work is not a quick fix. The British government could learn a lot by visiting the past.

  • Declining trust in facts, institutions imposes real costs on U.S. society

    Americans’ reliance on facts to discuss public issues has declined significantly in the past two decades, leading to political paralysis and collapse of civil discourse, according to a RAND report. This phenomenon, referred to as “Truth Decay,” is defined by increasing disagreement about facts, a blurring between opinion and fact, an increase in the relative volume of opinion and personal experience over fact, and declining trust in formerly respected sources of factual information.

  • Responding to Truth Decay: Q&A with RAND’s Michael Rich and Jennifer Kavanagh

    Winston Churchill is reported to have said, “A lie gets halfway around the world before the truth can get its pants on.” Experts say it is worse now. With social media, false or misleading information is disseminated all over the world nearly instantaneously. Another thing that’s new about Truth Decay is the confluence of factors that are interacting in ways we do not fully understand yet. It is not clear that key drivers like our cognitive biases, polarization, changes in the information space, and the education system’s struggle to respond to this sort of challenge have ever coincided at such intensive and extreme levels as they do now. Russian disinformation and hacking campaigns against the United States and other Western democracies are the most obvious examples of the amplification – and exploitation – of Truth Decay. Garry Kasparov, the chess master and Russian dissident, said about Russian disinformation efforts: “The point of modern propaganda isn’t only to misinform or push an agenda. It is to exhaust your critical thinking … to annihilate truth.”

  • Sputnik partner “required to register” under U.S. Foreign-Agent law

    State-supported Russian media outlet Sputnik says its U.S.-based partner company RIA Global LLC has been ordered to register as a foreign agent by the U.S. government. According to Sputnik, the Justice Department said that RIA Global produces content on behalf of state media company Rossiya Segodnya and thus on behalf of the Russian government.

  • Russian influence in Mexican and Colombian elections

Russia’s ongoing effort to destroy faith in democracy is not only a problem for the United States and Europe. The Kremlin has set its sights on destabilizing next year’s Mexican and Colombian elections, and has been strengthening its instruments of political influence in both countries. In 2015, John Kelly – now White House chief of staff, then Commander of U.S. Southern Command – warned that under President Vladimir Putin, Russia is “using power projection in an attempt to erode U.S. leadership and challenge U.S. influence in the Western Hemisphere.”

  • Twitter, citizen science, and AI help improve flood data collection

Urban flooding is difficult to monitor due to complexities in data collection and processing. This prevents detailed risk analysis, flood control, and the validation of numerical models. Researchers are combining Twitter, citizen science, and cutting-edge artificial intelligence (AI) techniques to develop an early-warning system for flood-prone communities.
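The article does not describe the system’s internals; as a rough sketch of the general idea – flagging an area when geotagged flood-related tweets cluster in a short time window – here is a minimal stdlib Python example. The keyword list, window length, and alert threshold are all illustrative assumptions.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

# Hypothetical keyword filter; a real system would use a trained classifier.
FLOOD_TERMS = {"flood", "flooding", "flooded", "waterlogged"}

class FloodAlertMonitor:
    def __init__(self, window_minutes=60, threshold=3):
        self.window = timedelta(minutes=window_minutes)
        self.threshold = threshold
        self.hits = defaultdict(deque)  # area -> timestamps of flood-related tweets

    def ingest(self, area, text, timestamp):
        """Record one geotagged tweet; return True if the area crosses the alert threshold."""
        if not FLOOD_TERMS & set(text.lower().split()):
            return False  # not flood-related
        q = self.hits[area]
        q.append(timestamp)
        while q and timestamp - q[0] > self.window:
            q.popleft()  # drop reports outside the sliding window
        return len(q) >= self.threshold
```

In practice, the AI component would replace the keyword filter (to reject jokes and metaphors) and the thresholds would be calibrated against gauge data.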

  • Lawmakers from states targeted by Russian hackers urge action to protect U.S. elections

Democracy Reform Task Force Chair Rep. John Sarbanes (D-Maryland), joined by members of Congress from 18 of the 21 states targeted by Russian hackers in 2016, recently called on House Speaker Paul Ryan to take immediate action to protect state voting systems from cyberattacks and to bolster state election infrastructure.

• The influence and risk of social and political “bots”

    The role and risks of bots, such as automated Twitter accounts, in influencing public opinion and political elections continues to provoke intense international debate and controversy. A new collection of articles focused on “Computational Propaganda and Political Big Data” examines how these bots work, approaches to better detect and control them, and how they may have impacted recent elections around the globe. The collection is published in a special issue of Big Data.

  • Spotting Russian bots trying to influence politics

    A team of researchers has isolated the characteristics of bots on Twitter through an examination of bot activity related to Russian political discussions. The team’s findings provide new insights into how Russian accounts influence online exchanges using bots, or automated social media accounts, and trolls, which aim to provoke or disrupt. “There is a great deal of interest in understanding how regimes and political actors use bots in order to influence politics,” explains one researcher. “Russia has been at the forefront of trying to shape the online conversation using tools like bots and trolls, so a first step to understanding what Russian bots are doing is to be able to identify them.”
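The researchers’ actual feature set is not given in the summary; as a toy illustration of feature-based bot scoring, here is a hedged Python sketch using commonly cited heuristics (posting volume, follower/following imbalance, an uncustomized profile). The thresholds and weights are invented for illustration and are not the study’s method.

```python
def bot_score(account):
    """Heuristic score in [0, 1]; higher suggests automation.
    account: dict with keys tweets_per_day, followers, following, default_profile."""
    score = 0.0
    if account["tweets_per_day"] > 50:
        score += 0.4  # sustained high-volume posting is hard for a human
    if account["followers"] < 0.1 * max(account["following"], 1):
        score += 0.3  # follows many accounts but attracts few followers
    if account.get("default_profile", False):
        score += 0.3  # never customized the profile page
    return score
```

Real detectors (and the study’s analysis) use many more signals – timing patterns, content similarity, network structure – and learn the weights from labeled accounts rather than hand-picking them.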

• Social media trends can predict vaccine scare tipping points

Analyzing trends on Twitter and Google can help predict vaccine scares that can lead to disease outbreaks, according to a new study. Researchers examined Google searches and geocoded tweets with the help of artificial intelligence and a mathematical model. The resulting data enabled them to analyze public perceptions of the value of getting vaccinated and determine when a population was getting close to a tipping point.
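One common way to anticipate tipping points in time-series data – not necessarily the method this study used – is to watch for “critical slowing down”: lag-1 autocorrelation in the signal rises as a system approaches a transition. A minimal stdlib Python sketch of that indicator (the window size and input series are illustrative assumptions):

```python
def lag1_autocorr(series):
    """Lag-1 autocorrelation of a numeric sequence; near a tipping point it tends to rise."""
    n = len(series)
    mean = sum(series) / n
    num = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in series)
    return num / den if den else 0.0

def warning_signal(series, window=10):
    """Lag-1 autocorrelation over sliding windows; a rising trend is an early warning."""
    return [lag1_autocorr(series[i:i + window]) for i in range(len(series) - window + 1)]
```

Applied to, say, a daily index of anti-vaccine sentiment derived from tweets, a sustained rise in this signal would flag a population drifting toward a scare before the scare itself is visible.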

  • Why the president’s anti-Muslim tweets could increase tensions

Last week, President Trump retweeted to his nearly 44 million followers a series of videos purporting to show Muslims assaulting people and destroying Christian statues. These videos, originally shared by an extremist anti-Muslim group in the U.K., were shown to be inaccurate and misleading. Our research may shed light on why President Trump shared anti-Muslim videos with his followers. As the White House press secretary said, his decision was a direct response to a perceived threat posed by Muslims. However, religious threat is not a one-way street. Attacking Muslims is not likely to stop religious conflict; instead, it is likely to increase religious tension by fostering a spiraling tit-for-tat of religious threat and prejudice that grows in severity over time. This type of cyclical process has long been documented by scholars. If people who feel discriminated against because of their religion retaliate by discriminating against other religions, religious intolerance is only going to snowball. If President Trump really wants to stop religious violence, social psychology suggests he should refrain from such attacks himself.

  • Russian-operated bots posted millions of social media posts, fake stories during Brexit referendum

More than 156,000 Twitter accounts operated by Russian government disinformation specialists posted nearly 45,000 messages in support of the “Leave” campaign, urging British voters to vote for Brexit – that is, for Britain to leave the European Union. Researchers compared 28.6 million Russian tweets in support of Brexit to approximately 181.6 million Russian tweets in support of the Trump campaign, and found close similarity in tone and tactics between the Russian government’s U.K. and U.S. efforts. In both cases, the Russian accounts posted divisive, polarizing messages and fake stories aiming to raise fears about Muslims and immigrants. The goal was to sow discord; intensify rancor and animosity along racial, ethnic, and religious lines; and deepen political polarization – not only to help create a public climate more receptive to the populist, protectionist, nationalist, and anti-Muslim thrust of both the Brexit and Trump campaigns, but also to deepen societal and cultural fault lines in the United Kingdom and the United States, thus weakening both societies from within.

  • Anatomy of a fake news scandal

On 1 December 2016, Alex Jones, the InfoWars host, a peddler of conspiracy theories, and a fervent Trump booster, was reporting that Hillary Clinton was sexually abusing children in satanic rituals in the basement of a Washington, D.C., pizza restaurant. How was this fake story fabricated and disseminated? “We found ordinary people, online activists, bots, foreign agents and domestic political operatives,” Reveal’s researchers say. “Many of them were associates of the Trump campaign. Others had ties with Russia. Working together – though often unwittingly – they flourished in a new ‘post-truth’ information ecosystem, a space where false claims are defended as absolute facts. What’s different about Pizzagate, says Samuel Woolley, a leading expert in computational propaganda, is it was ‘retweeted and picked up by some of the most powerful faces of American politics’.”