• Separating factual from fake messages during a crisis

    How well can you tell facts from fake on social media? How about in a crisis? DHS S&T, together with Canadian partners, concluded the fifth Canada-U.S. Enhanced Resiliency Experiment (CAUSE V) event last year, running drills involving the hypothetical eruption of Mt. Baker, an active volcano in the Pacific Northwest. As part of the simulation, a group of digital disaster services volunteers practiced separating fact from fiction on the web, with the mission of keeping responders informed during the event.

  • Propagating online conspiracies

    Thanks to the Internet, conspiracy theories are on the rise and playing an increasingly significant role in global politics. Now new research has analyzed digital data to reveal exactly who is propagating them and why. The researchers said that conspiracies such as Pizzagate (which falsely claimed high-ranking Democratic Party officials were running a child-sex ring out of a pizza shop) and the anti-vaccination movement are becoming a bigger issue.

  • Why you stink at fact-checking

    People are very bad at picking up on factual errors in the world around them. Research from cognitive psychology shows that people are naturally poor fact-checkers and it is very difficult for us to compare things we read or hear to what we already know about a topic. In what’s been called an era of “fake news,” this reality has important implications for how people consume journalism, social media and other public information.

  • Privacy of Americans not protected in omnibus spending bill

    The CLOUD Act, inserted at the very end of the 2,232-page omnibus spending bill, will make substantial amendments to the Electronic Communications Privacy Act (ECPA). It grants U.S. law enforcement entities new powers to compel U.S. companies to disclose communications and data on U.S. and foreign users that are stored overseas. It also empowers foreign governments to demand the stored and real-time data and communications of users outside the U.S.

  • Why junk news spreads so quickly across social media

    Why and how has the rise of social media contributed to the spread of what we at Oxford University’s Computational Propaganda Project call “junk news” — the tabloidization, false content, conspiracy theories, and political propaganda we have become all too familiar with? Three reasons: algorithms, advertising, and exposure in public life.

  • Cambridge Analytica: the data analytics industry is already in full swing

    Revelations about Cambridge Analytica have laid bare the seeming lack of control that we have over our own data. Suddenly, with all the talk of “psychographics” and voter manipulation, the power of data analytics has become the source of some concern. But the risk is that if we look at the case of Cambridge Analytica in isolation, we might foreclose a much wider debate about the use and control of our data. By focusing on the reports of extreme practices, we might miss the many everyday ways that data analytics are now shaping our lives.

  • Cambridge Analytica’s abuse of Facebook user data shows “profound impact of technology on democracy”

    Facebook has suspended Cambridge Analytica from its platform for violating its guidelines on the use of user data. The Center for Democracy and Technology (CDT) says that a weekend New York Times article further illuminated the scale of Cambridge Analytica’s efforts and showed how the company used personal information about users to conduct targeted political outreach. “These revelations illustrate the profound impact internet platforms can have on democracy,” CDT says.

  • New U.S. sanctions on Russia for election interference, infrastructure cyberattacks, NotPetya

    The U.S. Treasury has issued its strongest sanctions yet against Russia in response to what it called “ongoing nefarious attacks.” The move targets five entities and nineteen individuals. Among the institutions targeted in the new sanctions for election meddling were Russia’s top intelligence services, the Federal Security Service (FSB) and the Main Intelligence Directorate (GRU). The hackers, disinformation specialists, and outside contractors of these two organizations — such as the Internet Research Agency (IRA) troll farm — were behind, and are still engaged in, a broad and sustained campaign to undermine U.S. democracy.

  • To stop fake news, internet platforms should choose quality over quantity: Study

    “Fake news” has made headlines and dominated social media chatter since the 2016 presidential election. It appears to be everywhere, and researchers are still determining the scale of the problem. A new study examines fake news and its prevalence and impact across Google, Facebook, and Twitter. The authors offer recommendations for stemming the flow and influence of fake news, and in particular call for more interdisciplinary research—including more collaboration between internet platforms and academia — “to reduce the spread of fake news and to address the underlying pathologies it has revealed.”

  • Study: On Twitter, false news travels faster than true stories

    A new study by three MIT scholars has found that false news spreads more rapidly on the social network Twitter than real news does — and by a substantial margin. “We found that falsehood diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information, and in many cases by an order of magnitude,” says one researcher. “These findings shed new light on fundamental aspects of our online communication ecosystem,” says another researcher, adding that the researchers were “somewhere between surprised and stunned” at the different trajectories of true and false news on Twitter.

  • Large-scale scientific investigation required to combat fake news: Researcher

    The indictment of 13 Russians in the operation of a troll farm that spread false information related to the 2016 U.S. presidential election has renewed the spotlight on the power of fake news to influence public opinion. Researchers call for a coordinated investigation into the underlying social, psychological and technological forces behind fake news. This is necessary to counteract the phenomenon’s negative influence on society, the researchers say.

  • Extremists exploit gun control debate to promote hatred of Jews

    White supremacists are attempting to exploit the tragic mass shooting in Parkland, Florida, and the ensuing debate over gun control to push an anti-Semitic agenda. Many of these white supremacists are publicly framing the battle over gun control as a struggle between beleaguered whites who want to preserve their traditions in the face of a Jewish onslaught. The ADL says that white supremacists’ anti-Semitic attacks intensified in the wake of NRA head Wayne LaPierre’s 22 February speech to CPAC. LaPierre, perhaps unknowingly, used terms which are buzzwords white supremacists associate with Jews, such as “European-style socialists.” LaPierre said, “A tidal wave of new European-style socialists [has seized] control of the Democratic party.” The only people LaPierre mentioned as examples of people using “social engineering” to try to take away the guns and freedoms of Americans were two Jewish businessmen, Michael Bloomberg and George Soros.

  • Atomwaffen, extremist group whose members have been charged in five murders, loses some of its platforms

    At least four technology companies have taken steps to bar Atomwaffen Division, a violent neo-Nazi organization, from using their online services and platforms to spread its message or fund its operations. The action comes after ProPublica reports detailing the organization’s terrorist ambitions and revealing that the California man charged with murdering Blaze Bernstein, a 19-year-old college student found buried in an Orange County park earlier this year, was an Atomwaffen member.

  • Russia used social media extensively to influence U.S. energy markets: Congressional panel

    The U.S. House Science, Space, and Technology Committee last week released a staff report uncovering Russia’s extensive efforts to influence U.S. energy markets through divisive and inflammatory posts on social media platforms. The report details Russia’s motives in interfering with U.S. energy markets and influencing domestic energy policy and its manipulation of Americans via social media propaganda. The report includes examples of Russian-propagated social media posts.

  • New challenge for first responders: Fake News

    First responders must find ways to address a new challenge: Not only do they have to deal with floods, storms, fires, earthquakes, active shooter events, and other natural and manmade crises – now they also have to find ways to deal with fake news. Social media may disseminate valuable and helpful information during disasters and extreme events – but it may also be used to spread fake news: disinformation and misinformation about the scope, nature, sources, and location of a disaster or extreme incident. Such misinformation may confuse not only victims and potential victims, but also the first responders who rush to their rescue.