  • Minds, the “anti-Facebook,” has no idea what to do about all the neo-Nazis

    Minds is home to neo-Nazis, and wants its users to help decide what content stays on the site. Ben Makuch and Jordan Pearson write in Motherboard that Minds is a US-based social network that bills itself as being focused on transparency (its code is open source), free speech, and cryptocurrency rewards for users. Much of the recent media coverage around Minds, which launched in 2015, has focused on how it challenges social media giants and its adoption of cryptocurrency, while also noting that the site’s light-touch approach to content moderation has led to a proliferation of far-right viewpoints being shared openly on its platform.

  • Facebook’s dystopian definition of “fake”

    Every time another “fake video” makes the rounds, its menace gets rehashed without those discussing it establishing what “fakeness” means in the first place. The latest one came last week, a doctored video of Nancy Pelosi. President Donald Trump tweeted a reference to the video; his personal attorney Rudy Giuliani shared it, too, although Giuliani later deleted his post. Ian Bogost writes in The Atlantic that these sorts of events are insidious because it’s hard to form a response that isn’t a bad one. Talking about the video just gives its concocted message more oxygen. Ignoring it risks surrendering truth to the ignorant whims of tech companies. The problem is, a business like Facebook doesn’t believe in fakes. For it, a video is real so long as it’s content. And everything is content.

  • Unknowingly loading malicious content from “trusted” sites

    New research from CSIRO’s Data61, the data and digital specialist arm of Australia’s national science agency, questions the “trustability” of websites and, in a world first, quantifies the extent to which the trust model of today’s World Wide Web is fundamentally broken.

  • Doctored video of Nancy Pelosi shows social media giants ill-prepared for 2020

    Hours after House Speaker Nancy Pelosi addressed a conference Wednesday, a distorted video of the California Democrat’s conversation began spreading across the internet. The manipulated clip, slowed to make Pelosi sound as if she were slurring her words, racked up millions of views on Facebook the following day. It was posted to YouTube, and on Thursday night was given a boost on Twitter when Rudy Giuliani, President Trump’s personal lawyer and former mayor of New York, shared a link with his 318,000 followers. Sam Dean and Suhauna Hussain write in the Los Angeles Times that by Friday, the three social media giants were forced to respond to this viral instance of political fakery. How they dealt with the issue, three years after being blindsided by a wave of fake news and disinformation in the 2016 election cycle, may serve as a harbinger of what’s to come in 2020.

  • The many faces of foreign interference in European elections

    Citizens of the European Union’s 28 member states go to the polls this week to choose their representatives to the European Parliament. Following Russian interference in several high-profile elections over the past three years, European governments are on high alert for signs of such meddling on social media or in electoral IT systems. Recent events in Austria and Italy show that foreign authoritarian actors are finding other under-examined, but equally insidious ways to infiltrate campaigns and harm democracy in Europe.

  • The Kremlin’s “tools of malign political influence” undermine democracy

    Russia’s “sweeping and systematic malign influence operations” support anti-democratic and anti-Western forces in Europe and the United States, using a variety of tools, from corruption to influence operations, said Heather A. Conley, CSIS senior vice president for Europe, Eurasia, and the Arctic, and director of the Europe Program, in testimony before the House Foreign Affairs Subcommittee on Europe, Eurasia, Energy, and the Environment, during hearings on “Undermining Democracy: Kremlin Tools of Malign Political Influence.” “The Kremlin undermines and weakens democracies, rendering them unable to respond promptly to Russian military actions or making them beholden to the Kremlin to such a point that a democratic country will support Russia’s interests over its own,” she testified. She highlighted two specific areas in which she is “particularly concerned U.S. citizens and organizations, wittingly or unwittingly, will come under increasing threat of Russian malign influence”: (1) faith-based and ultra-conservative organizations; and (2) opaque financial support for key U.S. influencers.

  • Sprawling disinformation networks discovered across Europe ahead of EU elections

    An investigation has uncovered a flood of disinformation aiming to influence the forthcoming EU elections. The revelations led Facebook to take down pages with more than 500 million views. The mainly far-right disinformation pages shut down by Facebook had three times as many followers as the pages of more established right-wing, populist, and anti-EU parties such as Lega (Italy), Alternative für Deutschland (AfD) (Germany), VOX (Spain), the Brexit Party (U.K.), Rassemblement National (France), and PiS (Poland).

  • Tweets reveal how ISIS still inspires low-level attacks

    By analyzing 26.2 million Arabic-language Twitter comments, researchers found that despite losing territory, ISIS remains successful at inspiring low-level attacks through its messaging calling for “lone jihad.”

  • Cyber-enabled election interference occurs in one-fifth of democracies

    Cyber-enabled election interference has already changed the course of history. Fergus Hanson and Elise Thomas write in The Strategist that whether or not the Russian interference campaign during the US 2016 federal election was enough to swing the result, the discovery and investigation of the campaign and its negative effects on public trust in the democratic process have irrevocably shaped the path of Donald Trump’s presidency.

  • Hacking democracies

    A new report from an Australian think tank offers an in-depth, and sobering, analysis of Russia’s campaign to undermine Western democracies by weaponizing social media, and, to a lesser extent, China’s similar, if lower-key, campaign against neighboring Asian countries. “Democracies need to look at better ways of imposing costs on adversaries,” the report’s authors say.

  • Eric Oliver on the science of conspiracy theories and political polarization

    The “birthers,” “Pizzagate,” anti-vaxxers. It seems that belief in conspiracy theories is on the rise. At the same time, our polarization is worse than ever. People can hardly even maintain a conversation across political or cultural lines. Could the underlying force driving conspiracy theories also be the same one that’s dividing our country?

  • Facebook, Twitter and the digital disinformation mess

    The kind of disinformation now known as fake news has tainted public discourse for centuries, even millennia. But it’s been amplified in our digital age as a weapon of fearmongers, mob-baiters and election-meddlers that can widen social fissures, undermine democracies and bolster authoritarian regimes. Shelly Banjo writes in the Washington Post that as voters in some of the world’s most-populous countries headed to the polls in 2019, governments began to respond. Companies such as Facebook, Twitter and Google have come under increasing pressure to take action.

  • Google cuts Huawei access to Android software updates

    Google said on Sunday it was rescinding Huawei’s license to use Google’s mobile phone operating system Android and Google services such as Google Maps and YouTube. The move will force the Chinese technology company to rely on an open-source version of the software, and follows a presidential executive order prohibiting American companies from using telecommunications equipment made by “foreign adversaries” viewed as posing a threat to U.S. national security.

  • Why the Christchurch call to remove online terror content triggers free speech concerns

    France and New Zealand spearheaded the adoption on May 15 of the Christchurch Call to Eliminate Terrorist & Violent Extremist Content Online, a voluntary pledge endorsed by 18 countries and many tech companies (including Microsoft, Google, Facebook, and Twitter). The United States refused to join, citing free speech concerns. The Christchurch Call was named after the city in New Zealand where a horrific terrorist attack killed 51 people and injured 50 at two mosques in March. That massacre was live-streamed on Facebook, spreading quickly on that platform as well as other social media sites and raising concerns about how such content goes viral. Evelyn Aswad writes in Just Security that U.S. isolation amid close allies on this initiative has prompted questions about the First Amendment hurdles that prevented the U.S. from joining the pledge, especially given that it constitutes a political commitment rather than a legally binding document.

  • Bolstering cyber resilience

    In December 2015, the first known successful cyberattack on a power grid was carried out in Ukraine, disrupting the electricity supply for hundreds of thousands of customers for several hours. Since then, concerns have grown across the globe about the potential public health, economic, and security impacts of widespread power outages in heavily populated regions. Argonne is partnering with the World Economic Forum in an important cyber resilience effort.