• I Researched Uighur Society in China for 8 Years and Watched How Technology Opened New Opportunities – Then Became a Trap

    The Uighurs, a Muslim minority ethnic group of around 12 million in northwest China, are required by the police to carry their smartphones and IDs listing their ethnicity. As they pass through one of the thousands of newly built digital media and face surveillance checkpoints located at jurisdictional boundaries, entrances to religious spaces and transportation hubs, the image on their ID is matched to their face. If they try to pass without these items, a digital device scanner alerts the police. The Chinese state authorities described the intrusive surveillance as a necessary tool against the “extremification” of the Uighur population. Through this surveillance process, around 1.5 million Uighurs and other Muslims were deemed “untrustworthy” and have been forcibly sent to detention and reeducation in a massive internment camp system. Since more than 10 percent of the adult population has been removed to these camps, hundreds of thousands of children have been separated from their parents. Many children throughout the region are now held in boarding schools or orphanages run by non-Muslim state workers.

  • Determining the Who, Why, and How Behind Manipulated Media

    The threat of manipulated multi-modal media – which includes audio, images, video, and text – is increasing as automated manipulation technologies become more accessible, and social media continues to provide a ripe environment for viral content sharing. The creators of convincing media manipulations are no longer limited to groups with significant resources and expertise. Today, an individual content creator has access to capabilities that could enable the development of an altered media asset that creates a believable, but falsified, interaction or scene. A new program seeks to develop technologies capable of automating the detection, attribution, and characterization of falsified media assets.

  • AI Startups to Fight Against Online Disinformation

    On both sides of the Atlantic, governments, foundations, and companies are looking at how to solve the problem of online dis/misinformation. Some emphasize the demand side of the problem, believing it important to focus on consumer behavior and the use of media literacy and fact-checking. Some focus on legal remedies such as platform-liability and hate-speech laws as well as privacy protections. Others try to raise the quality of journalism in the hope of creating more reliable content. There is another kind of fix, offered by small companies in the information ecosystem: using natural language processing as well as human intelligence to identify and, in some cases, block false or inflammatory content online.

  • How to Act against Domestic Terrorists — and Their Foreign Supporters

    The United States faces a surging domestic terrorism threat in the homeland. In the aftermath of the El Paso and Dayton shootings in the first weekend of August, more than 40 people were arrested for threats to commit mass attacks by the end of that month. The GW Program on Extremism suggests two ways to achieve a more effective and coordinated multisector response to the domestic terrorism threat. First, criminal statutes for domestic terrorism offenses need to be enacted that penalize the commission of specific violent crimes. Acknowledging concerns that new criminal statutes related to property damage may stifle legitimate protest, new criminal statutes could be limited to violence against persons and providing material support to terrorists. Second, the list of proscribed foreign terrorist organizations (FTOs) should include far-right actors outside of the United States.

  • How Disinformation Could Sway the 2020 Election

    In 2016, Russian operatives used Facebook, Twitter and YouTube to sow division among American voters and boost Donald Trump’s presidential campaign. What the Russians used to accomplish this is called “disinformation,” which is false or misleading content intended to deceive or promote discord. Now, with the first presidential primary vote only five months away, the public should be aware of the sources and types of online disinformation likely to surface during the 2020 election.

  • Foreign Interference Threat Bigger than Terrorism, Warns Spymaster

    Foreign interference and hostile state espionage are a bigger threat to Australia’s security than terrorism, one of the country’s top spy chiefs has warned. Duncan Lewis, the outgoing head of the Australian Security Intelligence Organisation (ASIO), identified three security challenges confronting Australians: terrorism, cyber warfare, and foreign interference and espionage. The last of these, he told a Lowy Institute forum in Sydney, is on a “growth trajectory” and poses a greater threat than terrorism.

  • Lega Nord’s Bedfellows: Russians Offering Illicit Funding to Italian Far-Right Party Identified

    In the last four years, the Kremlin has engaged in a broad, systematic campaign – consisting of hacking, a vast social media disinformation effort, and illicit funding – to weaken the West by helping far-right, populist, pro-Russian politicians and movements reach power. One of their successes was in Italy, where the far-right, anti-EU, anti-immigrant Northern League and the eclectic, anti-establishment 5 Star Movement won enough seats in the March 2018 election to form a coalition government (which collapsed last week, after more than a year in power). Prosecutors in Milan have launched an investigation of The League after recordings emerged of meetings between League leaders and Kremlin emissaries, in which a scheme to secure funding for The League in the upcoming European parliament elections was discussed. The funding – in the millions of Euro – was to be funneled via artificially underpriced Russian oil export transactions.

  • The BBC Joins Up with Google, Facebook, and Twitter to Try to Tackle Misinformation Online

    The BBC is teaming up with some of the biggest names in tech to coordinate a defense against the online disinformation campaigns endemic to some of their platforms, the outlet announced Saturday. Google, Twitter, and Facebook said that they, and the BBC, would come up with a targeted approach which, in part, uses an early warning system during critical periods when the spread of misinformation “threatens human life or disrupts democracy during election,” per the BBC.

  • NOAA’s Chief Scientist Will Investigate Why Agency Backed Trump over Its Experts on Dorian, Email Shows

    NOAA’s acting chief scientist said he was investigating whether the agency’s response to President Trump’s incorrect claims about the risk Hurricane Dorian posed to Alabama constituted a violation of NOAA policies and ethics. Forecasters in the Birmingham, Alabama office of the National Weather Service (NWS) immediately corrected Trump’s initial (1 September) false claims, but five days later (6 September), two NOAA administrators issued an unsigned press release which appeared to lend support to Trump’s claims. “The NWS Forecaster(s) corrected any public misunderstanding in an expert and timely way, as they should,” the chief scientist, Craig McLean, wrote to NOAA employees. “There followed, last Friday, an unsigned press release from ‘NOAA’ that inappropriately and incorrectly contradicted the NWS forecaster. My understanding is that this intervention to contradict the forecaster was not based on science but on external factors including reputation and appearance, or simply put, political.” Scientists and experts in emergency response harshly criticized NOAA officials for bowing to political pressures and conceding to Trump’s false claims during a weather emergency, when accuracy, messaging, and trust in public safety agencies are vital to keep the public safe.

  • Britain Plans Mass Mobile Phone Alerts to Protect Public from Terrorism, Major Floods and Nuclear Attack

    Britain is planning to introduce US-style mass mobile phone alerts to protect the public against terrorism, major floods and nuclear attack. Supporters of so-called ‘cell broadcasting’ claim the message alerts could have saved lives during major incidents including the London Bridge terrorist attack and Grenfell Tower fire. Senior figures have raised concerns, however, that the messages could be hijacked by hackers or malicious foreign powers to induce mass panic.

  • Hostile Social Manipulation by Russia and China: A Growing, Poorly Understood Threat

    With the role of information warfare in global strategic competition becoming much more apparent, a new report delves into better defining and understanding the challenge facing the United States by focusing on the hostile social manipulation activities of the two leading users of such techniques: Russia and China.

  • The Truth About Conspiracy Theories

    Conspiracy theories have been around for hundreds of years, but with the rise of the internet, the speed with which they spread has accelerated and their power has grown. But do they work, who believes them, and why? What kind of damage can they do—and how can we do a better job of controlling that damage, as individuals and as a society? Tufts University’s Kelly M. Greenhill says that the answers are complicated—but with misinformation proliferating and mutating like a virus, and the health of civil society and democratic governance at stake, it’s crucial to try to address them and contain them.

  • Disinformation Is Catalyzing the Spread of Authoritarianism Worldwide

    There’s a segment of the American left that believes we’re in no position to be outraged over Russia’s multifaceted campaign to swing the 2016 election to Trump because the U.S. has meddled in its share of elections in other countries. Setting aside the fact that this is a prime example of the tu quoque fallacy, it ignores the specific context of that intervention. Joshua Holland writes in Raw Story that this is not about the U.S. alone. “As I wrote for The Nation in 2017, long before Trump descended on that gaudy golden escalator to announce his candidacy…, Russia had honed its tactics in Estonia, followed soon after by attempts, with varying degrees of success, to disrupt the domestic politics of Georgia, Kyrgyzstan, Kazakhstan, Finland, Bosnia and Macedonia.” It also isn’t about Russia. “As the New York Times reported earlier this year, researchers have ‘discovered numerous copycats, particularly on the far right. Those groups often echo Kremlin talking points, making it difficult to discern the lines between Russian propaganda, far-right disinformation and genuine political debate,’” Holland writes.

  • A College Reading List for the Post-Truth Era

    “We live in a time beset with belittlement of science, hostility toward expertise and attacks on traditional democratic institutions,” Michael T. Nietzel, president emeritus of Missouri State University, writes. “It’s a post-truth period where conspiracy theories and crackpot ideas flourish. If the facts conflict with someone’s sense of identity or political ideology, then the facts are disposable. They can be replaced with notions that feel better or reverberate on social media.” What is the best way to make young students less susceptible to dangerous stupidities and toxic conspiracy theories? Nietzel has a suggestion — although he admits it is increasingly rare as an academic expectation: serious reading. He offers seven recent books which champion reason over emotion, distinguish facts from fallacies, and enumerate the dangers of ignoring the truth.

  • Instagram's New Fact-Checking Tool May Have Limited Impact on Disinformation

    Researchers worry that a new feature giving Instagram users the power to flag false news on the platform won’t do much to head off efforts to use disinformation to sow political discord in 2020. The role of Instagram in spreading political disinformation took center stage in a pair of Senate reports in December, which highlighted how Russian state operatives used fake accounts on the platforms, masquerading as members of activist groups like Black Lives Matter during and well after the 2016 election.