• Who believes in conspiracies? Research offers a theory

    The Apollo moon landing was staged. The CIA killed JFK. 9/11 was a plot by the U.S. government to justify a war in the Middle East. President Barack Obama was not a natural-born citizen. The massacre at Sandy Hook Elementary School was staged as a pretext for increased gun control. The “deep state” is trying to destroy Donald Trump’s presidency. Conspiracy theories have been cooked up throughout history, but they have become increasingly visible lately, likely due in part to the president of the United States routinely embracing or creating them. What draws people to conspiracy theories? New research suggests that people with certain personality traits and cognitive styles are more likely to believe them.

  • Facing up to truth decay

    “Everyone is entitled to his own opinion, but not to his own facts.” That sentiment, once expressed by Sen. Daniel Patrick Moynihan, seems to be falling out of fashion in America’s current civil discourse. RAND Corporation’s Michael Rich has dubbed this phenomenon “Truth Decay,” and it is the subject of ongoing research designed to explore what is eroding the public’s trust in facts and institutions—and how to stop the trend.

  • Disinformation and fake news on Twitter

    The Knight Foundation has just released a new report, Disinformation, ‘Fake News’, and Influence Campaigns on Twitter, which, among other disturbing findings, shows that despite government actions against those responsible for the misinformation campaigns during the 2016 election, 80 percent of these accounts are still active and still tweeting. Together, they produce about 1 million tweets per day. The study also found that 60 percent of these accounts show evidence of being at least partially run by bots, and many of the bot-run accounts appear to be connected to one another.

  • The Road to Power: Idaho outfit behind rash of racist, anti-Semitic robocalls

    The Road to Power, a white supremacist and anti-Semitic broadcasting outlet based in Sandpoint, Idaho, continues to ramp up its tactic of robocalling communities nationwide with racist, anti-Semitic and bigoted language. The calls, which have targeted communities in California, Idaho, Iowa, Florida and Pennsylvania, seek to exploit current events by disseminating vile, offensive commentary. 

  • Amnesty International toils to tell real videos from fakes

    Increasingly sophisticated artificial-intelligence video tools, like FakeApp, are raising concerns by helping the technically astute create realistic computer-generated videos known as “deepfakes.” A deepfake video can put a person’s face on somebody else’s body, make them say words they never uttered, show them in a place they’ve never been, or even put them at an event that never occurred.

  • How to fight information manipulations: 50 recommendations

    French government think tanks have issued 50 recommendations to combat “information manipulations.” The recommendations are part of an exhaustive new study published by the Center for Analysis, Planning and Strategy (CAPS), attached to the Ministry of Foreign Affairs, and the Institute for Strategic Research of the Military School (IRSEM), attached to the Ministry of the Armed Forces. The study warns that information manipulation, defined as “the intentional and massive distribution of false or biased news for hostile political purposes,” aims to “undermine the foundations of our democracy” and thereby constitutes a threat to national security.

  • Identifying extremists online even before they post dangerous content

    The number and size of online extremist groups using social networks to harass users, recruit new members, and incite violence are rapidly increasing. New research offers a way to identify extremists, such as those associated with the terrorist group ISIS, by monitoring their social media accounts, flagging them even before they post threatening content.
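
    The article does not describe the researchers’ model, so the sketch below is purely illustrative: it assumes (hypothetically) that labeled, account-level signals such as account age, follower counts, and the share of followed accounts already flagged are available, and it trains a generic classifier on synthetic data to score accounts before they post threatening content. The feature names, the data, and the choice of a random-forest classifier are assumptions made for illustration, not the study’s method.

    ```python
    # Illustrative sketch only -- NOT the method from the study described above.
    # Hypothetical setup: each account is described by a handful of numeric
    # features, and a label marks accounts later associated with extremist
    # activity; a classifier then scores new accounts from those features alone.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic feature matrix, one row per account. Hypothetical columns:
    # [account_age_days, followers, following,
    #  share_of_followed_accounts_already_flagged, posts_per_day]
    X = rng.random((1000, 5))
    # Synthetic labels: 1 = account later linked to extremist activity.
    y = (X[:, 3] + 0.2 * rng.standard_normal(1000) > 0.7).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)

    # How well the (toy) model flags at-risk accounts it has not seen.
    print(classification_report(y_test, clf.predict(X_test)))
    ```

    In practice, the hard parts of such an approach are obtaining trustworthy labels and meaningful account features, not the choice of classifier.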

  • Broadcasting the reactionary right on YouTube

    A new report presents data from approximately 65 political influencers across 81 channels to identify the “Alternative Influence Network” (AIN): an alternative media system that adopts the techniques of brand influencers to build audiences and “sell” them political ideology.

  • Beyond deep fakes: Automatically transforming video content into another video's style

    Researchers have created a method that automatically transforms the content of one video into the style of another. For instance, Barack Obama’s style can be transformed into Donald Trump’s. Because the data-driven method does not require human intervention, it can rapidly transform large amounts of video, making it a boon to movie production, as well as to the conversion of black-and-white films to color and to the creation of content for virtual reality experiences.

  • Facebook’s war on fake news is gaining ground

    In the two years since fake news on the Internet became a full-blown crisis, Facebook has taken numerous steps to curb the flow of misinformation on its site. Under intense political pressure, it’s had to put up a fight: At the peak in late 2016, Facebook users shared, liked, or commented on an estimated 200 million false stories in a single month. A new study is shedding light on a key question: Are Facebook’s countermeasures making a difference?

  • Twitter, Facebook face senators again

    The Senate Intelligence Committee is set to hear from two top social media executives today (Wednesday) on what they have been doing to combat the spread of propaganda and disinformation online and how they are prepared to help secure the integrity of upcoming elections. The committee will hear from Twitter cofounder and CEO Jack Dorsey and Facebook COO Sheryl Sandberg, but one chair, reserved for Google cofounder Larry Page, may remain empty. The committee extended its invitation to Google CEO Sundar Pichai as well as to Page, who is now CEO of Google’s parent company, Alphabet, but Google wanted to send senior vice president Kent Walker instead. The committee has made clear it is not interested in hearing from Walker.

  • Confused about who to believe when information clashes? Our research may help

    By Eva M Krockow, Andrew M Colman, and Briony Pulford

    “Just remember, what you are seeing and what you are reading is not what’s happening,” Donald Trump, the president of the United States, once said at a rally. There is no doubt that we have entered a new age of bewilderment in which it is harder than ever before to decide where the truth lies. The rise of social media has generated a cacophony of contradictory information, mixed in with fake news on an industrial scale. Despite this, many people actually consider social media to be more credible and honest than mainstream media.

  • Bots, Russian trolls influenced vaccine discussion on Twitter

    Social media bots and Russian trolls promoted discord and spread false information about vaccines on Twitter, according to new research. Using tactics similar to those at work during the 2016 United States presidential election, these Twitter accounts entered into vaccine debates months before election season was underway.

  • Fake social media followers may derail the booming influencer marketing business

    Celebrities, social media stars, and other online personalities have taken a hit to their credibility in recent months, as millions of their followers have been exposed as fake or bought. This has created a bigger problem for advertisers and consumers, who can no longer trust high follower counts as a measure of influence and credibility.

  • Fake news is not just bad news: It is bad for the bottom line, too

    Note to Mark Zuckerberg: Beware of misinformation. Research makes the case that misinformation is a business risk for social media platforms and proposes informational methods to alleviate the phenomenon of “fake news.” The research also suggests that Facebook users who help expose falsehoods should be compensated.