  • Truth decay: Don’t be fooled by fake images and videos online

    By Hany Farid

    Advances in artificial intelligence have made it easier to create compelling and sophisticated fake images, videos and audio recordings. Meanwhile, misinformation proliferates on social media, and a polarized public may have become accustomed to being fed news that conforms to their worldview. All of this contributes to a climate in which it is increasingly difficult to believe what you see and hear online. There are some things that you can do to protect yourself. As the author of Fake Photos, to be published in August, I’d like to offer a few tips to avoid falling for a hoax.

  • Truth decay: Are Russian trolls saving measles from extinction?

    By Ron Synovitz

    Scientific researchers say Russian social-media trolls who spread discord before the 2016 U.S. presidential election may also have contributed to the 2018 outbreak of measles in Europe that killed 72 people and infected more than 82,000, mostly in Eastern and Southeastern European countries known to have been targeted by Russia-based disinformation campaigns. Experts in the United States and Europe are now working on ways to gauge the impact that Russian troll and bot campaigns have had on the spread of the disease by distributing medical misinformation and raising public doubts about vaccinations.

  • Considered opinion (The Russia connection): Russia is attacking the U.S. system from within

    By Natasha Bertrand

    A new court filing submitted last Wednesday by Special Counsel Robert Mueller shows that a Russian troll farm currently locked in a legal battle over its alleged interference in the 2016 election appeared to wage yet another disinformation campaign late last year—this time targeting Mueller himself. Concord Management and Consulting is accused of funding the troll farm, known as the Internet Research Agency. But someone connected to Concord allegedly manipulated the documents and leaked them to reporters, hoping the documents would make people think that Mueller’s evidence against the troll farm and its owners was flimsy. Natasha Bertrand writes that “The tactic didn’t seem to convince anyone, but it appeared to mark yet another example of Russia exploiting the U.S. justice system to undercut its rivals abroad.”

  • Truth decay: Fake news detector algorithm works better than human screeners

    An algorithm-based system that identifies telltale linguistic cues in fake news stories could provide news aggregator and social media sites like Google News with a new weapon in the fight against misinformation. The researchers who developed the system have demonstrated that it’s comparable to and sometimes better than humans at correctly identifying fake news stories.
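
    The article does not spell out which linguistic cues the system uses, so the following is only a toy sketch of the general idea: score a snippet by the density of a few hypothetical "telltale" cues (sensationalist vocabulary, all-caps words, exclamation marks) and flag stories above a threshold. The word list, threshold, and scoring rule are all invented for illustration, not the researchers' method.

```python
import re

# Hypothetical cue lexicon -- a real system would learn such features from data.
SENSATIONAL_WORDS = {"shocking", "unbelievable", "miracle", "secret", "exposed"}

def cue_score(text: str) -> float:
    """Return a rough linguistic-cue density in [0, 1] for a news snippet."""
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    sensational = sum(1 for w in words if w.lower() in SENSATIONAL_WORDS)
    shouting = sum(1 for w in words if len(w) > 2 and w.isupper())  # ALL-CAPS words
    exclamations = text.count("!")
    cues = sensational + shouting + exclamations
    return min(1.0, cues / len(words))

def looks_fake(text: str, threshold: float = 0.05) -> bool:
    """Flag a snippet whose cue density exceeds an (arbitrary) threshold."""
    return cue_score(text) > threshold

print(looks_fake("SHOCKING secret EXPOSED!!! You won't believe it!"))   # True
print(looks_fake("The committee released its quarterly budget report."))  # False
```

    A production detector would replace the hand-picked lexicon with thousands of learned features, but the shape of the computation — extract cues, score, threshold — is the same.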

  • Truth decay: Peering under the hood of fake-news detectors

    By Rob Matheson

    New work from MIT researchers peers under the hood of an automated fake-news detection system, revealing how machine-learning models catch subtle but consistent differences in the language of factual and false stories. The research also underscores how fake-news detectors should undergo more rigorous testing to be effective for real-world applications.
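
    One simple way to "peer under the hood" of a bag-of-words detector — illustrative only, not the MIT team's technique — is to compare how often each word appears in factual versus false training text, surfacing the words a model would lean on. The two tiny corpora below are invented.

```python
from collections import Counter

# Invented example corpora standing in for labeled training data.
factual = ["officials confirmed the report", "data released by the agency"]
false_ = ["miracle cure doctors hate", "shocking secret they hide"]

def word_freqs(docs):
    """Relative word frequencies across a list of documents."""
    counts = Counter(w for d in docs for w in d.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def top_discriminative(factual_docs, false_docs, k=3, eps=1e-6):
    """Words ranked by how much more often they appear in false stories."""
    f_true, f_false = word_freqs(factual_docs), word_freqs(false_docs)
    vocab = set(f_true) | set(f_false)
    scores = {w: (f_false.get(w, 0) + eps) / (f_true.get(w, 0) + eps)
              for w in vocab}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(top_discriminative(factual, false_))  # words characteristic of false text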

  • Truth decay: Want to squelch fake news? Let the readers take charge

    By Peter Dizikes

    Would you like to rid the internet of false political news stories and misinformation? Then consider using — yes — crowdsourcing. That’s right. A new study co-authored by an MIT professor shows that crowdsourced judgments about the quality of news sources may effectively marginalize false news stories and other kinds of online misinformation.
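
    The mechanism the study points at can be sketched in a few lines: average ordinary readers' trust ratings per outlet, then demote links from outlets whose average falls below a cutoff. The ratings, domains, and cutoff below are all hypothetical, and the real study's aggregation is more careful than a plain mean.

```python
from statistics import mean

# Hypothetical 1-5 trust ratings collected from many readers per outlet.
ratings = {
    "example-wire.com": [5, 4, 5, 4],
    "totally-real-news.biz": [1, 2, 1, 1],
}

def rank_feed(links, cutoff=2.5):
    """Order links by crowd trust score, dropping low-trust outlets.

    Unrated outlets get a neutral score of 3 rather than being excluded.
    """
    scored = [(mean(ratings.get(domain, [3])), domain, url)
              for domain, url in links]
    return [url for score, _, url in sorted(scored, reverse=True)
            if score >= cutoff]

feed = [("totally-real-news.biz", "http://totally-real-news.biz/a"),
        ("example-wire.com", "http://example-wire.com/b")]
print(rank_feed(feed))  # only the trusted outlet's link survives
```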

  • The gathering storm: Russia’s hostile measures threaten Europe: Report

    A new RAND report examines current Russian hostile measures in Europe and forecasts how Russia might threaten Europe using these measures over the next few years. “Whatever the U.S. response, preparation for involvement in a wide range of conflicts can help reduce the risk of mismanagement, miscalculation, and escalation,” the report’s authors say.

  • Terrorism: Kansas anti-Muslim bomb plotters sentenced to long prison terms

    Three members of a far-right militia, who were convicted of plotting to massacre Muslims in southwest Kansas immediately after the November 2016 election, were sentenced Friday to decades in prison. The terrorist plot was foiled after another militia member informed the police. Defense attorneys, in their sentencing memo, vigorously presented what came to be known as The Trump Defense: They argued that Trump’s anti-Muslim rhetoric during the 2016 election made attacks against Muslims appear legitimate. The defense attorneys also argued that the plot architect had been “immersed” in Russian disinformation and far-right propaganda, leading him to believe that if Donald Trump won the election, then-President Barack Obama would declare martial law and not recognize the validity of the election — forcing armed militias to step in to ensure that Trump became president.

  • Truth decay: 2016 Twitter fake news engagement: Highly concentrated and conservative-leaning

    By studying how more than 16,000 American registered voters interacted with fake news sources on Twitter during the 2016 U.S. presidential election, researchers report that engagement with fake news was extremely concentrated. Only a small fraction of Twitter users accounted for the vast majority of fake news exposures and shares, they say, and many of those users were older, conservative, and politically engaged.
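
    The concentration finding is easy to make concrete. Given per-user share counts, compute what fraction of all shares comes from the top 1% of users; the counts below are invented for illustration and are not the study's data.

```python
def top_share_fraction(share_counts, top_pct=0.01):
    """Fraction of total shares produced by the top `top_pct` of users."""
    ranked = sorted(share_counts, reverse=True)
    k = max(1, int(len(ranked) * top_pct))  # size of the top group
    return sum(ranked[:k]) / sum(ranked)

# 1,000 hypothetical users: ten "supersharers" with 500 shares each,
# everyone else with a single share.
counts = [500] * 10 + [1] * 990
print(top_share_fraction(counts))  # the top 1% account for ~83% of shares
```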

  • Privacy: Cloaking location on mobile devices to protect privacy

    We agree to give up some degree of privacy anytime we search Google to find a nearby restaurant or use other location-based apps on our mobile devices. The occasional search may be fine, but researchers say that repeatedly pinpointing our location reveals information about our identity, which may be sold or shared with others. They add that there is a way to limit what companies can glean from location information.
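
    The article does not describe the researchers' exact mechanism, but one well-known cloaking approach is to perturb each reported coordinate with planar Laplace noise (the mechanism behind "geo-indistinguishability"): the radius is gamma-distributed and the direction uniform. The epsilon value and degree units below are assumptions for illustration.

```python
import math
import random

def cloak(lat, lon, epsilon=100.0):
    """Return a noisy (lat, lon) in degrees; larger epsilon means less noise.

    Planar Laplace noise: radius r has density proportional to
    r * exp(-epsilon * r), i.e. Gamma(shape=2, scale=1/epsilon),
    and the direction is uniform on the circle.
    """
    theta = random.uniform(0, 2 * math.pi)       # random direction
    r = random.gammavariate(2, 1 / epsilon)      # planar-Laplace radius
    return lat + r * math.cos(theta), lon + r * math.sin(theta)

true_loc = (42.3601, -71.0589)   # example coordinates: downtown Boston
print(cloak(*true_loc))          # a nearby but deniable reported location
```

    With epsilon=100 in degree units the typical displacement is around 0.02 degrees, roughly a couple of kilometers — enough to hide which building you are in while keeping "restaurants near me" useful.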

  • Privacy: On Facebook and Twitter, even if you don’t have an account, your privacy is at risk

    Individual choice has long been considered a bedrock principle of online privacy. If you don’t want to be on Facebook, you can leave or not sign up in the first place. Then your behavior will be your own private business, right? A new study shows that privacy on social media is like second-hand smoke. It’s controlled by the people around you.

  • The Russia connection: Facebook deletes hundreds of Russian troll pages

    Facebook announced it had shut down more than 360 pages and accounts, some of them tied to the Internet Research Agency (IRA). From the United States to Germany, Facebook has come under immense pressure to combat fake news, disinformation campaigns, and hate speech on its platforms.

  • Extremism: European far-right groups eschew violence to broaden appeal

    More than seventy years after the defeat of Nazi Germany, ethno-nationalist and white supremacist movements in Europe continue to thrive. They include far-right political parties, neo-Nazi movements, and apolitical protest groups. These groups’ outward rejection of violence expands the reach of their message and can increase the potential for radicalization.

  • Hate speech: How we built a tool that detects the strength of Islamophobic hate speech on Twitter

    By Bertie Vidgen and Taha Yasseri

    In a landmark move, a group of MPs recently published a working definition of the term Islamophobia. They defined it as “rooted in racism,” and as “a type of racism that targets expressions of Muslimness or perceived Muslimness.” In our latest working paper, we wanted to better understand the prevalence and severity of such Islamophobic hate speech on social media. Such speech harms targeted victims, creates a sense of fear among Muslim communities, and contravenes fundamental principles of fairness. But we faced a key challenge: while extremely harmful, Islamophobic hate speech is actually quite rare.

  • Terrorism & social media: Terrorism lawsuits threaten lawful speech: 2018 in review

    By Aaron Mackey

    One of the most important principles underpinning the Internet is that if you say something illegal, you should be held responsible for it—not the owners of the site or service where you said it. That principle has seen many threats this year—not just in federal legislation, but also in a string of civil lawsuits intended to pin liability on online platforms for allegedly providing material support to terrorists.