Terrorism lawsuits threaten lawful speech: 2018 in review

By Aaron Mackey

Published 31 December 2018

One of the most important principles underpinning the Internet is that if you say something illegal, you should be held responsible for it—not the owners of the site or service where you said it. That principle has seen many threats this year—not just in federal legislation, but also in a string of civil lawsuits intended to pin liability on online platforms for allegedly providing material support to terrorists.

Several federal trial courts dismissed such suits this year, but some of these cases are on appeal and plaintiffs have filed several new ones. If these suits are successful, they could be detrimental for the Internet: platforms would have little choice but to become much more restrictive in the sorts of speech they allow.

Without definitive rulings that these cases cannot stand under existing law, they continue to threaten the availability of open online forums and Internet users’ ability to access information. That’s why EFF filed legal briefs in 2018 asking two different federal appellate courts to dismiss material support cases against social media platforms.

The good news: So far, courts have been quick to toss out these material support lawsuits, including the U.S. Court of Appeals for the Ninth Circuit, the first federal appellate court to hear one. Although the facts and claims vary, the majority of the cases seek to hold platforms such as Twitter, YouTube, and Facebook liable under the federal Anti-Terrorism Act.

The lawsuits usually claim that by allowing alleged terrorists to use their publishing or messaging services, online platforms provided material support to terrorists or aided and abetted their terrorist activities. A key allegation in many of these lawsuits is that pro-terrorism content posted by particular groups radicalized or inspired the actual perpetrators of the attacks, and that the platforms should therefore be liable for the harm suffered by the victims.

The facts underlying all of these cases are tragic. Most are brought by victims or family members of people who were killed in attacks such as the 2016 Pulse nightclub shooting in Orlando.