TERRORISM

Organizing for Innovation: Lessons from Digital Counterterrorism
This article explores five factors that were key to facilitating innovation in Facebook’s approach to countering the Islamic State—and that I argue generalize well beyond that context: people, organization, legitimacy, tools, and collaboration. It also identifies lessons that can be learned from that experience.
Over the past 30 years, technology companies built the modern internet—and with it a slew of new methods for communication and commerce. In doing so, they also inadvertently constructed new digital terrain for threat actors to exploit. To safeguard the communities and commerce that emerged online, and under significant pressure from governments and civil society, these companies belatedly built mechanisms to identify, disrupt, and deter those threat actors. Collectively, those activities are a key element of what professionals call Trust & Safety.a Trust & Safety is a practice of adversarial adaptation, mediated by technology, that often results in punitive action. And while the actions taken by Trust & Safety teams are not kinetic, the technology, organization, and centrality of technological adaptation necessary for Trust & Safety offer lessons for military leaders now and in the future.
The fundamentally adversarial nature of Trust & Safety drives innovation by attackers and defenders. When I arrived to lead Facebook’s efforts against the Islamic State in the spring of 2016, the prevailing instinct among engineers was to build AI-driven classifiers to find content supporting the group. But I understood how the Islamic State’s propaganda operation functioned, both on and off Facebook. There was a more straightforward, intelligence-driven way to disrupt the group’s formal propaganda operation, which was our initial goal. So, we used vendors to collect emerging Islamic State propaganda on Telegram; established pipelines to triage, label, and hash it quickly; and then were able to detect that propaganda as soon as it was uploaded to a Facebook server.b I asked for entirely new ways to measure operational success—built around time rather than scale—and eventually, we regularly ran that process more quickly than Islamic State supporters could upload the first instance of a piece of propaganda to Facebook.
This was a good, creative win, but it was also only a single blow in a much longer cat-and-mouse game. Predictably, the Islamic State innovated: it sped up its process, edited core material to confound detection tools, and eventually operated on Facebook in more informal ways. The lesson is neither that AI classifiers are too clunky (they are, in fact, very useful) nor that lower-tech solutions produce only partial victories. Rather, it is that technology must fit the mission and that every victory is fleeting against innovative opponents, especially online, where the cost of iterating is low.
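The time-based measure of operational success mentioned earlier can be made concrete as a detection-lag metric: for each piece of propaganda, record the gap between its first upload and its detection, then track the median across pieces. The function names below are illustrative assumptions, not the actual metrics used.

```python
from datetime import datetime
from statistics import median


def detection_lag_seconds(first_upload: datetime, detected: datetime) -> float:
    # Time-to-detect for one piece of propaganda.
    return (detected - first_upload).total_seconds()


def median_time_to_detect(events: list[tuple[datetime, datetime]]) -> float:
    # Success means driving this number toward zero, regardless of how
    # many pieces of content are involved: time, not scale.
    return median(detection_lag_seconds(u, d) for u, d in events)
```

A metric like this rewards exactly the behavior the article describes: running the triage-and-hash process faster than supporters can complete a first upload.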