Terrorism & social media

Studying terrorists' social-media recruiting power in order to negate it

Published 3 July 2015

Last month a United Nations panel asked social-media companies such as Twitter and Facebook to respond to how terrorist groups use their networks to spread propaganda or recruit members with increasing success. As these terrorist groups, such as ISIS or al-Qaeda, evolve their social-media skills, Arizona State University will be part of a team monitoring their advancements and trying to determine how their online actions can be negated.

ASU is leading a group project that has been awarded a Minerva grant to study what types of information go viral online, and what types of actions or responses can halt the spread of viral information.

The Minerva Initiative is a Department of Defense-sponsored, university-based social-science research initiative launched by the secretary of Defense in 2008. It focuses on areas of strategic importance to U.S. national security.

An ASU release reports that this grant will allow the team, which includes researchers from the U.S. Military Academy and Britain’s University of Exeter, to study information cascades (trends in which people set aside their own knowledge or information in favor of cues taken from other people’s actions) as they relate to the social-media posts of terrorist networks.

“The first phase of the project is we are trying to understand what goes viral. The viral (message) is driven by two things: what type of content and what type of network. The right content and the right types of networks are going to resonate and spread and maybe gain new followers,” said Hasan Davulcu, the project’s principal investigator, an associate professor in ASU’s Ira A. Fulton Schools of Engineering, and director of ASU’s Cognitive Information Processing Systems Lab.

Once they understand these information cascades, Davulcu said, the team may be able to determine how to counter viral messages. But, he clarified, the study will not involve developing content to thwart online terrorism. Rather, the team will observe what organic information created by social-media users tends to halt terrorists’ viral content.

“It’s the early detection of what works for them and what works for others opposing them,” Davulcu said.

The team believes images and videos might be some of the more persuasive ways to create partisan passion.

“We are finding pictures to be extremely telling,” Davulcu said. “In fact, we are going to collect tons of photos that circulate online and put them into games so we can figure out what do people understand by the picture.”

These images could be pictures of enemies or adversaries; a photo of Gandhi to illustrate peace; or something as common as a sports star, suggesting action. Studying the use and relationships of these images could provide a lens into the diffusion of various ideologies.

“It is impossible to monitor all of the conversations, so we have to get better at identifying the ones to which we should be paying attention,” said Paulo Shakarian, a team member and director of the Cyber-Socio Intelligent Systems Lab at ASU. “This requires embedding psycho-social models in a logic programming framework that can gather and analyze social networks, specific attributes of individuals and their relationships to others.”

Shakarian understands the issue of terrorism from another perspective: he is a U.S. Army veteran who served two years in Iraq.

“The idea is that if we can understand which of the postings and messages of ISIS have the potential to go viral then we can learn to combat that much better than we do now,” he said.

ASU’s Minerva grant is situated in its Center for the Study of Religion and Conflict, which incubates new research into the complex role of religion in human affairs.

Mark Woodward, associate professor in ASU’s School of Historical, Philosophical and Religious Studies, and Baoxin Li, associate professor in ASU’s School of Computing, Informatics and Decision Systems Engineering, are also part of the Minerva team.

This is ASU’s second Minerva grant. The first was awarded in 2009 to study how to strengthen the voices of the Muslim majority who didn’t condone violence.

Davulcu and Woodward were part of that project as well. If this second one goes well, there could be more in the future.

“The transdisciplinary environment of ASU has really enabled us to bring together faculty in innovative ways,” said Linell Cady, director of the Center for the Study of Religion and Conflict.

“The fact that we have developed two successful Minerva projects is a real testament to the way in which integrating the deep knowledge of the humanities with cutting-edge computer science can produce a whole much greater than the sum of its parts.”