Anne Speckhard & Molly Ellenberg

This article is excerpted in Homeland Security Today.
Since its inception in 2014, ISIS has unleashed an unprecedented social-media recruiting drive on a 24/7 basis in over twenty-one world languages. While ISIS has lost much of the territory it once controlled in Iraq and Syria, along with any claim to running a territorial “Caliphate,” its digital Caliphate continues to operate—recruiting, inspiring, and directing vulnerable individuals to travel to its new battlefields and, perhaps more importantly, to mount homegrown attacks in the West. To date, very little counter-narrative material exists, and most of what does exist is cognitive rather than emotionally impactful in nature. In this Internet-based research, counter-narrative videos produced by the International Center for the Study of Violent Extremism (ICSVE) were focus-tested on a sample of English-speaking ISIS endorsers, promoters, and followers on Facebook. The results of the study highlight the potential for such videos to reach individuals who like, follow, endorse, or promote ISIS materials online. Equally important, because ICSVE counter-narrative videos are intentionally given pro-ISIS names so that those already engaging online with ISIS propaganda will also encounter our counter-narratives and receive a very different message, the authors found that the videos were inadvertently shared and endorsed as ISIS content to their followers. While the relationship between the online intervention and offline behavior could not be established, meaning whether such counter-narratives may have caused ISIS followers to actually abandon or reverse their trajectories into terrorism, this study represents a significant step towards exploring how to battle ISIS in the digital space.
Keywords: Counter-Narrative, Defectors, ICSVE, ISIS, Islamic State, Social Media, Facebook, Digital Caliphate, UK.
ISIS is the most prolific terrorist group to date in terms of Internet-based propaganda and terrorist recruitment. Its propaganda promoters and publicists cast a global net across the digital space—consisting of videos, memes, tweets, and other social media postings—to first ensnare vulnerable individuals and then take full advantage of instant social media feedback mechanisms. When anyone retweets, likes, or endorses their materials, ISIS cadres swarm in and try to seduce such individuals into the group. During the course of conversation aimed at grooming for recruitment, ISIS recruiters find what is missing or hurting in the lives of those they target. Then, they offer quick fixes, such as promises of dignity, purpose, significance, salary, sexual rewards, marriage, adventure, travel, escape from problems, and an important role in building the “utopian Caliphate.” They use whatever works to sway their targets into serving the group’s goals. Today’s jihadi recruiters need only a computer and an Internet connection to recruit, inspire, and direct terrorist attacks—even continents away.
ISIS’ unprecedented social media drive and recruitment on a 24/7 basis in over twenty-one world languages resulted in the journeying of over thirty thousand foreign fighters from more than one hundred countries to Syria and Iraq. Despite ISIS losing control of its territory, namely in Iraq and much of Syria, it continues to be successful in the digital space, escalating its calls for attacks on Western targets and inspiring homegrown and returnee foreign fighter attacks in major cities in the West, such as New York, London, Istanbul, Paris, Brussels, Nice, and Berlin—to name just a few.
Since its territorial losses under military onslaught, ISIS has, according to experts who monitor its digital output, been producing 48 percent less content than in the previous two years.[] The March 2017 ISIS-inspired Parliament attack in London, however, led to a burst of ISIS-related video content. Prior to that attack, and even while under constant siege, ISIS was producing up to five quality recruiting videos per week.[] Even without new production, ISIS can rely on the thousands of propaganda products already created. In this regard, its online seduction is still operational and will likely continue to function far beyond Syria and Iraq.
In 2015, ISIS on average produced 1,000 events per month in a variety of forms, with many depicting idyllic and utopian images of life in the Caliphate, stressing economic activity, law and order, and the ability to follow Islam and the ISIS lifestyle without interference.[] From September to December 2014, more than 46,000 Twitter accounts were utilized by ISIS supporters.[] In 2015, the average Twitter account of an ISIS supporter had over 1,000 followers, a much higher number than that of a “typical” Twitter account.[] ISIS also had a strong presence on Facebook, YouTube, and other common social media platforms until many of these platforms began instituting takedown policies.
Despite monitoring and takedown policies, most social media platforms continue to host ISIS ideologues and recruiters, even if only in fleeting profiles that do not last for long. Likewise, propaganda videos are often not taken down by the sites for weeks, if not months. One expert studying the phenomenon noted that ISIS propaganda videos glorifying their cadres and attacks remained featured on YouTube for months at a time. ICSVE researchers have easily found them as well.[] In this study, over fifty English-speaking Facebook accounts endorsing, promoting, and following ISIS were identified in only a few days of searching. Arguably, ISIS promoters and recruiters hide in plain sight on commonly used social media sites.
With the recent fall of the Caliphate in Iraq and much of Syria, ISIS products hark back to the fact that the group held territory and hold out the hope of doing so again in the future. Hence, ISIS continues to utilize social media accounts to assist in its recruitment endeavors, despite counter responses by Twitter, Facebook, and YouTube since 2014. ISIS-related account suspensions, while important, have not resulted in the total cessation of ISIS recruitment over the Internet. Instead, just as terrorists figured out work-arounds for government hardening of targets—airports, symbols of national significance, and government buildings—morphing their methods in response to counter-measures, they have done the same when it comes to Internet recruiting and proselytizing.[] For instance, ISIS cadres avoided YouTube monitoring by embedding terrorist videos within a photo.[]
Some of the ISIS responses to counter-measures included resorting to encrypted platforms, such as WhatsApp and Telegram, to attract and communicate with followers. Recruiters ensnare vulnerable individuals on non-encrypted platforms, such as Facebook, YouTube, and Twitter, where they follow and endorse ISIS materials, and then draw them further into ISIS by moving to encrypted platforms where they interact, bond further, and are often recruited into terrorist ranks. This practice creates security nightmares for intelligence and law enforcement, who find that ISIS supporters they were following on non-encrypted social media platforms suddenly go “dark.” Law enforcement and intelligence operators can no longer follow their online activities to learn how radicalized they are becoming or whether they represent a serious threat. Therefore, intervening while users are still being seduced on non-encrypted platforms such as Facebook is crucial.
Telegram is one of the platforms ISIS currently favors to spread its propaganda in encrypted chat rooms. It is on this platform that ISIS materials are shared and where recruitment occurs. The Islamic State’s presence on Telegram received extensive media coverage following the use of its official Arabic-language channel to claim credit for the November 2015 attacks in Paris.[] Following the attacks, Telegram began to suspend questionable public channels, although it still does not intercept private communications occurring between terrorist suspects.[] That said, dozens of its public channels serving terrorist aims remain functional. For anyone skilled in monitoring Telegram, it is not difficult to see that ISIS continues to host chat rooms on Telegram, some of which can be easily joined. Indeed, ICSVE researchers routinely monitor Telegram chat rooms that operate with impunity.
As ISIS continues to spread its dangerous ideology over the Internet, recruit others into its ranks, and call for simple, but horrific, homegrown attacks, discrediting the group and its ideology is essential. Despite losing significant swaths of territory, ISIS sleeper cells and active ISIS members continue to operate in war-torn Syria and Iraq, while bringing effective political solutions to the grievances that gave rise to the conflict will likely take years. Unless political grievances are adequately addressed, pockets of ISIS cadres will continue to operate underground and recruit via the Internet both at home and globally.
While interviewing ISIS defectors over the past year in the ISIS Defectors Interviews Project, the authors repeatedly heard that the ISIS plan in the event of a military defeat was for ISIS cadres to shave their beards and go underground.[] Iraqi ISIS cadres interviewed between 2017 and 2018 by ICSVE researchers shared that many of the ISIS leaders in Mosul had fled to Syria and Turkey during the Iraqi military campaign to recapture the city. Indeed, ISIS cadres who fled Syria and Iraq have found possible new safe havens in parts of Libya, the Balkans, Central Asia, Turkey, and Southeast Asia, where they operate under the radar and continue to seduce others into their ranks through both face-to-face and Internet recruiting.
ISIS is likely to continue to exist, albeit severely degraded. The second front in the battle against ISIS is the digital space. Efforts to mount a strong and resolute digital battle against ISIS are long overdue. What is needed for that battle is to delegitimize both the group and its ideology—to Break the ISIS Brand, as we call our project at ICSVE.
There are three types of counter-messaging frameworks or approaches that can be effectively used to counter violent extremism: alternative narratives, government strategic communications, and counter-narratives.[] Usually run by civil society organizations and government entities, alternative narratives emphasize positive stories about democratic values, tolerance, and freedom, among others, without paying attention to terrorists and their cause. Put differently, the goal is to “undercut violent extremist narratives by focusing on what we as a government and society are ‘for,’ rather than ‘against.’”[] The second, government strategic communications, refers to government efforts aimed at clarifying government policies, stances, or actions towards an issue; it also includes public safety and awareness activities. Both government strategic communications and alternative narratives can be useful for preventing terrorism and violent extremism, but they are unlikely to reach those already being seduced by terrorist ideologies and groups, because those already moving along the terrorist trajectory have narrowed their focus to messaging coming only from terrorists. By contrast, counter-narratives target and discredit terrorist groups and their ideologies by deconstructing and demystifying their messages to demonstrate their lies, hypocrisy, and inconsistencies. Therefore, we firmly believe that the best counter-narratives are those coming from disillusioned insiders and that are not labeled as counter-narratives, so they can slip under the radar of those already consuming terrorist materials.
Research shows that narratives in general can significantly impact “beliefs, attitudes, intentions, and behaviors,” giving support to the use of counter-narratives to fight terrorist narratives.[] Davies et al. (2016) maintained that there is significant consensus regarding the impact of “ideological-based narratives” that play a role in increasing radicalization.[] This is not to say that ideological inclinations or tendencies represent the only path through which one comes to act violently, however, meaning we also need to understand the processes and purposes for which ideologies are often adopted. In terms of constructing counter-narratives, four criteria have been identified as important: 1) “revealing incongruities and contradictions in the terrorist narratives and how terrorists act, 2) disrupting analogies between the target narrative and real-world events, 3) disrupting binary themes of the group’s ideology, and 4) advocating an alternative view of the terrorist narrative’s target.”[] Taking into account the specific content of counter-narratives is important, “but research should also take into account why these narratives have resonance for particular individuals.”[]
While these are all valid points to consider, one must also acknowledge that most, if not all, counter-narrative materials produced until very recently have suffered from an almost complete lack of emotional impact. For instance, during the last fifteen years, the UK Home Office commissioned websites and groups to argue against al-Qaeda’s use of “martyrdom” and its calls to militant jihad using Islamic scriptures, logical arguments, and presentations of moderate views of Islam. These efforts, however, fell flat, particularly in the face of al-Qaeda’s use of emotionally evocative pictures, videos, and graphic images invoking the ideas that Islam, Muslims, and Islamic lands are under attack by the West, that jihad is an obligatory duty of all Muslims, and that “martyrdom” missions are called for in Islam.[] Equally important, the first author, as early as 2006, advised UK Prevent to use more emotionally based appeals to match and counter those of al-Qaeda.[]
ISIS’ creative and emotionally evocative use of video, alongside its leverage of social media feedback mechanisms, made counter-narratives based on logical arguments pale in comparison to its own products.[] The first author of this article has been arguing for years that we must take a page out of al-Qaeda’s, and now ISIS’, playbook and use emotionally evocative counter stories that are impactful. Similar to Madison Avenue marketing, we need to target the ISIS brand while creating better narratives, or “brands,” and redirecting vulnerable individuals to embrace them.
The most effective tool to discredit ISIS and their militant jihadi ideology is to raise the voices of disillusioned ISIS cadres themselves.[] Yet, there are many issues in doing so. Former extremists have been used effectively in CVE work and efforts, including undercover infiltration of terrorist groups that resulted in prosecutions and round-ups of terrorist cells and drone kills of terrorist ideologues.[] The FBI has also made use of a member of the Lackawanna Six convicted on terrorism charges to tell his story for purposes of prevention, although only under strict supervision.[] Similarly, the FBI recently used an ISIS defector, also under terrorism charges, in a supervised early intervention with an American youth who was heading into ISIS—tweeting for them.[]
Problems exist, however, in working with defectors and formers, including issues of trust and reliability. Some formers find it stressful to speak about their experiences while others simply refuse. Others are not trusted enough by law enforcement to be used in that capacity as they vacillate in their opinions about the terrorist group they formerly endorsed or belonged to. There are also those who are not psychologically healthy, having been seduced into the group because of needs they hoped the group would meet. Some also suffer from posttraumatic stress and substance abuse disorders after serving in conflict zones inside a horrifically brutal organization, making them unreliable interlocutors and questionable role models.
The safest and most practical option, then, may be to capture defector voices on video rather than using formers for in-person interventions with vulnerable audiences, as video testimonies can be better prepared and controlled. There is no danger of the defector saying anything off message in an edited video, as his or her testimony denouncing ISIS or a terrorist group is already pre-vetted and set. Videos of a defector speaking can also be dispersed widely for face-to-face interventions, as well as Internet-based interventions, whereas an actual defector may have to be supervised by a controller who accompanies him or her. In addition, a defector is limited by time and space and can only be in one place at a time, whereas videos have virtually unlimited reach, can be accessed at any time, and can be translated into multiple languages.
The U.S. State Department announced its support for raising the voices of ISIS defectors against the group in its May 2016 Department of State and USAID Joint Strategy on Countering Violent Extremism:
“The GEC [Global Engagement Center] is coordinating interagency efforts aimed at undermining terrorist messaging. … undertaking strategic campaigns to highlight the hypocrisy of ISIL’s messaging, for example, by amplifying the voices of disillusioned former fighters, family members, and victims of terrorist attacks.” []
Despite the State Department’s and UK Prevent’s ambitious goals and efforts to counter violent extremism and terrorism, the truth is that while ISIS issues thousands of recruiting videos, memes, and tweets on an ongoing basis, there is little compelling online content countering them. What exists is mostly cognitive rather than emotion-based and compelling. What is needed is a strong campaign to first Break the ISIS Brand. Then we must also replace the ISIS call to action with alternative narratives that redirect the frustrations, anger, and passions of those facing emotional, socio-economic, and socio-political problems, and who are angry over geopolitical events that resonate in their own lives, towards nonviolent solutions rather than the solutions ISIS is offering.
The International Center for the Study of Violent Extremism (ICSVE) team has spent the past two years capturing the stories of ISIS defectors, most on video, in its Breaking the ISIS Brand – the ISIS Defectors Interviews Project. Thus far, the first author, along with ICSVE staff, has collected seventy-eight stories of ISIS defectors, returnees, and imprisoned ISIS cadres, and twelve stories of parents of ISIS fighters. These include interviews with defectors and returnees from Western European and Balkan countries, with imprisoned ISIS cadres in Iraq and Kyrgyzstan, and with Syrian ISIS cadres escaping into Turkey.[] From the story of a thirteen-year-old child soldier who watched as young boys beheaded prisoners as part of their induction into ISIS (and likely did the same), and who left because children were being tricked into dying in suicide bombings, to the European ISIS bride who was rescued by her father when ISIS demanded that she give up her baby to the Caliphate before leaving, we strive to showcase the hardships and dangers of living in and serving the Islamic State.
Seventeen short counter-narrative video clips have thus far been produced from the longer videotaped interviews using a process in which the interviews are edited down to their most damaging, denouncing, and derisive content. The Breaking the ISIS Brand video clips are also made with an eye for capturing emotions. The most emotionally compelling material is selected to denounce ISIS. Then equally emotionally evocative video footage and pictures taken from actual ISIS propaganda content are added in to graphically illustrate what the defectors are narrating—to make their stories come alive and to turn ISIS propaganda back on itself. The ISIS defectors and prisoners interviewed are also asked at the end of their interviews to give advice to anyone thinking of joining ISIS, at which point they strongly and emotionally denounce the group and warn others of the dangers and disappointments of joining.
The finished video clips are named with pro-ISIS names and start with an opening screen that mirrors, or is actually taken from, ISIS propaganda materials, so that an individual looking for and viewing ISIS recruiting videos might mistakenly watch an ICSVE video and be surprised to encounter a former ISIS cadre denouncing the group. The video clips are being subtitled in the twenty-one languages that ISIS recruits in, including Albanian, Arabic, English, French, Dutch, German, Malay, Uzbek, and Russian, and once produced, are uploaded to the Internet in hopes of catching the same audiences that ISIS recruits from on a 24/7 basis.
Presently, the ICSVE video clips are uploaded to the ICSVE YouTube channel, where access by the target audience relies largely on chance. The Western press has given them a fair amount of coverage, and evidence from comments and view counts indicates that they are viewed by the target audience. The authors learned of one video that made it to the Facebook page of a Syrian ISIS defector in Turkey and of another that showed up in an ISIS Telegram chat room.
The police in Kyrgyzstan have posted them on their website and used them in CVE prevention work, as have other NGOs. Similarly, Dutch and Belgian police recently trained on their use, and Jordanian religious CVE workers are using them as well. Likewise, the video clips have been focus-tested in face-to-face groups with target audiences in Europe, Jordan, Kyrgyzstan, Kosovo, and the United States, yielding positive results, including opening animated discussions that allowed for a good diagnosis of radicalizing factors among the target audience and showing evidence of turning participants against ISIS. We also focus-tested them with law enforcement, intelligence, social workers, teachers, and CVE practitioners in Iraq, Jordan, Kyrgyzstan, Kosovo, Belgium, the Netherlands, and the United States, also with positive reception as a useful prevention and CVE tool. Lastly, we used two of them with an actual ISIS terrorist in custody in Iraq and found that they deeply influenced the discussion with him, specifically regarding the brutal practices of ISIS.[]
For this study, we decided to see if we could take these offline results of the video clips online. This required finding out whether we could identify ISIS endorsers, followers, and even recruiters on social media and target placement of the videos on their social media accounts. Our guiding research question was as follows: How, and to what effect, can our ICSVE counter-narrative materials affect ISIS endorsers, followers, and promoters on social media?
We already had some evidence that we could reach ISIS followers, endorsers, and those who share ISIS materials, as we had been monitoring ISIS on Telegram channels over the previous year. However, for this study, we primarily went after Facebook accounts, and also had confidence about reaching the target audience based on a recent Facebook study targeting those who openly expressed extremist sentiments online.[] Authors of the study concluded that those wishing to fight against terrorist groups online can reach the same digital audiences that terrorists do to intervene in various ways, including to pitch them counter-narrative materials. Thus, we set out to see if we could reach the same audiences ISIS is reaching on Facebook and pitch them our Breaking the ISIS Brand counter-narrative videos.
Our view was that ISIS is adept at using the feedback mechanisms of social media; therefore, we, too, can learn its methods and use them against it. Social media is a two-way road, meaning that not only can terrorists access social media, but those seeking to counter terrorist messaging can also make use of social media to fight them. Platforms such as Facebook and Twitter are interactive by design, encourage exchange between audience and transmitter, and are dynamic in meeting the needs of a changing target audience.[] We determined that if counter-narrative messages are to engage the same online audiences that ISIS is reaching, the interactive and adaptive nature of social media makes it the best delivery mechanism, ideally including follow-up, in much the same way that ISIS swarms in on vulnerable persons, though in our case with responses that redirect vulnerable persons away from ISIS.
Confident that we could reach the target audience once identified, we determined our goal would be to briefly study their degree of radicalization and online behaviors, and ultimately to learn the effects of placing our video clips on their pages. In other words, the goal was to learn whether our counter-narratives might make any discernible difference in their observable behaviors, and to learn what, if anything, happens.
Our general research design was exploratory in nature given the novel type of research involved. Three approaches in total were used to explore the digital environment on Facebook, each developed in succession from the previous one. Our goal was to identify the most suitable method to counter ISIS’ ideology on Facebook and to learn if our counter-narratives could make a positive impact that we could measure, or at least observe.
Facebook was chosen for three reasons. First, Facebook serves almost 1.5 billion people globally, enabling extensive worldwide outreach. Second, Facebook is a valuable resource for researchers, as it provides a significant amount of personal data that can be utilized for various research projects, such as this one. Lastly, Facebook is a popular social media platform used by terrorist groups, whose so-called “Media Mujahedeen” disseminate ISIS propaganda on a 24/7 basis. Despite takedown policies, groups like ISIS still use fake and fleeting profiles on platforms like Facebook to target potentially vulnerable recruits, realizing that if they do not manage to ensnare them before being taken down, they will have to continue the process via some other digital means, usually an encrypted service. However, platforms like Facebook allow the widest reach for initial encounters; thus terrorists still invest in them despite the challenges. In this regard, Facebook represented a suitable social media platform on which to conduct this intervention.
Expressions of Radicalization: Given that this was an online study, it was difficult to engage in a critical and empirical evaluation to identify with certainty individuals at risk for radicalization or terrorism. In practice, it would be difficult, regardless of the tools used, to effectively assess the level of extremism, or a set of social, psychological, or other predisposing vulnerabilities, of our target audience just by researching online. The true identities of account holders, for instance, were impossible to verify. We were cognizant that law enforcement officials, journalists, and others may post from fake Facebook accounts, posing as radicalized individuals to conduct similar investigative work, and thus not actually represent radicalized persons at all. In addition, a recruiter may have created multiple Facebook accounts to spread ISIS ideology. In this regard, multiple accounts may not necessarily represent many persons, or the persons they appear to be, but may instead represent the same ideologue behind many accounts, be run by a bot, or belong to a counter-terrorism researcher, police officer, or journalist. Accounts may also have been created to impersonate someone following his or her arrest or death. Despite all these possibilities, for this study we decided to treat all Facebook accounts as representing real persons, whether or not they actually were, and to take all expressions of radicalization at face value, our research ethic being that anyone expressing support for ISIS, endorsing ISIS propaganda, or distributing it was unlikely to be harmed by receiving counter-narrative material. In fact, their lives, and the lives of others they might harm by following the terrorist trajectory, might be saved by an intervention such as ours.
As the aforementioned were difficult to verify or deconstruct, for this study we simply took all account holders at face value and assumed that if they were liking, sharing, posting, and endorsing ISIS materials, then they indeed represented radicalized individuals, in a one-to-one ratio. Such materials included:
We found many accounts that openly denounced Western involvement in Iraq and Syria, and such account holders in many cases expressed extremist worldviews. The focus of our research, however, was specific accounts and imagery that clearly expressed support for ISIS and ISIS-related violence. Equally important, while we found a considerable number of accounts communicating support for the terrorist group, this does not necessarily mean that such individuals will resort to violence. The path towards terrorist violence is more complex, individual-specific, and requires a variety of triggers, yet we deemed openly communicated support for ISIS enough to justify a counter-narrative intervention. While it is often difficult to detect when an online persona shifts his or her position towards supporting ISIS—that is, how individuals exhibit radicalized behavior on social media—we also considered whether our target users shared material from known pro-ISIS accounts. Lastly, we attempted to distinguish between serious ISIS supporters and endorsers and casual chatters, often referred to as “white noise,” that is, irrelevant and extraneous information collected by simply browsing the Internet.
Based on their postings, including pictures that appeared to locate them in the UK, a majority of the Facebook users appeared to be residing and operating out of the UK. That said, it is difficult to establish the national origin of account holders; we therefore identified such accounts on the basis of the language used (i.e., English). We could not specify the country of origin with certainty, except in some cases where other available information could be drawn from the accounts (e.g., by looking at their network lists). Given that our researchers relied on a snowball sampling technique, they first followed original accounts, then their networks of followers, and so on. This is not to say that our snowball technique captured the entire network of UK-based ISIS supporters and endorsers on Facebook, but it did capture a sizeable enough sample for this research.
To ensure that we reached our intended audience, the authors relied on a stringent research design, outlined further below, and on expert verification. The following indicators were used to measure the success of our intended objective:
Desired but not verified indicators included:
Arguably, it may be difficult to ascertain that the aforementioned indicators serve as a measure of success for our online counter-narrative intervention. However, one could argue that such indicators can be taken as a measure of success given our main objective—that is, to assess the extent to which our counter-narratives resonate with our target audience and cause them to be disgusted by ISIS. Another measure of success is whether our video materials reached individuals who are further along the radicalization trajectory rather than casual chatterers or interlocutors simply discussing the war in Iraq or Syria or contentious social, political, or religious issues. If these were serious ISIS supporters and promoters, any engagement with our video materials on their part can be considered a measure of success, given that at that point on the terrorist trajectory they will most likely have narrowed their focus to material coming only from ISIS.
Sampling was conducted by setting up several anonymized Facebook accounts and identifying English-speaking radicalized individuals through Facebook's search mechanisms. This was achieved by searching for certain keywords, such as "servant of Allah," "soldier of Allah," "khilafah [caliphate]," etc., and then befriending the accounts that surfaced. A snowball sampling method was used once initial accounts were identified: when a "radical" account was found, its friend list was searched for similar accounts, which were likewise added to the sample.
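The keyword-seeded snowball procedure described above can be sketched in Python. This is purely illustrative: data collection in the study was performed manually through the Facebook interface, and the callables here (`search_accounts`, `get_friends`, `is_radicalized`) are hypothetical stand-ins for those manual steps, not a real Facebook API.

```python
from collections import deque

SEED_KEYWORDS = ["servant of Allah", "soldier of Allah", "khilafah"]

def snowball_sample(search_accounts, get_friends, is_radicalized, max_size=50):
    """Breadth-first snowball sample: seed with keyword search hits,
    then traverse each included account's friend list for further
    candidates. All three callables are caller-supplied placeholders
    for steps the researchers performed by hand."""
    queue = deque()
    for kw in SEED_KEYWORDS:
        queue.extend(search_accounts(kw))       # initial keyword search
    sample, seen = [], set()
    while queue and len(sample) < max_size:
        account = queue.popleft()
        if account in seen:
            continue
        seen.add(account)
        if is_radicalized(account):             # inclusion-criteria check
            sample.append(account)
            queue.extend(get_friends(account))  # expand via friend list
    return sample
```

As the text notes, such a traversal does not guarantee coverage of the entire network; it only grows the sample outward from whatever the seed keywords happen to surface.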
Criteria for sample inclusion included posting and endorsing ISIS materials. These criteria were used to decide whether an account represented a "radicalized" individual applicable to the study and to rate the degree of radicalization. As the identities of account holders were impossible to verify, as mentioned previously, we simply took all account holders at face value and assumed that if they were liking, sharing, posting, and endorsing ISIS materials, they indeed represented radicalized individuals, with each account treated as one individual.
All of the Facebook users admitted into the sample were operating in the English language, and most, based on their postings, appeared to reside in the U.K.
As with all our research at ICSVE, our guiding ethical principle is to do no harm. In this particular study, we reasoned that anyone expressing support for ISIS, endorsing ISIS propaganda, or distributing it was unlikely to be harmed by receiving counter-narrative material; in fact, their life, and the lives of others they might harm by following the terrorist trajectory, might be saved by an intervention such as ours. We thus considered ourselves justified in following our research design.
Likewise, the research was conducted with careful consideration of ethical concerns related to data collection, protection of research participants' identities, and data storage. Our research design was fully rooted in ethical guidelines for academic research, including guidelines on Internet research with vulnerable and extremist individuals online.
To ensure the confidentiality of our research participants, we did not collect categories of personal information (e.g., date of birth, alleged commission of a crime, proceedings for offences committed, related court sentences, etc.) that would be considered sensitive personal information under law (e.g., in the EU). We also excluded personally identifiable information on our research participants' social networks, as it would make it easier to identify them, even those who used pseudonyms in their accounts. In this regard, we focused only on the general picture, that is, on the ecology of the collective behavior of our research participants as opposed to individuals. Our focus remained on identifying information relevant to our central research questions and assessing the overall impact of our counter-narrative materials on ISIS-endorsing and ISIS-supporting individuals identified on Facebook.
Given the importance of evidence-based research on CVE, using anonymized accounts was unavoidable, as highly radicalized individuals are highly unlikely to accept friend requests from openly identified researchers. It was also necessary to protect the identity of the researchers when dealing with highly radicalized individuals who publicly endorse ISIS and the use of violence, as in some cases these account holders are dangerous and may even be currently fighting abroad with ISIS or posing threats locally.
Despite the fact that our intervention took place in an online environment, the research involved a considerable amount of risk. To minimize potential risks, we established pseudonyms and anonymized Facebook accounts to gain access to our target group and worked with a virtual private network that allowed us to 1) reach out to our target group without making them aware of our intention and 2) minimize the potential risk to the researchers. That said, because the researchers' names may become known to some of the subjects studied following publication of the findings, there is always a risk of being exposed to physical danger in real life.
We also considered the legal ramifications of engaging in online research and the potential to violate the counterterrorism laws of the countries our research participants came from. To address this, we developed research protocols that ensured a significant degree of protection. For example, we only uploaded our videos to the respective Facebook accounts and avoided dialogue with our participants, which could potentially have led to legal issues. In this regard, we made sure that our infiltration of Facebook account spaces did not constitute entrapment. Moreover, given that law enforcement might have been active on Facebook seeking actionable intelligence, we collected only data deemed critical to our research, in an effort to avoid interfering with any ongoing law enforcement activities and investigations online. Equally important, our research was conducted over a relatively short period of time, and we avoided extended contact with our participants to avoid potential legal or ethical implications (UNODC, 2012; Stern, 2003; Davis et al., 2010; Markham & Buchanan, 2012; BPS, 2014).
We were also confident that our counter-narrative interventions would not be mistaken by law enforcement for ISIS-related activity, as our counter-narrative videos and online interventions are known to law enforcement officials from the EU and FBI. We regularly train with such entities and discuss our research results.
In addition to the aforementioned measures introduced to facilitate data collection and ensure an ethical research process without placing us, as researchers, or our research participants at risk, we also ensured that our data were stored properly and that no unauthorized persons outside the research team had access to them. We likewise ensured the long-term storage and security of the data collected (e.g., password-protected files, data stored on secured servers, etc.).
Our research design comprised three distinct components, as we found ourselves continually having to adapt our approach in response to ISIS counter-moves.
The first approach was based on a stringent research design that allowed for testing whether ISIS defector video clips might have any impact on already radicalized individuals. This approach comprised four steps. The first step involved setting up four anonymized Facebook accounts (two male and two female) and identifying radicalized individuals by searching for certain keywords on Facebook, such as "servant of Allah," "soldier of Allah," "khilafah [caliphate]," etc. The four accounts comprised two observing and two intervening accounts. The observing accounts were meant to operate "under the radar," meaning that no posts, shares, comments, or likes were made that were directly related to the sample. The intervening accounts were solely responsible for doing the exact opposite: gaining attention (increasing the likelihood of targets watching our video) and posting the counter-narrative. Some of the criteria mentioned in the preceding section, including the posting and endorsing of ISIS materials, were used to decide whether an account represented a radicalized individual. These criteria were applied to the accounts that came up in searches to create the sample under this design.
Second, to measure the impact of the intervention, a radicalization matrix was developed to determine the "degree of radicalization" within the sample. Posts, comments, shares, and likes were used as metrics. If a post, comment, share, or like endorsed ISIS's ideology or actions, the metric was coded as 1 ("yes"); otherwise it was coded as 0 ("no"). In addition, a confidence metric was established to measure the researcher's confidence in the rating. The confidence metric was based solely on the quantity of evidence of such incidents of liking, posting, and sharing ISIS materials (all captured in screenshots): the more evidence, the higher the confidence, and vice versa.
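The binary coding scheme and confidence metric can be illustrated with a minimal sketch. Note that the coding in the study was done by hand from screenshots, and the numeric cut-offs below are our own hypothetical choices for illustration: the study states only that more evidence meant higher confidence, without specifying thresholds.

```python
def code_account(events):
    """Code one account from observed events, where each event is a
    (kind, endorses_isis) pair and kind is 'post', 'comment', 'share',
    or 'like'. Endorsement is coded 1 ("yes"), otherwise 0 ("no")."""
    codes = [1 if endorses else 0 for _kind, endorses in events]
    evidence = sum(codes)  # count of screenshot-captured endorsements
    # Hypothetical cut-offs, chosen only to make the sketch concrete;
    # the paper does not report specific thresholds.
    if evidence >= 5:
        degree, confidence = "high", "high"
    elif evidence >= 2:
        degree, confidence = "medium", "medium"
    else:
        degree, confidence = "low", "low"
    return {"codes": codes, "evidence": evidence,
            "degree": degree, "confidence": confidence}
```

Because the confidence rating depends only on the volume of captured evidence, an account with few public interactions would be rated low-confidence even if its few interactions all endorsed ISIS.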
The third step involved the intervention itself, planned as two uploads of ISIS defector counter-narrative video clips under the following titles: The Glorious Cubs of the Caliphate and A Sex Slave as a Gift for You from Abu Bakr al-Baghdadi. Additionally, the sample was tagged in the video to increase the likelihood that it would be watched.
The last step involved an analysis of the intervention. In this approach, we learned that it was relatively easy, in less than a week's time, to assemble a sample of nearly fifty English-speaking Facebook accounts that openly endorsed, followed, and shared ISIS propaganda, i.e., a sample of those we treated as actual radicalized individuals, whether or not they in fact were. We were able to befriend and observe these accounts relatively easily. However, the moment we started our interventions with some of these users, our accounts were quickly disabled: users unfriended us, and the radicalized account holders possibly reported our accounts to Facebook as fake.
The second approach was a close repeat of the first, but more focused on testing the defector video clips with highly committed and radicalized individuals who were selected on the basis of already being members of a closed ISIS-following and ISIS-endorsing Facebook group. One of our original four anonymized accounts had managed to penetrate this group, and by friending it, our fourth account was also allowed in. By that point, the other two accounts had been disabled in our first foray.
This second approach, inside a closed ISIS-supporting Facebook group, allowed us to test the results and potential impact of posting the defector video clips in the "echo chamber" of account holders who were already quite serious in their endorsement of ISIS. We expected them to be unlikely to be shaken by any counter-narrative materials, given their commitment to ISIS.
The closed Facebook group in this second intervention comprised approximately 436 individuals and was led by eight admins in total. The group served several purposes for its members: providing a live news feed of events in Syria and Iraq, glorifying ISIS soldiers and "martyrs," and disseminating propaganda material against ISIS's Western enemies. This closed group likely represents individuals who have already narrowed their focus and media consumption solely to providers of terrorist-supporting news and ideology, individuals who may no longer be open to other ideas. They provided an ideal group in which to test our videos, which carry pro-ISIS names and start with an opener that mimics ISIS but then feature a defector denouncing the group. We name our videos with pro-ISIS names so that those already engaging online with ISIS propaganda will be likely to encounter our counter-narratives as well and receive a very different message. The idea was that intervening in this manner may be the only thing that can still reach them.
Generally speaking, this approach was quite similar to the first. However, it did not require a pre-analysis, as we were already confident that the account holders were radicalized given that they had joined this group (unless they were security professionals or journalists posing as radicalized individuals); we were thus only interested in their reactions to the posted defector video, via a post-analysis of their responses.
Unfortunately, in this second case, it is not clear whether the group admins expelled the anonymized account or Facebook shut down the whole group, as the group has since disappeared. As predicted, we found that the ISIS-like names and accompanying ISIS-looking thumbnails of our counter-narrative videos led some of the heavily radicalized individuals to mistake our counter-narratives for actual ISIS content and to share or endorse our materials without realizing that a defector was denouncing ISIS. In fact, to post the video, one of the eight admins had to approve it; this approval process ensured that no one could post content violating the group's "policy." Nevertheless, the defector video was successfully uploaded into the group and, most importantly, approved by one of the admins. We found this an important validation of this type of work.
In either case, regardless of whether the group admins or Facebook removed us from the group, there were two interesting findings. First, some in the group initially approved of the counter-narrative video, liking and sharing it, but afterward realized what it was and denounced it. Second, we were able both to penetrate an ISIS echo chamber and to reach its members with our counter-narrative materials.
We designed the last approach to target so-called "fence-sitters." Fence-sitters were defined as those who not only "Like" ISIS-affiliated Facebook pages but also "Like" pages that represent and/or disseminate liberal and democratic views. Judging by their liking history, they appear not to have totally narrowed their focus of news consumption or friends to ISIS-related groups only, and thus would possibly be more amenable to an intervention to turn them back from ISIS. This was also done with the purpose of avoiding confirmation bias in our research.
The set-up of this approach was as follows. First, four new anonymized Facebook accounts were created (two male and two female), allowing the researchers to befriend accounts of interest while protecting the researchers' identities and providing enough anonymized accounts to keep observing even after one account was unfriended. Second, using various social media intelligence techniques, fence-sitters were identified and friended from each of the four anonymized accounts.
Using four accounts was necessary, as our two previous research efforts demonstrated that anonymized Facebook accounts were short-lived, possibly due to being reported by others to Facebook, ultimately resulting in disabled accounts. As in our previous approaches, we split our anonymized accounts evenly by gender. Having learned from the first approach that some account holders accept friend requests only from women, and others only from men, we used both genders to overcome this challenge. Lastly, we were curious whether some in the sample might react differently to the anonymized female account than to the anonymized male account, even though the same post was made. Put differently, this design allowed us to examine whether gender has an impact when posting counter-narratives, given that both genders were used.
Once a representative sample was gathered, one post per day disseminating the counter-narrative materials was made. The first two posts were short ISIS defector video clips, each showcasing a first-person account of why the defector joined ISIS, what he experienced there, and why he eventually defected, along with a statement denouncing the group as corrupt and non-Islamic. The third, fourth, and fifth posts were made up of Internet memes/posters that paired a statement from the interview transcripts with a picture of the defector. These can be seen below.
The immediate reactions of the sample allowed us to examine the effectiveness of these counter-narratives by looking at likes (including emotion reactions), shares, views, and comments after each post was made.
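The tallying of immediate reactions can be sketched as a simple aggregation. This, too, is illustrative only: reactions in the study were recorded manually from the Facebook interface, and the field names here are our own.

```python
from collections import Counter

def summarize_reactions(reactions):
    """Tally immediate reactions to a counter-narrative post.
    `reactions` is a list of (user, kind) pairs, where kind is one of
    'like', 'share', 'view', 'comment', or an emotion reaction such as
    'sad' or 'angry'. Returns per-kind counts and the number of
    distinct users who reacted at all."""
    by_kind = Counter(kind for _user, kind in reactions)
    unique_users = len({user for user, _kind in reactions})
    return {"by_kind": dict(by_kind), "unique_users": unique_users}
```

Counting distinct users separately from raw reaction counts matters here, since a single account liking, sharing, and commenting would otherwise inflate the apparent reach of a post.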
The main objective of this set of online interventions (as well as a parallel effort undertaken with Albanian Facebook account holders) []was to learn the extent to which online counter-narratives could be targeted to radicalized individuals, and whether they would have any effect. We learned that we could not only easily discover and communicate with those whose online behaviors led us to judge them to be radicalized Facebook account holders, but could also, in the very short term, target them with our counter-narrative materials, that is, until they unfriended us, had our accounts disabled, or both.
In the first two rounds of research, four anonymized accounts were used. In these rounds, we found that as soon as ISIS supporters understood that our accounts were there to discredit ISIS, they very quickly unfriended us in the majority of cases. They also called us names and made threatening remarks, and two of our accounts were disabled, possibly after being reported to Facebook by ISIS supporters as fake accounts. Without protection of our accounts by Facebook, we could not follow the longer-term impact of our counter-narrative materials, as we lost access to the data in our disabled accounts.
In this study, one anonymized "male" account was disabled by Facebook right at the beginning of the study. The first "female" account remained online for approximately two weeks. This account befriended about 42 individuals in the target group, allowing enough observation time for analysis of thirty of those profiles, including posts, comments, shares, and likes. Overall, our sample of those we judged to be radicalized Facebook account holders had 1,104 friends on average, with a minimum of 0 and a maximum of 3,718. It is important to note, however, that the exact average remains unknown, as some accounts did not disclose the number of friends or displayed only the number of followers. Furthermore, based on the researcher's confidence ratings, twenty accounts were classified as highly radicalized, five as "medium" radicalized, and the remaining five as "low" radicalized, with correspondingly medium or low confidence. The sample covered countries including the UK, U.S., Indonesia, Somalia, the Philippines, and Jamaica. It is important to note that most of the account holders tried to hide their nationality and location. However, some of them verified their accounts with their phone numbers, which could hint at their whereabouts. Furthermore, most of the accounts did not provide any details about marital status, education, country, or place of residence.
In this sample, we accessed thirty accounts in total (all English-speaking). Based on the information we could glean from their accounts and postings, seven were judged to be from the UK, four from the United States, one each from Belgium, Somalia, India, and Libya, and two from the Philippines; thirteen could not be determined, as no hints were given. Our observations of those thirty profiles are discussed further below.
The last remaining "male" account was used to gain access to the highly radicalized social network, the ISIS "echo chamber" described above. This account managed to gain access to a closed group that appeared to be a "live news feed" aiming to inform ISIS supporters, on an hourly basis, about everything concerning ISIS. As the second account could not be recovered, all data collected regarding the intervention with the first sample were lost. This particular closed Facebook group was used as an experimental group for testing the ISIS defector video clip (A Sex Slave as a Gift for You from Abu Bakr al-Baghdadi).
To overcome the problem of a ban from the group, the fourth remaining anonymized account (female) was added to the closed group via the third anonymized account. Posting the video on the third anonymized account's timeline, and also in the group, resulted in an attempt to hack the anonymized account: a Facebook notification was sent stating that an attempt had been made to compromise it. To thwart future hacking attempts, the password was changed to a strong combination of letters, numbers, and symbols. Overall, the third and fourth anonymized accounts remained in place for the duration of the observation period of approximately two weeks.
With all eight accounts and the three different approaches, we learned that it is possible not only to discover radicalized profiles of ISIS supporters and promoters on Facebook but also to infect them with counter-narratives. Two of our anonymized accounts were able to penetrate closed Facebook groups following and propagating ISIS-related materials and to infect their echo chambers of watching and listening only to ISIS-related materials by inserting the defector videos and counter-narrative memes. We also learned that our videos were clicked on, watched, shared, and even endorsed due to their pro-ISIS names. However, as our accounts were disabled, we were not able to discern, except in the closed group, whether they had any positive effect. Learning this would require using multiple friended anonymized accounts or protection from Facebook against account disabling.
Concerning the first approach, the first video was posted at 6 pm (GMT), tagging a sample of 43 individuals. Of those 43 individuals, 30 profiles were subjected to a critical pre-analysis. More specifically, their profiles, posts, comments, shares, and likes were analyzed for the purpose of finding links to ISIS content endorsing its beliefs and actions. Of those thirty accounts, nearly all had at least one link to ISIS-related content on Facebook or other accounts. Some of the profiles tried to disguise their affiliation by not mentioning the words ISIS, caliphate, or khilafah; however, an analysis of their likes and social networks yielded different results. In other words, most of the profiles were considered, with high confidence, to be highly radicalized.
Analyzing the immediate reactions of Facebook users yielded two results. First, many users probably watched only the beginning of the video clip, not the rest of it. Seven Facebook users liked the video. These users were deemed highly radicalized and showed a supportive attitude towards ISIS. Given that the video clip conveys the exact opposite, it can be assumed that those seven users did not watch the entire video. In this regard, a female user commented to another user: "And if the police come knocking, where do I say I was with you?" This statement underlines the assumption that the sample did not watch the complete video clip but rather assumed from its title that it was ISIS content, and liked it for that reason. This confirms that naming the counter-narrative clips with pro-ISIS names is a good strategy. The user who commented on the video also unfriended the anonymized account when she discovered her mistake in endorsing it.
As in the first approach, it seems that the videos were not watched in their entirety by highly radicalized and committed individuals, although we are not certain on that score. This became especially apparent in the closed news-feed group that was infiltrated by the two anonymized accounts. To post the video, one of the eight admins had to approve it; this is basically done to ensure that no one can post content that violates the group's "policy," or in this case, its extremist beliefs. Nevertheless, the defector video was successfully uploaded into the group and, most importantly, approved by one of the admins, again confirming the strategy of using pro-ISIS names on counter-narratives to reach the target audience. It is ironic that the admin approved a video that directly contradicts what the group was set up for.
The third approach, with fence-sitters, yielded different results. People were more likely to watch the whole video compared to the other two approaches. This became apparent through the comments made and emotions expressed (e.g., sad or angry faces). Comments were made criticizing the authenticity of the video, the first instance in our global focus-testing in which the authenticity of the defectors appearing in the video was questioned. Arguably, this implies that the video had been watched, including the parts in which the defector denounces ISIS, in contrast to the first two approaches, in which account holders simply liked and shared the video. When the video is watched in its entirety, questioning its authenticity may be the only way to attack it, as the content is too damning if the speaker is authentic.
Overall, three observations were made. First, the use of both male and female accounts is necessary, as some female account holders clearly state in their bios that they only accept requests from "sisters," with the same holding true for males who only accept other males as Facebook friends. We assume this reflects adherence to conservative Islamic principles of modesty and propriety. We respected these notices and did not test whether these account holders actually do as they say.
Second, posting from male accounts yielded fewer responses from those receiving the post compared to female accounts. With male accounts, videos were simply liked and shared, apparently without even being watched (approaches one and two). On the other hand, when posting from a female account, comments such as "You have nothing to do with Islam […] If you think hijab is not fardh [Islamic obligation] then you become five stars kafirah [unbeliever]" were made. In this respect, it is important to mention that the profile picture used depicted a young girl from the back who was not wearing a hijab, which opened the possibility of attack. It could be argued that the comment directly targeted the perceived failure to comply with the commenter's religious beliefs. This also implies that posting from a male account is much simpler in terms of disseminating counter-narratives, as it avoids this gender-based problem. Lastly, using a female account yielded more reactions in terms of likes and views than using a male account. In this regard, it is also important to mention that we received approximately forty message requests to open and start a conversation; we believe these messages were mostly flirtatious in nature. We declined to enter into conversations in this study. Nevertheless, female accounts were more effective than male accounts in disseminating counter-narratives, as measured by likes and views. We did not test whether women talking to women, or women talking to men, made a difference, but we intend to do so in future studies.
This study begins a broader attempt to learn whether ISIS defector counter-narrative videos and Internet memes/posters can be used to reach ISIS-endorsing, -following, and -sharing social media accounts and, if so, whether some who might fall prey to the group could be turned back from continuing down the terrorist trajectory. The goal of this study was exploratory in nature: to learn whether ISIS endorsers could even be identified on Facebook, whether they could be reached, and how they would respond to counter-narrative materials. This is one of the first online focus-group studies conducted to begin exploring how specific counter-narrative materials might affect Facebook accounts likely representing already radicalized individuals. While we learned that we could infect ISIS-dominated digital territory, we also learned that ISIS supporters could take counter-measures against our interventions and shut our accounts down, unless we received some sort of protection from Facebook or were willing as researchers to use non-anonymized accounts, risking our own safety and security to reach them. We also learned that targeted posts were liked, shared, and watched, at least partially.
Future focus groups will continue to focus on different nationalities and language groups, as well as build upon what we learned in this study. Our hypothesis was that we could affect radicalized individuals with the stories of real-life ISIS defectors, who speak from personal experience about what life is really like as a fighter for the Islamic Caliphate, and that these stories are the most influential means of dissuading others from falling captive to its message. []
Our research revealed that our counter-narratives resonate with our target audience. They also reached individuals who are further along the radicalization trajectory. If our Facebook target audience were serious ISIS supporters and promoters, any engagement with our video materials on their part can be considered a measure of success, given that at that point on the terrorist trajectory they will have narrowed their focus to material coming only from ISIS. While we established a certain measure of success in terms of our intended objectives, we do not yet know exactly how the materials were received, as we could not observe the long-term effects. It does appear that while we could entice ISIS endorsers into opening, and even endorsing and sharing, our counter-narrative videos due to their pro-ISIS names and appearance of coming from ISIS, we do not know whether the videos changed any minds or any online or offline behaviors. It may be that those who are already deep into ISIS ideology, particularly those in prison or in ISIS territory, are unlikely to be swayed by these videos. That said, those just dabbling in ISIS may be able to be turned back; given that we found that many of the friends of those we intervened with also clicked on our videos, this is an important consideration. We do know from other research carried out by ICSVE that the counter-narrative videos deeply affected an ISIS emir in prison who watched them, so we cannot rule out that highly committed ISIS cadres might also be affected when the videos reach them via the Internet.[] Clearly, more research is warranted in this respect.
From other research, we know that there have been reversals of forward motion on the terrorist trajectory through Internet-based interventions. [] Our study in many ways replicated those findings, showing that such individuals can be reached. However, the online effects of our materials are still not well understood, due to the difficulty we encountered in continuing to observe the results of posting these materials on the accounts of ISIS endorsers. Our goal is to continue producing and testing ISIS defector counter-narratives both on- and offline. Given the importance of pulling individuals back from their movement through radicalization to actually carrying out violent extremism, we believe this study proves an important first point: that this target audience can be reached. Moreover, our online intervention, and similarly oriented studies, represents a unique opportunity to reach out to individuals who might not otherwise seek help, or who might seek help in all the wrong places. We also remain hopeful that our online interventions could lead to creative ways of reaching and redirecting those already on the extremist trajectory back off of it.
Once we find the right combination of counter-narrative materials and distribution channels, we can begin to halt movement toward violent extremism in some cases and add one more tool to the toolbox for fighting violent extremism and for halting and reversing movement along the terrorist trajectory. That said, it is also important to note that simply breaking the ISIS brand is not enough. Individuals resonate with terrorist ideologies and join terrorist groups out of serious socio-economic and political concerns and real emotional needs. Unless we begin addressing the underlying psycho-social and political issues that push and pull individuals toward terrorist movements and violent solutions to contentious issues, we will not succeed in significantly reducing terrorism. Our Breaking the ISIS Brand counter-narrative videos, which discredit the group and its ideology, are a useful first step toward redirecting individuals to address their concerns without resorting to terrorism or violence. This study was a successful first attempt to begin the battle against the ISIS digital Caliphate and, ultimately, to break the ISIS brand.
Anne Speckhard, Ph.D., is an adjunct associate professor of psychiatry at Georgetown University School of Medicine and Director of the International Center for the Study of Violent Extremism (ICSVE). She has interviewed over 600 terrorists, their family members, and their supporters in various parts of the world, including Western Europe, the Balkans, Central Asia, the Former Soviet Union, and the Middle East. In the past two years, she and ICSVE staff have been collecting interviews with ISIS defectors, returnees, and prisoners, studying their trajectories into and out of terrorism and their experiences inside ISIS, as well as developing the Breaking the ISIS Brand Counter Narrative Project materials from these interviews. She has been training key stakeholders in law enforcement and intelligence, educators, and other countering violent extremism professionals, both locally and internationally, in the use of counter-narrative messaging materials produced by ICSVE, as well as studying the use of children as violent actors by groups such as ISIS and consulting on how to rehabilitate them. In 2007, she was responsible for designing the psychological and Islamic challenge aspects of the Detainee Rehabilitation Program in Iraq, applied to 20,000+ detainees and 800 juveniles. She is a sought-after counterterrorism expert and has consulted to NATO, the OSCE, foreign governments, the U.S. Senate and House, the Departments of State, Defense, Justice, Homeland Security, and Health & Human Services, the CIA, and the FBI, and has appeared on CNN, BBC, NPR, Fox News, MSNBC, and CTV, and in Time, The New York Times, The Washington Post, the London Times, and many other publications. She regularly speaks and publishes on the topics of the psychology of radicalization and terrorism and is the author of several books, including Talking to Terrorists, Bride of ISIS, Undercover Jihadi, and ISIS Defectors: Inside Stories of the Terrorist Caliphate.
Her publications can be found at https://georgetown.academia.edu/AnneSpeckhard and on the ICSVE website: https://www.icsve.org. Follow @AnneSpeckhard
Lorand Bodo, MA is a research fellow at ICSVE looking at ISIS social media use and testing ISIS defector video clips with those who endorse ISIS. He was responsible for implementing the technical aspect of the research design for this study. Lorand has carried out experimental interventions with ISIS endorsers, supporters and dispensers of ISIS propaganda on Facebook, infesting their territory with ICSVE ISIS defector counter-narratives with good success. He is currently pursuing a four-year integrated Ph.D. at the Security and Crime Science Department at University College London. His research focuses on identifying and examining online radicalizing settings on social media platforms. He graduated with distinction in Governance and International Politics (MA) from Aston University and is about to finish his second M.A. degree in Politics, with a focus on the Governance of Complex Innovative Technological Systems from the University of Bamberg. In addition, he is currently working for the OSINT research team at the International Centre for Security Analysis at King’s College London, where he conducts research on OSINT and SOCMINT with a focus on various security-related issues globally. Lorand is fluent in Hungarian, German and English.
Reference for this article: Anne Speckhard, & Lorand Bodo (March 8, 2018) Fighting ISIS on Facebook—Breaking the ISIS Brand Counter-Narratives Project, ICSVE Research Reports
[] Joel Wing, “How is the Islamic State Dealing with its ISIS Defeat in Mosul? Interview with Charlie Winter on ISIS Media Output,” Musings on Iraq, March 7, 2017; URL: http://musingsoniraq.blogspot.co.uk/2017/03/how-is-islamic-state-dealing-with-its.html.
[] Craig Whiteside, “New Masters of Revolutionary Warfare,” Perspectives on Terrorism 10, no. 4 (2016); URL: http://www.terrorismanalysts.com/pt/index.php/pot/article/view/523/html; Charlie Winter, “Documenting the Virtual ‘Caliphate’,” Quilliam, October 2015; URL: http://www.quilliaminternational.com/wp-content/uploads/2015/10/FINAL-documenting-the-virtual-caliphate.pdf.
[] J.M. Berger and Jonathon Morgan, “The ISIS Twitter Census: Defining and Describing the Population of ISIS Supporters on Twitter,” Brookings, Center for Middle East Policy, March 2015; URL: https://www.brookings.edu/wp-content/uploads/2016/06/isis_twitter_census_berger_morgan.pdf.
[] J.M. Berger and Heather Perez, “The Islamic State’s Diminishing Returns on Twitter: How Suspensions Are Limiting the Social Networks of English-Speaking ISIS Supporters,” George Washington University, February 2016; URL: https://cchs.gwu.edu/sites/cchs.gwu.edu/files/downloads/Berger_Occasional%20Paper.pdf.
[] Rachel Briggs and Sebastien Feve, “Review of Programs to Counter Narratives of Violent Extremism: What Works and What Are the Implications for Government,” Institute for Strategic Dialogue, 2013; p. 6.
[] Kurt Braddock and James Dillard, “Meta-Analytic Evidence for the Persuasive Effect of Narratives on Beliefs, Attitudes, Intentions, and Behaviors,” Communication Monographs 83, no. 4 (2016): p.18, doi:10.1080/03637751.2015.1128555.
[] Davies et al., “Toward a Framework Understanding of Online Programs for Countering Violent Extremism,” Journal for Deradicalization, no. 6 (2016): p. 51; URL: http://journals.sfu.ca/jd/index.php/jd/article/view/43/38.
[] Kurt Braddock and John Horgan, “Towards a Guide for Constructing and Disseminating Counter-Narratives to Reduce Support for Terrorism,” Studies in Conflict & Terrorism 39, no.5 (2015): p. 397, doi:10.1080/1057610X.2015.1116277.
[] Anne Speckhard, “De-Legitimizing Terrorism: Creative Engagement and Understanding of the Psycho-Social and Political Processes Involved in Ideological Support for Terrorism,” Journal of Democracy and Security 3, no. 3 (2007): 251 – 77.
[] Anne Speckhard, “The Hypnotic Power of ISIS Imagery in Recruiting Western Youth,” ICSVE Research Reports, May 2015; URL: https://www.icsve.org/research-reports/the-hypnotic-power-of-isis-imagery-in-recruiting-western-youth/; Anne Speckhard, “Brides of ISIS: The Internet Seduction of Western Females into ISIS,” Homeland Security Today 13, no. 1 (Dec/January 2016): 38-40; URL: http://edition.pagesuiteprofessional.co.uk//launch.aspx?eid=0d492b24-092f-4b2c-8132-b3a895356fc8; Anne Speckhard, “The Militant Jihadi Message Propagated by ISIS is a Contagiously Virulent Meme in the West—the Ebola of Terrorism,” Multi-Method Assessment of ISIL (2015); URL: https://www.researchgate.net/publication/271195840_The_Militant_Jihadi_Message_Propagated_by_ISIS_is_a_Contagiously_Virulent_Meme_in_the_Westthe_Ebola_of_Terrorism; Anne Speckhard, “The Lethal Cocktail of Terrorism: The Four Necessary Ingredients That Go into Making a Terrorist & Fifty Individual Vulnerabilities/Motivations That May Also Play a Role,” International Center for the Study of Violent Extremism: Brief Report (February 25, 2016); URL: https://www.icsve.org/brief-reports/the-lethal-cocktail-of-terrorism/
[] Anne Speckhard, Ardian Shajkovci, and Ahmet S. Yayla, “Defeating ISIS on the Battle Ground as well as in the Online Battle Space: Considerations of the ‘New Normal’ and Available Weapons in the Struggle Ahead,” Journal of Strategic Security 9, no. 4 (2016): 1-10, doi: http://dx.doi.org/10.5038/1944-04126.96.36.1990; Anne Speckhard, Ardian Shajkovci, and Ahmet S. Yayla, “Following a Military Defeat of ISIS in Syria and Iraq: What Happens Next after the Military Victory and the Return of Foreign Fighters,” Journal of Terrorism Research 8, no. 1 (2017): 81-89, doi: http://doi.org/10.15664/jtr.1341; Anne Speckhard and Ahmet S. Yayla, ISIS Defectors: Inside Stories of the Terrorist Caliphate.
[] See for example: Anne Speckhard and Mubin Shaikh, Undercover Jihadi: Inside the Toronto 18-Al Qaeda Inspired, Homegrown Terrorism in the West (McLean, VA: Advances Press, 2014) and Morten Storm, Tim Lister, and Paul Cruickshank, Agent Storm: My Life Inside al Qaeda and the CIA (New York City, NY: Grove Press, 2015).
[] Matthew Purdy and Lowell Bergman, “Where the Trail Led: Between Evidence and Suspicion; Unclear Danger: Inside the Lackawanna Terror Case,” The New York Times, October 12, 2003; URL: http://www.nytimes.com/2003/10/12/nyregion/where-trail-led-between-evidence-suspicion-unclear-danger-inside-lackawanna.html.
[] Patrick Poole, “Justice Department Enlists ‘Reformed’ ISIS Fighter in Risky Deradicalization Scheme,” PJ Media, March 10, 2017; URL: https://pjmedia.com/homeland-security/2017/03/10/justice-department-enlists-reformed-isis-fighter-in-risky-deradicalization-scheme/.
[] Justin Siberell, Acting Coordinator for Counterterrorism in the Bureau of Counterterrorism at the U.S. Department of State, remarks dated May 2016; URL: https://2009-2017.state.gov/j/ct/rls/rm/257726.htm.
[] All ISIS defectors were briefed regarding this project, and all went through a human subject ethical consent process. The defectors did not give their real names except for those in prison or already prosecuted, and they were told not to incriminate themselves during the interviews. Some individuals did not want to provide their names or give a signed consent; they gave only verbal consent. The sample was collected between November 2015 and March 2017. Interviews were conducted in a semi-structured manner, allowing the defectors to tell their stories of being recruited into the group, serving, and then defecting, followed by in-depth questioning on topics with which they had personal experience inside the group. The defectors were not asked to give their names and were judged to be genuine on the basis of four things: referral from prison authorities and prosecution records, referral from defectors who knew them from inside the group, insightful knowledge about experiences inside the group, and intense posttraumatic responses during the interview evidencing that they had been present at and taken part in the events they were describing. The subjects were contacted via smugglers, other defectors, personal introductions, and prison authorities. In this regard, the sample is entirely nonrandom.
[] See also Allison McDowell-Smith, Anne Speckhard, and Ahmet Yayla, “Beating ISIS in the Digital Space: Focus Testing ISIS Defector Counter-Narrative Videos with American College Students,” Journal for Deradicalization, Spring, no. 10 (2017): 50-76; URL: http://journals.sfu.ca/jd/index.php/jd/article/view/83/73; Anne Speckhard and Ardian Shajkovci, “Confronting an ISIS Emir: ICSVE’s Breaking the ISIS Brand Counter-Narrative Videos,” ICSVE Research Reports, May 29, 2017; URL: https://www.icsve.org/research-reports/confronting-an-isis-emir-icsves-breaking-the-isis-brand-counter-narrative-videos/.
[] Ross Frenett and Moli Dow, “One to One Online Interventions: A Pilot CVE Methodology,” Institute for Strategic Dialogue, 2016; URL: https://www.strategicdialogue.org/wp-content/uploads/2016/04/One2One_Web_v9.pdf.
[] Luke Bertram, “Terrorism, the Internet and the Social Media Advantage: Exploring How Terrorist Organizations Exploit Aspects of the Internet, Social Media, and How These Same Platforms Could Be Used to Counter Violent Extremism,” Journal for Deradicalization, no. 7 (2016): p. 244; URL: http://journals.sfu.ca/jd/index.php/jd/article/view/63/58.
[] United Nations Office on Drugs and Crime, “The Use of the Internet for Terrorist Purposes,” 2012; URL: https://www.unodc.org/documents/frontpage/Use_of_Internet_for_Terrorist_Purposes.pdf; Susannah R. Stern, “Encountering Distressing Information in Online Research: A Consideration of Legal and Ethical Responsibilities,” New Media and Society 5 (2003): 249-266; Davis et al., “Fostering Cross-Generational Dialogues About the Ethics of Online Life,” Journal of Media Literacy Education 2 (2010): 124–150; Annette Markham and Elizabeth Buchanan, “Ethical Decision-Making and Internet Research,” 2012; URL: http://aoir.org/reports/ethics2.pdf; The British Psychological Society, “Ethics Guidelines for Internet-Mediated Research,” 2014; URL: http://www.bps.org.uk/system/files/Public%20files/inf206-guidelines-for-internet-mediated-research.pdf
[] Anne Speckhard, Ardian Shajkovci, Lorand Bodo, and Haris Fazliu, “Bringing Down the Digital Caliphate: A Breaking the ISIS Brand Counter-Narratives Intervention with Albanian Speaking Facebook Accounts,” International Center for the Study of Violent Extremism, 2018; URL: https://www.icsve.org/research-reports/bringing-down-the-digital-caliphate-a-breaking-the-isis-brand-counter-narratives-intervention-with-albanian-speaking-facebook-accounts/