Defence & Security

Terrorist Exploitation of Social Media Platforms

Published on
February 20, 2024

Since the advent of the internet and social media (SoMe) platforms like Facebook, a growing array of internet-based interactive technologies has allowed the creation of virtual communities in which ideas can be expressed and shared (Kietzmann & Hermkens, 2011). The creation and sharing of content produces ‘User Generated Content’ (UGC) (Obar & Wildman, 2015); what began as simple chat groups has steadily gained the ability to host images and video. We now have a wide variety of SoMe platforms, such as Facebook, X (formerly Twitter), Telegram, Signal, Instagram, Snapchat, TikTok, and gaming platforms.

UGC has transformed journalism, moving control from a select few broadcasting agencies into the hands of anyone with a smartphone and a SoMe account. Global usage of SoMe continues to grow, reaching 4.9 billion users in 2023 (Bottorff & Wong, 2023). Young people are the most active users: 84% of Americans aged 18-29 use SoMe (Newberry, 2023). Whilst these mediums have provided society with many benefits – bridging connections between communities, improving learning and reducing loneliness – as with most technological advances they also aid more nefarious actors.

Despite the term terrorism being widely used, there is still no internationally agreed definition. To date, the debate has produced hundreds of definitions and the discussion continues (Ganor, 2002; Schmid, 2011). A common, internationally accepted definition is almost impossible to achieve, largely because ‘one man’s terrorist is another man’s freedom fighter’ (Laqueur, 1987).

Further debate arises as to whether state actors also fall under this definition. In accordance with Ganor (2002) and Schmid (2011), however, state actors are bound by international laws such as the Geneva and Hague Conventions, and other terms, such as ‘war crimes’, apply to them. Terrorism is therefore defined here as ‘non-state actors using violence against civilians to incite fear among a target audience in order to influence far more distant actors to achieve a political goal’ (Taylor, 1988; Schmid, 2011). Although this definition excludes state actors from actively engaging in terrorism, it does not preclude nation states from sponsoring it – ‘state-sponsored terrorism’ – as with Iran’s support for Hezbollah and Hamas (Schmid, 2023).

Unprecedented broadcasting potential

Broadcasting is key for terrorist groups: without the ability to convey the group’s agenda to a wider audience – in order to gain recognition or attention – it would struggle to instil fear and promote its message (Crenshaw, 2019; White, 2020; Marsden, 2012). Not surprisingly, every advance in communication has aided terrorism over time, from the printing press, to radio, and now SoMe (Rapoport, 2001). Rapoport’s Four Wave theory attributes the growth of terrorism in the 19th century to advances in communication, with globalisation and the further evolution of communication greatly facilitating the Fourth Wave of terrorism and aiding its spread internationally (Rapoport, 2001).

SoMe’s global reach and UGC further enhance the broadcasting capabilities of terrorists. Technological advances now make it possible to live-stream directly to audiences worldwide, as seen in the Christchurch shootings in 2019 (Leitch & Pickering, 2022) and, more recently, in the October 7th Hamas attacks on Israel (Baja, 2023).

These events rapidly transition from SoMe platforms to traditional media, further publicising them and increasing sentiments of fear and terror. Although traditional news outlets such as the BBC, CNN and Sky News broadcast content derived from SoMe feeds, they are bound by codes of ethics (BBC, 2022) requiring less biased reporting and the moderation of graphic material, in an attempt to reduce the negative effects of reporting (White, 2020). No such regulations apply on SoMe, resulting in the proliferation of highly graphic, highly emotive and biased content.

The psychology of terrorism

Research into the psychology of terrorists demonstrates that, rather than being disturbed individuals with mental disorders, terrorists appear to be normal, rational individuals whose decision-making is based on ‘rational thought’ (Victoroff, 2005; Horgan, 2017; Silke, 1998). This signifies that they consider their acts rational according to their existing cognitions and thoughts; a belief system is acquired and reinforced by their social environment (Taylor & Horgan, 2006). Radicalisation is the term used to describe the process by which one becomes involved in terrorism, and it bears stark similarities to recruitment into cults and sects (Stark & Bainbridge, 1980).

It is a gradual, complex and intertwining process influenced by social contexts (Horgan, 2017; Schmid, 2011), involving a gradual accumulation of experiences, exposure to extremist content and, ultimately, commitment to the group (Marsden, 2012); it is viewed as a layered process, a ‘staircase’ to terrorism (Moghaddam, 2005). The ‘Two-Pyramids Model’ (McCauley & Moskalenko, 2017) describes individuals who experience a grievance in their social environment – whether actual or perceived – and are subsequently exposed to extremist ideology that is then reinforced by members of their community (Youngblood, 2020).

Social conformity – the innate desire to copy the actions of others and align oneself with group norms – has proven to be a highly influential factor within society (Prade & Saroglou, 2023), and research demonstrates that it powerfully shapes the social learning of deviant behaviour in adolescence (Hundeide, 2003). Informal social learning environments, known as ‘communities of practice’, permit members to discuss topics of interest and share knowledge, creating an environment of collective learning. Aligning the goals, ideologies and social conduct of the group and its members results in a “community created over time” (Wenger, 1988) and in shared ideologies: mental frameworks of thoughts and ideas about the world that aid understanding of the social environment and how to manoeuvre within it (Hall, 1985).

SoMe platforms are virtual replicas of communities of practice, providing individuals with the ability to share, discuss, comment on, like and dislike information, mimicking the informal social spaces of collective learning and reinforcement of ideals (Youngblood, 2020; Chiovaro, et al., 2021; Prade & Saroglou, 2023). The global reach of the internet means these communities are no longer constrained by geographical boundaries.

These decentralised networks of connections have also aided the contagion effect of social and political ideas (Guilbeault, et al., 2018; Chiovaro, et al., 2021), as seen in the #MeToo and #BlackLivesMatter movements and the 2010 Arab Spring (Chiovaro, et al., 2021). To take effect, contagions require multiple exposures and are highly influenced by peers and the social environment (Guilbeault, et al., 2018). The contagion effect within SoMe has also fuelled more sinister movements, such as Facebook’s (now Meta) role in the genocide of the Rohingya people in Myanmar (Amnesty International, 2022). The platform was used to disseminate hate speech and fake news and to incite violence against the Rohingya (Bundtsen, 2022; Milmo, 2021). Although SoMe was not the main cause of the genocide, the contagion effect of hateful material proved to be an extremely influential factor.
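To make the ‘multiple exposures’ requirement concrete, the short sketch below simulates a complex contagion on a toy network: a node adopts an idea only once a sufficient fraction of its neighbours have done so. The network, seed nodes and threshold value are invented for illustration; this is a minimal model in the spirit of the complex-contagion literature cited above, not a reproduction of any study’s code.

```python
# Minimal complex-contagion sketch (illustrative only): unlike a simple
# contagion, a node adopts an idea only once at least `threshold` of its
# neighbours have adopted it, mirroring the "multiple exposures" requirement.
def simulate_contagion(adjacency, seeds, threshold=0.3, max_steps=10):
    adopted = set(seeds)
    for _ in range(max_steps):
        newly_adopted = set()
        for node, neighbours in adjacency.items():
            if node in adopted or not neighbours:
                continue
            exposure = sum(n in adopted for n in neighbours) / len(neighbours)
            if exposure >= threshold:
                newly_adopted.add(node)
        if not newly_adopted:
            break  # the contagion has stalled
        adopted |= newly_adopted
    return adopted

# Hypothetical toy network: two seeds in a tight cluster can carry the
# idea across the whole graph because repeated exposures accumulate.
network = {
    "a": ["b", "c", "d"], "b": ["a", "c"], "c": ["a", "b", "d"],
    "d": ["a", "c", "e"], "e": ["d", "f"], "f": ["e"],
}
print(simulate_contagion(network, seeds={"a", "b"}))
```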

Radicalisation process

Research supports the notion that SoMe has influenced the radicalisation process: terrorists use these platforms for social networking, recruiting, and sourcing and sharing information, and the platforms therefore act to reinforce ideals (Gill, et al., 2017; Youngblood, 2020). This suggests that SoMe platforms serve to reinforce existing beliefs (Shaban, 2020). However, radicalisation rarely occurs solely online and is usually supported by physical meetings (Taylor & Horgan, 2006; Gill, et al., 2017). Youngblood (2020) explains how radicalisation and extremist ideologies spread like complex social contagions requiring multiple exposures, with both SoMe usage and group membership enhancing that process. Yet cases of isolated ‘self-radicalisation’ have emerged, as with the Choudhry case (Pearson, 2016).

Contagion effects on SoMe platforms are enhanced by user-based algorithms that serve personalised content based on previous usage and viewing (Bundtsen, 2022). This produces ‘echo chambers’ in which users are constantly shown similar content and counterarguments are rarely seen, further alienating opposing views and creating silos that reinforce beliefs and biases, increase intolerance and hatred, and ultimately entrench the ideology (Youngblood, 2020).

Some SoMe communities further enhance the echo-chamber effect by silencing opposing views, curtailing freedom of speech (Leitch & Pickering, 2022). Engagement-based algorithms suggest the content that attracts the most views (Bundtsen, 2022), and research has shown that engagement-based ranking promotes borderline and harmful content, because users engage more – sharing, liking and so on – with borderline material (Bundtsen, 2022; Leitch & Pickering, 2022). Social proof suggests that the more a message is repeated, the more it is accepted as true (Cialdini, 1993): when people see many similar posts, shares and likes, those views are validated and users are influenced.
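As a toy illustration of why ranking purely by engagement favours provocative material (the posts, scores and weights below are invented for the example; no platform’s actual algorithm is public), consider a feed ordered solely by an engagement score:

```python
# Toy engagement-based ranker (an illustrative assumption, not any real
# platform's algorithm): posts are ordered purely by predicted engagement,
# so emotive "borderline" items float to the top of the feed.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Shares and comments weighted above likes as stronger engagement
    # signals; the weights are invented for this example.
    return post.likes + 3 * post.shares + 2 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", likes=40, shares=2, comments=5),
    Post("Outrage-bait borderline post", likes=90, shares=60, comments=45),
    Post("Local community notice", likes=25, shares=1, comments=3),
])
for post in feed:
    print(engagement_score(post), post.text)
```

Because borderline material reliably attracts the most shares and comments, a ranker like this surfaces it first, which in turn generates more engagement: a feedback loop with no editorial judgement anywhere in it.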

Unfortunately, SoMe platforms have not fully disclosed how these algorithms work, and researchers are still studying their effects (Whittaker, 2022; Bundtsen, 2022; Gonzalez-Bailon, et al., 2014). There is little doubt, however, that they have shaped and reinforced users’ biases and cognitions, thereby aiding radicalisation (White, 2020; Youngblood, 2020; Whittaker, 2022). With youth being the most prolific users of SoMe, the effects of echo chambers are likely to be amplified: young people are still forming their identities, are highly susceptible to peer influence, and are often not yet well versed in critical thinking. More research is required, however, to determine the effects of SoMe and its algorithms on different demographics (Evans & Williams, 2022).

Aiding terrorist recruitment

Recruitment is a key element of terrorism. A myriad of terrorist organisations exists, with a wide variety of political causes, social constructs, norms and members. Roles within terrorist groups are equally diverse – fundraising, recruitment, logistics, administration and planning, as well as actual involvement in violent acts – and, much as in corporate organisations, recruitment within terrorist groups mimics that of corporations (Hunter, et al., 2017). Far from being unstable and poorly educated, terrorists are often educated and intelligent individuals (Horgan, 2017; Schmid, 2023), with different skills and personality traits suiting different roles, such as those of leaders versus suicide bombers (Victoroff, 2005).

SoMe has proven to be a useful tool for terrorists to spread their ideas and proliferate propaganda at a global scale, providing a wider platform for radicalisation and recruitment, attracting new members and furthering their objectives (Youngblood, 2020; Gill, et al., 2017). Terrorists use tailored propaganda to enhance the effectiveness of SoMe campaigns (White, 2020; Bundtsen, 2022): ISIS, for example, proved extremely skilled at creating targeted recruitment material for different audiences in order to attract foreign fighters (Pearson, 2016; Shaban, 2020), much like marketing campaigns (Hunter, et al., 2017).

Lone actors defy traditional research into the psychology of terrorists, displaying higher rates of personality disorders and often being marginalised within their society. They are inspired individuals who identify with a particular terrorist group but are not formally part of the organisation and act on their own (Horgan, 2017); they consume large volumes of extremist content and interact with extremist communities online, which strongly influences radicalisation (Youngblood, 2020; White, 2020; Gill, et al., 2017; Leitch & Pickering, 2022). Terrorist groups use SoMe to motivate these marginal members with propaganda and actively encourage such attacks (Whittaker, 2022), as seen with ISIS propaganda (Shaban, 2020; Pearson, 2016).

This proves an effective means of attack because of its spontaneity and because it requires no direct involvement in planning, organising or funding. Terrorist groups such as ISIS and Hamas appear to have adopted this strategy, calling for uprisings in other locations and making use of marginalised individuals who would not usually be accepted into the construct of the group. Studies demonstrate that marginal individuals – those perceived by the group as not fully exemplifying its attributes – are more likely to perform extreme acts in order to prove their commitment and beliefs (Hogg, 2021). Marginalisation may explain the extreme violence committed by lone actors, and possibly why foreign fighters tend to be the most extreme, such as Jihadi John and Sally Jones within ISIS.

Lacklustre content moderation

Despite attempts by SoMe giants such as Meta (Facebook), Google and YouTube to prevent the upload of violent content, or to remove it and halt its proliferation, their moderation methods are still being refined and are not always accurate or timely (Bundtsen, 2022; Leitch & Pickering, 2022; Whittaker, 2022; Gilbert, 2023).

Additionally, the emergence of alternative SoMe platforms with less stringent regulations, such as Telegram, allows content restrictions to be bypassed and material to be disseminated (Evans & Williams, 2022; Leitch & Pickering, 2022; Whittaker, 2022), making these platforms extremely popular with terrorists and other nefarious actors (Littell & Starck, 2023; White, 2020; Leitch & Pickering, 2022). Once footage is posted online it can be downloaded and circulated on chat platforms such as WhatsApp, Signal and Telegram, meaning content can continue to spread even after the original post has been taken down. Terrorist groups are also adapting their content to be borderline, and so less likely to be blocked or taken down (Whittaker, 2022; Evans & Williams, 2022).
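One reason taken-down footage keeps resurfacing lies in how re-upload detection works. The sketch below is a deliberately simplified illustration using exact hashes; real platforms use perceptual hashes that tolerate re-encoding, shared through consortia such as GIFCT, and none of that machinery is reproduced here.

```python
import hashlib

# Simplified re-upload detection sketch (illustrative only): platforms keep
# fingerprints of removed content and block uploads that match. Real systems
# use perceptual hashes that survive re-encoding; this exact-match version
# shows why even small alterations let copies slip through.
known_violent_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def register_removed_content(content: bytes) -> None:
    # Called when moderators take down a video: remember its fingerprint.
    known_violent_hashes.add(fingerprint(content))

def should_block_upload(content: bytes) -> bool:
    return fingerprint(content) in known_violent_hashes

original = b"<video bytes of removed footage>"
register_removed_content(original)
print(should_block_upload(original))            # True: identical copy caught
print(should_block_upload(original + b"\x00"))  # False: trivially altered copy evades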

The current Hamas-Israel conflict further demonstrates how terrorist groups strategically employ SoMe, and the significant gains this can achieve. On October 7th, Hamas exploited both the broadcasting power of live video and the lack of content moderation on Telegram to stream its attacks live (Cortellessa, 2023). It further exploited SoMe by filming victims and posting the footage on the victims’ own SoMe accounts (Frenkel & Minsberg, 2023), thereby reaching a targeted audience, multiplying broadcasting channels, and dramatically amplifying fear within the population and abroad. Simultaneously, groups and countries opposed to the state of Israel celebrated the events (Cortellessa & Bergengruen, 2023). Hours of terror resulted in the murder of 1,400 Israelis – civilians, women, children and babies – perpetrated with horrific acts of mutilation and rape (Gettleman, et al., 2023; Cortellessa, 2023), and the taking of over 200 hostages.

Given Israel’s past harsh responses to attacks, including mass bombings such as those of 2014 (UNRWA, n.d.), such an act would no doubt trigger a brutal response, and one can only assume that Hamas expected this. The Israel Defense Forces (IDF) have long claimed that Hamas uses its own civilians as human shields by placing military targets in hospitals and schools; when those are targeted by Israel, the subsequent deluge of graphic images is used to accuse Israel of ‘war crimes’.

Recent footage from the IDF appears to support this notion (TBN Israel, 2024). A year before the October 7th attacks, Hamas created some 40,000 online accounts that remained inactive until the day of the attack, when they began proliferating propaganda material (Cortellessa, 2023). This suggests pre-planned use of SoMe platforms to shape the information environment. It has also been found that 25% of the online conversations emanated from fake accounts (Cortellessa, 2023), and accounts linked to China, Russia and Iran have added to the deluge of posts by sharing and amplifying content, including anti-Western propaganda (Linvill & Warren, 2023; Cortellessa & Bergengruen, 2023; Li, 2023).

Misinformation

UGC lacks the ethical codes and training that govern journalists in traditional media, making SoMe more vulnerable to the spread of misinformation and deepfakes. As in other conflicts, misinformation is rampant: images from other wars such as the bombings in Syria, incorrect subtitles, and Artificial Intelligence (AI) generated material abound in pro-Palestinian propaganda (Associated Press, 2023; Roscoe, 2023; Cortellessa & Bergengruen, 2023).

Videos that are short, graphic and highly emotional further engender feelings of grievance in viewers (Li, 2023; Gill, et al., 2017). Despite attempts to fact-check, traditional media companies – already struggling to keep up with the pace of global news cycles (White, 2020) and now further burdened by UGC on SoMe – occasionally air unsubstantiated information from SoMe, further validating the misinformation. This occurred with the reporting of the explosion in the Al-Ahli hospital car park (Herman, 2023), when the BBC and other channels rushed to publish a report from Hamas’ Telegram channel accusing Israel; this had grave political consequences and prevented a meeting with the US Secretary of State, Antony Blinken (Sullivan, et al., 2023). Analysis from Israeli and international sources supports the conclusion that it was a failed rocket launched from Gaza (Human Rights Watch, 2023).

Online fact checking

Online fact-checking applications have only recently become available, and the onus rests on the individual. The sheer volume of information on our devices and the multitude of SoMe platforms, along with busy and complex lives, leave limited time for fact-checking. These conditions, heightened by emotion, trigger the peripheral route of information processing, which relies on cognitive biases and simple cues, so information is less scrutinised on its merits (Shi, et al., 2018). The rise of misinformation, AI and deepfakes is a general concern to security forces across the West and NATO, as this has been a tactic used by Russia and China to sow discord within Western societies (Linvill & Warren, 2023; Littell & Starck, 2023; Li, 2023; Green, 2023).

The deluge of graphic imagery, enhanced by misinformation and deepfakes and amplified by fake accounts and echo-chamber effects, has created a contagion of pro-Palestine protests globally, with shared slogans (Li, 2023), supporting the notion that the perceived impact of crackdown and subjugation motivates moderates to support extremist groups (Marsden, 2012). The protests have pressured leaders of Western governments, usually aligned with Israel, to advise restraint (Cortellessa, 2023), and societal and political pressure on Israel ultimately aids Hamas.

Despite having committed an atrocious terrorist attack involving heinous crimes, Hamas has, with the aid of SoMe, managed to build support by invoking the emotions of the public. It would appear that although Israel may be winning the military war, it could be losing the information war.

Planning targets via social media

Research demonstrates that SoMe platforms have been useful sources of information for target selection and attack planning (Gill, et al., 2017; Youngblood, 2020). Online shooter games are thought to have strongly influenced the live-streaming of the 2019 Christchurch shooting, as the shooter’s first-person perspective closely resembled that of online shooting games (Evans & Williams, 2022; Leitch & Pickering, 2022). This resemblance to actual gaming footage raises an additional concern, as such material can sometimes bypass the automated content-moderation algorithms on SoMe platforms (Leitch & Pickering, 2022; Evans & Williams, 2022).

New SoMe platforms enhanced by AI are providing additional features such as photo geolocation (Ortiz, 2023) and facial recognition (Mohanakrishnan, 2021); these will no doubt prove valuable tools for terrorists in future. The forthcoming metaverse will allow people to immerse themselves in museums and prominent cities: Genoa has already produced a metaverse (ETT Solutions, 2022) and Monaco is currently producing its own (Monte-Carlo Societe Des Bains de Mer, 2023). The ability to virtually tour a building, walk the streets and enter shops – in an exact replica of the real location – will permit terrorists to gather valuable information, plan, and even train for attacks without needing to be physically present.

Impact on radicalisation

SoMe aids terrorism in a number of ways, from globally amplifying broadcasting abilities to propagating violent and extremist content. Because radicalisation depends on the social reinforcement of ideals, it is no surprise that SoMe has affected radicalisation by enabling multiple exposures and social reinforcement of extremist content. SoMe has become an important staging ground for disseminating recruitment propaganda and reinforcing ideals.

Inciting lone actors through SoMe has also proved useful to terrorist groups, as coordination and funding no longer need to be centralised, making attacks spontaneous, difficult to detect and more fear-inducing. Although not the sole factor, SoMe has indisputably proven to be a reinforcing factor in radicalisation and recruitment. And although research suggests radicalisation rarely happens online alone, some such cases have occurred, adding to concerns. Meanwhile, virtual communities of practice and information sharing can enhance the radicalisation process by providing multiple avenues for the reinforcement of ideals.

Anonymity encourages exploitation

The anonymity of the web permits the creation of bots and fake accounts, which exploit SoMe algorithms and echo chambers to disseminate information at rapid, global scale, thereby enhancing the power of social persuasion over a target audience (tweets, retweets, social proof and so on). This tactic has been used by malign states, such as Russia and China, to shape public opinion (Green, 2023; Li, 2023; Linvill & Warren, 2023; Littell & Starck, 2023), and is known as Information Operations.

As seen in the October 7th Hamas attack, terrorist groups appear to have adopted this strategy, with profound effects. Malign states have further propagated the propaganda through their virtual armies of bots, fake accounts and influencers, furthering their aim of inciting discord and polarisation within our societies. Despite efforts by SoMe giants (Meta, Google, X etc.) to restrict this content – making it harder for terrorist groups to set up accounts and transmit information – terrorists adapt by producing borderline content or by moving to alternative platforms with less stringent regulation, private chat groups, and even the dark web, all of which are harder to monitor.

Improvements are still required in capturing affiliated accounts and in recognising extremist, or even borderline, content. With increasing access to, and improvement of, AI, it will be imperative to find mitigating measures, and the software industry could aid the fight against deepfakes and misinformation by improving the speed, accuracy and availability of fact-checking programs. If the provenance of content could be established from cradle to grave – providing visibility of the data and any amendments made along the way, through a user-friendly interface – it might go some way towards users regaining trust in the information they view and towards reducing the spread of misinformation.
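As a rough sketch of what such cradle-to-grave provenance could look like (an illustrative assumption, loosely inspired by content-credential schemes such as C2PA rather than an implementation of any existing standard), each edit could append a record whose hash covers both the new content and the previous record, so that any retroactive tampering breaks the chain:

```python
import hashlib
import json
import time

# Illustrative provenance chain (an assumption for this sketch, not an
# existing standard): each record's hash covers the content fingerprint
# AND the previous record, so altering any step invalidates all later ones.
def _digest(payload: dict) -> str:
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, content: bytes, action: str, author: str) -> None:
    body = {
        "action": action,
        "author": author,
        "timestamp": time.time(),
        "content_hash": hashlib.sha256(content).hexdigest(),
        "prev_hash": chain[-1]["hash"] if chain else None,
    }
    chain.append({**body, "hash": _digest(body)})

def verify_chain(chain: list) -> bool:
    prev_hash = None
    for record in chain:
        body = {k: v for k, v in record.items() if k != "hash"}
        if record["prev_hash"] != prev_hash or record["hash"] != _digest(body):
            return False
        prev_hash = record["hash"]
    return True

chain: list = []
append_record(chain, b"original photo bytes", "captured", "photographer@example")
append_record(chain, b"cropped photo bytes", "cropped", "editor@example")
print(verify_chain(chain))           # True: edit history intact
chain[0]["author"] = "someone-else"  # retroactive tampering...
print(verify_chain(chain))           # False: the chain no longer verifies
```

A viewer could then display the full, verified edit history next to a photo or video; the hard problems – key management, adoption, and stripping of metadata on re-upload – are exactly what such schemes are still working through.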

Political influence

It appears that terrorism – like modern warfare – has moved decisively into the information space, using SoMe to influence political agendas and to radicalise, recruit and mobilise individuals globally. Conflict is waged not only on the battlefield but also in people’s minds. SoMe has become an important staging ground for nefarious actors to spread extremist content, fake news and misinformation in order to polarise societies. The advent of AI and deepfakes will make authenticity harder to discern, further compounding these effects of influence.

The complex natures of both terrorism and SoMe make their interaction one of the most difficult subjects to study, operating as it does on multiple levels, and researchers continue to examine it (Bundtsen, 2022; Evans & Williams, 2022). Algorithms are proprietary information, so some restriction of outside researchers’ access may be unavoidable. This, together with the immensity of the available data, contributes to varying standards of research design and unreliable data (Gonzalez-Bailon, et al., 2014); implementing auditing standards would facilitate the analysis of findings, and researchers should account for this bias in their work (Gonzalez-Bailon, et al., 2014). A better understanding of how algorithms work, and how they vary across SoMe companies, may provide insight into how they might be amended to serve users more balanced content and so avoid the echo-chamber effect.

Conclusion

Constant technological advances and the emergence of new SoMe platforms, combined with the innovation of terrorist groups, suggest that eradicating terrorist propaganda on SoMe will be almost impossible. Finding alternative ways to combat the polarising effects of SoMe is therefore essential: students would benefit from age-appropriate lessons in critical thinking when navigating the internet and SoMe, training them to recognise and check for misinformation and deepfakes, and informing them of SoMe’s polarising effects and of ways to avoid them.

By building resilience, we can hope to diminish the effects of nefarious manipulation and influence through SoMe content. Children are accessing phones, devices and SoMe platforms at ever younger ages, so it is imperative that critical thinking is taught alongside internet and identity safety. In addition to its broadcasting, radicalisation and recruitment advantages, SoMe also offers terrorists an important information-gathering and learning capability.

With further technological developments, platforms such as the metaverse will only amplify the ability to learn and train using SoMe, further enhancing its utility to terrorists. SoMe has proven useful across multiple domains, and just as technological advances throughout history have aided terrorism, so those of the future will continue to do so, constantly changing the dynamics of terrorist activity and thus of counter-terrorism measures.

Written by
Mercedes Le Carpentier
Mercedes is Director of Exsel Group & Explora Security and a Research Fellow and Trustee of the Explora Foundation. For 20 years, Mercedes has been at the forefront of research affecting how soldiers operate in the field, focused on situational awareness, cyber security and physical protection. Mercedes specialises in psychology and in-depth understanding of cultures, and provides unique lectures to NATO on the psychological underpinnings of communication in multi-cultural contexts. Mercedes is currently pursuing a Masters in Terrorism and Counter Terrorism at Royal Holloway, University of London.