‘Antisemitism in the Digital Age: Online Antisemitic Hate, Holocaust Denial, Conspiracy Ideologies and Terrorism in Europe’

Posted Fri, 11/05/2021 - 10:38 by Robert
A collaborative research report by Amadeu Antonio Foundation, Expo Foundation and HOPE Not Hate, 2021. Edited by Joe Mulhall.

At the Council of Christians and Jews we are deeply concerned about all forms of antisemitism in our society. Recently, we have been troubled by antisemitic incidents targeting Jewish communities in Manchester, Gateshead, and London, and students at the University of Warwick, including abusive comments from Zoom gate-crashers and antisemitic graffiti in Gateshead and North London. In addition to verbal and physical acts of antisemitism such as these, antisemitism online is an increasingly worrying trend.

The publication of ‘Antisemitism in the Digital Age’ by HOPE Not Hate emphasises the all-too-apparent reality that antisemitism is a highly tangible, continuing issue in European society. Importantly, however, this report recognises the ways in which antisemitism is changing in the digital age. By focussing on online instances of hate, it notes how antisemitism features in online conspiracy ideologies and how it relates to online terrorism, hate speech, and Holocaust denial. It also observes the changing language and rhetoric of antisemitism on online platforms. ‘Antisemitism in the Digital Age’ focusses on nine major social media platforms, including the most popular sites such as Facebook, YouTube, and Twitter, as well as platforms with increasing reach such as Parler, Telegram, and 4chan /pol/. Worryingly, antisemitism was easily found on all of these sites. Understanding where this antisemitism comes from, what it looks like, and the impact it has on contemporary culture is a task for which this vital report sets the groundwork.

Understanding how social media platforms operate is an important first step in recognising the danger of antisemitism online. Social media sites tailor the way users connect with content through algorithms that recognise what we engage with and suggest similar content for us to view. On Facebook, this might be as simple as recommending friends based on who we have already connected with, or highlighting groups similar to our interests. YouTube recommends videos to ‘play next’, and Instagram shows users posts similar to those they have already liked. Many platforms also target their advertising according to what we engage with through our searches, posts, and content. These algorithms help us to connect with things we like and enjoy.

However, the danger with this system is that our online engagement becomes narrow; we begin to see only things that relate to what we have already engaged with. Political content on these platforms, for example, may begin to reflect our own views, creating a sort of echo chamber. There is a further danger, outlined in this report, that a person may be exposed to ever more harmful content as these algorithms recommend similar groups, posts, and videos that become progressively more extreme. For example, if a person begins to watch videos about Covid-19 conspiracy theories, they may be recommended similar videos about other fallacious ideologies. If they follow this rabbit-warren of suggestions, they may also come across conspiracy theories relating to Holocaust denial and antisemitism, or content which encourages hateful action based on these ideologies. For someone particularly susceptible to conspiracy theories, this algorithm-aided progression poses a considerable danger.

Most social media platforms are keenly aware of the dangers this structure presents, and therefore moderate posts so that harmful content is not recommended or, in some cases, is removed. In relation to antisemitism specifically, the amount of overt and extreme antisemitism on a given platform depends on the level of moderation on that site: unsurprisingly, the higher the moderation, the fewer instances of extreme antisemitism. However, the effectiveness of moderation varies between platforms, and several sites are intentionally left completely unmoderated. This report notes that on unmoderated sites, antisemitism has escalated to incitement of violence and the advocacy of terrorism. For example, platforms such as 4chan /pol/ have been used to pre-announce terror attacks and to share terrorist manifestos.

Even on moderated sites, antisemitic content can still be found relatively easily, largely because those posting it understand how the moderation works and have developed tactics to circumvent it. One of the main ways people avoid having their content removed is through coded language. Because much harmful content is identified and flagged through key terms (racist or antisemitic words, for example), coded language is used to avoid detection by moderators. Slang words are used, or asterisks are inserted into the middle of words to avoid them being flagged. In some instances, rhyming or similar-sounding words are even employed (the word ‘juice’, for example, is noted in the report as a coded reference to Jews). An extensive glossary of these coded words and terms is included in the report.

In addition to noting the broad problems associated with antisemitic content on social media platforms, ‘Antisemitism in the Digital Age’ provides in-depth observations of particular areas of concern. In this light, conspiracy theories remain one of the biggest problems in relation to antisemitism. ‘Antisemitism in the Digital Age’ notes that during the Covid-19 pandemic there has been an increase in conspiracy theories, especially around the pandemic itself and related issues such as vaccination programmes. Antisemitism can be found within some of these claims which, through the use of allusion, suggestion, and toxic insinuation, can escape moderation and permeate the discourse.

Worryingly, this report also notes the changing nature of Holocaust denial in the digital age, observing how it has converged with a culture of internet ‘trolling’ (rhetoric which is intentionally provocative and controversial). In the past, Holocaust deniers sought credibility by crafting detailed (but fallacious) narratives with an air of historical or scientific enquiry. In contrast, the contemporary culture of online trolling casually references Holocaust denial with the aim of shocking and spreading hate, as the report notes:

‘For much of the contemporary far right who have grown up with an internet culture that has cultivated an extreme, contrarian attitude against liberal conventions and social taboos – this event of unique and tremendous historical, moral and political significance has become a mere object of their once “trolling” ridicule and now, increasingly, hateful vitriol, and its main victims – the Jewish people – their fundamental target.’ (29).

In addition to the systematic persecution of the Jews becoming a subject for internet trolls, Holocaust denial online also appears as a historicising of the event into irrelevance. Deniers still claim it did not happen, but on social media they increasingly emphasise its supposed irrelevance for today, and a ‘why-are-we-still-talking-about-it’ attitude prevails. As the number of survivors decreases and we move further from the historical event itself, this growing trivialisation of the Holocaust is a particular concern for younger generations who are engaging with such online antisemitic content.

At the Council of Christians and Jews, we are particularly concerned by Christian involvement in antisemitism. Whilst this is not a primary focus of the report, instances of Christian antisemitism are noted. What the authors describe as ‘radical Christian anti-Jewish rhetoric’ was found on the platform Parler, as well as radical Christian antisemitism on YouTube. Additionally, the report's detailed glossary of coded language and antisemitic slang draws attention to particular terms and labels which have Christian roots. This is perhaps a specific area where more research, and indeed education and correction, can take place.

This report should not be needed. Antisemitism should not be an issue in our society but, sadly, it continues to permeate our culture. The nature of antisemitic rhetoric, as this report notes, is changing. Its presence on social media platforms has resulted in new expressions of hatred; a new language of coded messages is evolving, and a trivialising, trolling attitude towards the Holocaust is taking root. It is vital that this online antisemitism is recognised and that steps are taken to combat its harmful influence. This report from HOPE Not Hate is a vital step in both recognising antisemitism in the digital age and working towards its eradication.