What happens when you interact with hateful content?
Over the last year and a half, a man called Rasmus Paludan has toured Denmark promoting extreme right-wing ideas against Islam and migrants. Paludan is the leader of the hardline anti-immigration party "Stram Kurs", and he has become a popular YouTube phenomenon among children and young people. He tours the country visiting so-called "socially vulnerable areas" (areas with high crime rates and social exclusion), and each visit is filmed and posted on his YouTube channel. In these videos, he refers to Islam as a "culture of losers" and provocatively links Muslims with homosexuals, thus offending both groups.
Is it illegal?
Speaking in a derogatory manner about certain groups of people is not always a legal issue. Sometimes it is a moral issue. In this case, it is a bit of both.
YouTube once removed one of Rasmus Paludan's videos - in which he called a 14-year-old boy a "criminal loser" - on the basis of its policy on harassment and bullying. Paludan has also been charged with defamation and has been accused numerous times of violating Danish anti-racism laws. Nevertheless, most commentators agree that his public statements remain within the legal boundaries of freedom of speech.
The vast majority of people in Denmark - it seems - agree that Rasmus Paludan is an extreme case with very little actual support. But the problem is not the number of supporters he has. It is how this type of hate speech spreads among children and youngsters - and how it wrongfully normalises the dissemination of mean generalisations about certain groups.
Hate speech is being normalised and YouTube may be helping
The Danish journalist Frederik Kulager (Zetland) recently stated that only 14% of the views on Rasmus Paludan's YouTube channel come from people actually searching for his videos. The rest come from "next video" suggestions and the "hot videos" section of material recommended by YouTube. This calls for a better understanding of algorithms and how messages spread online. Research carried out by Algotransparency.org last year suggests that YouTube's algorithm "systematically amplifies videos that are divisive, sensational and conspiratorial."
You may watch Paludan's videos intending to laugh at and mock them, but YouTube cannot tell the difference. It will simply assume that you want to watch more of the same material, and it may even recommend that your friends do the same.
When teaching teenagers resilience against hate speech, it is very difficult to use concrete examples from Rasmus Paludan's videos, because doing so can be understood as inciting or sharing hate speech. (Un)fortunately, in training sessions with young people, we often do not even have to share examples, because everyone already knows the videos in question. When asked about Paludan, almost every pupil aged 10 and up has seen numerous videos of him - both the original videos and mocking videos, diss tracks and memes. To a certain degree, it could be argued that it is positive that children are using the same media tools to mock Rasmus Paludan, but the fact remains that the original material is still being distributed.
In one extreme case, we saw pupils in Denmark giving a Nazi salute when we talked about Paludan in a media literacy lecture. When confronted, they argued that it was meant to be "funny and sarcastic", to demonstrate how ridiculous Rasmus Paludan is. This highlights the difficulty of determining intent. To the Jewish girl in that classroom, it may not have been funny, and the intention may not have been clear at all.
Rasmus Paludan is not necessarily an example of a person violating Danish laws, but rather an example of a moral defeat in terms of how we relate to certain groups of people. It shows a need to discuss hate speech with children and youngsters today, to make them understand that certain characteristics of people must be protected from attack. We must also enhance their ability to recognise and regulate their own emotions, thoughts and behaviours, and to take the perspective of and empathise with others, including those from diverse backgrounds and cultures.
Building upon a Social and Emotional Learning approach, the SELMA Toolkit will address these issues. It will be a set of principles, methods and activities enabling educators and professionals to work on online hate speech with 11- to 16-year-old teenagers. Teachers will be able to navigate through 100 activities addressing a wide range of thematic questions, such as:
- What is online hate speech?
- How does hate speech make me feel?
- What is my role and what can I do?
In addition, it will empower young people to take action and become agents of change in their communities.