Publications

26 Jun 2019
Hate Speech and Incitement: From Der Stürmer to Facebook

Remarks delivered by Dr. Simon Adams, Executive Director of the Global Centre for the Responsibility to Protect, at the "Responsibility to Protect and Hate Speech" event hosted by the European Union and the Global Centre for the Responsibility to Protect on 26 June 2019 in New York.

I'd like to thank the European Union for co-hosting today's event with us, as well as the UN Secretary-General's Special Advisor on R2P, Karen Smith, for agreeing to speak today, and Sherine Tadros from Amnesty International for moderating.

Let's start at the beginning. If you go back to those two paragraphs of the 2005 World Summit Outcome Document where R2P was adopted, there is a very deliberate use of three words, set off by a pair of commas:

138. Each individual State has the responsibility to protect its populations from genocide, war crimes, ethnic cleansing and crimes against humanity. This responsibility entails the prevention of such crimes, including their incitement, through appropriate and necessary means.

The inclusion of incitement was a very conscious political decision. Hate speech and violent incitement tend to go hand in hand. And in this sense, understanding hate speech (and punishing it) has been central to attempts to understand genocide and atrocities for more than 70 years.

As far back as the Nuremberg Trials at the end of World War Two, Julius Streicher, a Nazi propagandist best known as the publisher of Der Stürmer, a popular anti-Semitic newspaper, was found guilty of crimes against humanity. The verdict against Streicher said that: "For his 25 years of speaking, writing and preaching hatred of the Jews, Streicher was widely known as 'Jew-Baiter Number One.' In his speeches and articles, week after week, month after month, he infected the German mind with the virus of anti-Semitism and incited the German people to active persecution."

Streicher was hanged in 1946. Fifty years later, during the 1994 genocide in Rwanda, the RTLM radio station played a key role in promoting hate speech against the Tutsi, inciting the genocide, and even directing the killers with the names of potential victims who had not yet been murdered. The UN Force Commander in Rwanda, Romeo Dallaire, famously requested that the US government jam the radio signal of RTLM, but he was told that it was too expensive to do so, and that it would be a violation of Rwanda's national sovereignty.

And we have seen hate speech play a crucial role in more recent atrocities committed in different parts of the world, most notably in relation to the genocide against the Rohingya in Myanmar.

Across time – and despite obvious differences between all situations – there are some common elements. Fundamentally, hate speech is about weaponizing stereotypes. One recurring theme is the regularity with which the metaphoric language of "infection" or "infestation" is deployed in relation to the target group. The Nazis called Jews parasites. The Hutu genocidaires referred to Tutsi as cockroaches. Similar language is deployed in hate speech recently directed against minorities in the Middle East, against the Rohingya, and so on, who are often called rats or a cancer.

This language is significant because the only way to get rid of an infestation of pests is to eliminate them. You don't kill half the bedbugs in your bed. You seek out and destroy them all and exterminate their presence in your house. You don't operate on part of a cancer. You surgically cut it out and ensure it can't grow back. The terrible simplicity of such metaphors – when deployed against a target group – is that you don't have to spell out the solution to the problem. Simply calling people rats, cockroaches or a cancer on humanity implies an outcome that you are slowly trying to move people towards.

Another interesting thing about hate speech is that it doesn't have to work with everybody. Hermann Goering, the second in command in the Third Reich under Hitler, reportedly saw Streicher as an embarrassment and forbade his own staff from reading Der Stürmer. There were also plenty of people inside Nazi Germany who saw Hitler's anti-Semitism as foolish and irrelevant. They were more interested in his economic policies, or in making Germany great again.

The more important question – then and now – is who does the hate speech resonate with? Der Stürmer's anti-Semitism was directed at "the base" of the NSDAP (the Nazi Party) and so-called ordinary Germans who were angry about the state of their country, and the world, and wanted someone to blame. They were the ones who were prepared to accept the race laws, the marginalization of the Jews, followed by the construction of concentration camps and eventually, gradually, the Einsatzgruppen and a "final solution."

In short, Nazi hate speech was directed at mobilizing those who voted for Hitler and now hoped that he would do what he promised with the Jews – punish them. But it also contributed to a shift in the identity of Germans. Nazi hate speech said that you could not be German and a Jew. It redefined identity and excluded Jews as "the other."

These lessons from history are worth reiterating – not just because anti-Semitic tropes and hate speech by the far right and neo-Nazis are making a disturbing comeback in Europe – but because the pattern discernible in the 1930s recurs in almost every example of mass atrocities that we have experienced since then. It is true of Cambodia. It is true of Bosnia. It is true of South Sudan or Myanmar today, where identities are contested, redefined and weaponized. And where hate speech has played a crucial role in turning the soil before the actual digging of the mass graves commences.

The irony is that sometimes hate speech is presented in altruistic terms. For example, when the genocide against the Rohingya was happening in late 2017, I would often get trolled by people from Myanmar who would send me horrible cartoons of stereotypical Muslim characters gang-raping Buddhist women. There would often be an accompanying message saying something like "how would you feel if this was your sister," and then some comment about how the Army was protecting people from the Rohingya, who were all terrorists, rapists and land thieves. The fact that 800,000 Rohingya civilians were violently displaced, that mass rape and mass killings of innocent Rohingya took place, barely registered with such people.

And so, how should we address the problem of hate speech today? There are three central elements to this question:
1. What constitutes hate speech as opposed to just "ugly speech"? Where is the threshold for public danger?
2. What is the role of technology?
3. How should we respond?

Firstly, it seems to me that in most cases the line between ugly speech (i.e. something racist someone might privately say) and actual hate speech (which incites, mobilizes and targets) is usually pretty clear. We need not get ourselves wrapped around the axle debating whether specific instances of genocide denial, or of denigrating entire categories of people on the basis of their ethnic, religious or sexual identity, are hate speech.

Hate speech is about public targeting. There can be no such thing as free speech for any aspiring genocidaire, and for those who want to see the return of the politics of the mass grave, the concentration camp or the machete. Not in Europe, not anywhere.

We need to shut down hate speech wherever and whenever it occurs. No exceptions.

Secondly, technology is just a tool. The Nazis used newspapers, the Hutu genocidaires used radio, and the Army in Myanmar used Facebook. The tool changes, but the political process is often similar.

Or in other words, technology is neutral, but those who use it are not. Governments and multilateral bodies need to increase the pressure on Facebook, Twitter, and on local governments and regional bodies, to actively stop hate speech from being propagated on all media platforms. The EU, for example, should be congratulated for its efforts to push for greater accountability in this regard.

In many developing countries where atrocities have recently occurred, Facebook was used to help construct a narrative of hate. But other platforms, like WhatsApp, were actually used to organize to act on that hate. This is one of the reasons why, following the experience of the post-election violence of 2007-2008, Kenya put in place laws to prohibit incitement across multiple platforms. And so all this talk about algorithms and the complexity of social media cannot become an excuse for inaction. The corporate sector and the governmental sector both need to uphold their responsibilities.

Thirdly, hate speech is all about forging identities amongst both the targeted group and the group that is doing the targeting. We can disrupt that.

We need to strengthen alternative identities. We need to build national, regional and global identities that embrace difference and are only intolerant of intolerance. The antidote to hate speech is speech that promotes and protects everyone's human rights.

Thank you.