Twitter and Antisemitism

On October 10, 2012, hundreds of Twitter messages appeared in France containing antisemitic remarks reminiscent of Nazi propaganda. They were accompanied by photographs of concentration camp victims together with crude, facetious references to the Holocaust. These messages were posted under the hashtag #unbonjuif ("a good Jew"). The exchange became the third most popular tagged subject on the French Twitter site. The term was sardonically chosen to insinuate that "a good Jew is a dead Jew." To make the point even more forcefully, a picture of an emaciated Jewish woman who had been in a Nazi concentration camp was posted with the caption "a good Jew."

French organizations, primarily the student group L'Union des étudiants juifs de France (UEJF), SOS Racisme, and the Conseil Représentatif des Institutions juives de France (CRIF), the official body representing French Jewish communities, were alarmed by the volume of hatred in these expressions of free speech and denounced the "wave of feverish hatred."

The use of cyberspace and social networking for political purposes has become familiar. A major factor in Barack Obama's success in 2008 was his use of social media to raise funds and recruit an army of volunteers. Facebook and Twitter were significant factors in mobilizing two million people in January 2011 to demonstrate in Tahrir Square in Cairo, the peaceful revolution that ended the Mubarak regime.

The extent of the antisemitic messages exchanged in France is disturbing in itself, but it also exemplifies broader problems in the use of cyberspace and social networking. The French Twitter site offered an opportunity to make rancorous pronouncements anonymously. Social networking represents a new collective phenomenon: the ability to express hate without fear of retribution. No official or unofficial body appeared capable of, or willing to, monitor loathsome expressions of this kind.

An immediate problem is whether Twitter is to be regarded as a private company responsible for the contents of its own website, or whether official bodies can hold it responsible for monitoring that content because it occupies public space. Twitter has always maintained that it does not mediate content. The general problem is whether the law of a country can keep pace with technological developments, especially in unmoderated discussion forums.

In its starkest form the dilemma is whether hate speech should be protected as free speech, or whether its expression is not only undesirable and morally offensive but also likely to lead to illegal conduct and therefore ought to be stopped. Have social networks become a particular problem for young people, their main users, who may treat them as a game or a venue for irresponsible exchanges?

The French groups that protested against the uncontrolled abusive messages on Twitter argued that hateful, racist, or antisemitic remarks should not be made available to everyone using the social network. There are legal as well as ethical issues in the question of whether Twitter should be held responsible for what was transmitted, especially if it can be considered hate speech. Legal restrictions on hate speech directed at minority groups, especially Jews, have already been imposed in a number of European countries in reaction to Nazi propaganda and the Holocaust.

At first Twitter, a global corporation, refused to delete the tweets, arguing that control of this kind must emanate from an official national source. Under pressure, the company changed its mind in France and agreed to erase the tweets that were obviously offensive. Twitter executives recognized that the company should comply with the local laws of the countries in which it operated; accounts could therefore be blocked in individual countries if their content violated local law. Yet Twitter's decision to suspend or eliminate improper tweets was viewed by some as an attack on freedom of expression.

Twitter has also acted in two other cases. In Germany it blocked the account of a neo-Nazi group called Better Hanover because of tweets involving hate speech and physical threats against opposing politicians. Twitter also suspended the account of Nick Griffin, a British member of the European Parliament and chairman of the extreme right-wing British National Party, whose tweets attacked efforts to end discrimination against gays.

Only after the French Jewish student movement threatened legal action did Twitter agree to remove the offensive antisemitic tweets. A further and more problematic issue was whether Twitter had to reveal the identities of those who had anonymously used the antisemitic hashtag. The company refused to do so, but in January 2013 a French court ruled that it must. Twitter will now have to disclose the identities of the tweeters in France. Critics of the ruling argue that it will deter expression by those who, for one reason or another, want to remain anonymous and fear exposure.

The controversial and difficult problem of the limits of free speech in this era, particularly in relation to antisemitism, is cultural as well as legal. In the United States the First Amendment to the Constitution is likely to protect most speech regarded as offensive to groups in the country. In European countries the memory of, and belated response to, Nazi behavior has led to greater restriction of certain forms of speech directed against minorities.

The reality must be faced. Twitter and other social networks such as Facebook and YouTube, though only about ten years old, are now major vehicles for expressing hatred against others and perhaps inducing violence. The law must catch up with this usage. The law in France, as elsewhere, may not yet be entirely clear on whether social networks should be considered private or public, but it is clear that the harm done by hate speech and expressions of antisemitism affects the whole social order. It is no denigration of the value and importance of free speech in democratic societies to suggest that the transmission of messages of hatred needs some form of monitoring.
