The Complexity of Criminalizing Speech That Can Spread Hatred



Commentary

Speech in Canada has long been subject to reasonable restrictions. It is illegal to incite violence against individuals or groups. You cannot encourage others to commit criminal acts. Nor, as the old cliché goes, can you yell "fire" in a crowded theatre. People can be held criminally liable when the path from their speech to direct harm is clearly established. Things get complicated, however, when you try to criminalize speech that can spread hatred.

Near the end of the spring parliamentary session, the Liberal government introduced Bill C-36. The bill would amend both the Criminal Code and human rights law to include provisions targeting online hate speech. The Liberals have signalled their intention to regulate online speech through bills C-10 and C-36, which are currently before the Senate.

A federal election is likely this fall, and if so, these bills will die on the order paper. That doesn't mean legislative efforts to control speech will end, though; it means they will become campaign planks instead.

Defending freedom of speech can be a perilous business, especially during an election. Proponents of unfettered speech are often unfairly accused of supporting the spread of hatred itself. Defending a person's right to say something offensive is not the same as endorsing what was actually said, but when political brinkmanship is involved, that line is often blurred. This puts supporters of free speech on the defensive and makes them hesitant to wade into the issue at all. It makes an election period the worst possible time to take up a matter as nuanced as speech regulation.

Bill C-36 is essentially a reincarnation of what was once Section 13 of the Canadian Human Rights Act. Section 13 referred to hateful expression as that which is "likely to expose one or more people to hatred or contempt." The ambiguity and subjectivity of that definition led to numerous proceedings against journalists and publications, including Maclean's magazine, for publishing content that some people disliked. All complaints against Maclean's were eventually dismissed, but the process was costly for the magazine and had a chilling effect on journalists and publishers.

Section 13 of the Human Rights Act was repealed by the Harper government in 2013. The definition of hatred had proven too subjective to apply to the restriction of speech. Complaints were common, but convictions were rare, and the legislation became indefensible. Bill C-36 uses the phrase "likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination." If anything, C-36's definition of hatred is even more ambiguous than Section 13's. It may well have a chilling effect on speech, but in practice it will be no more enforceable. The use of the word "likely" in the legislation creates loopholes that reasonable judges cannot ignore.

When people cross the line from offensive content into harassment, we already have the legal means to deal with them. In May 2019, controversial former Calgary mayoral candidate Kevin Johnston was ordered by a civil court to pay $2.5 million over a campaign of harassment against a Muslim man in Ontario. Johnston was later jailed for posting online threats against health-care workers in Alberta. No new laws were needed to deal with Johnston, nor will they be needed when others like him surface online.

Modern society is constantly evolving. Public expressions of hatred toward people on the basis of race, sexual orientation, or religion have become socially unacceptable. Those who cling to prejudice against identifiable groups quickly find themselves ostracized by their peers. Social pressure is far more effective than legal threats in the fight against hatred. Yes, there are still vocal, hateful people, and there probably always will be. But such people are becoming a shrinking minority, and not because they fear legal repercussions.

There is no shortage of malicious and offensive content on the internet. Never before has there been a medium that can anonymously expose vile content to such a potentially large audience. Many nasty people use online platforms to spread hatred against groups and individuals. But should we try to regulate this, and could we even if we wanted to?

The regulation of hate speech will be debated this fall, whether in Parliament or on the campaign trail. We must try to keep the discussion of this issue thoughtful and rational, as difficult as that can be. Theatrics and political posturing must not be allowed to influence something as critical as freedom of speech.

Cory Morgan is a Calgary-based columnist and business owner.

The views expressed in this article are those of the author and do not necessarily reflect the views of The Epoch Times.
