New definition of hate to be included in Liberal bill that might also revive contentious hate speech law
Anja Karadeglija
POSTMEDIA
3/3/2021
Federal legislation expected to be tabled within weeks will see a new statutory definition of hate and could also see the reincarnation of a controversial hate speech law.
© Provided by National Post It’s expected that Heritage Minister Steven Guilbeault could table the new hate speech legislation in the next few weeks.
The new definition, part of legislation aimed at tackling online hate content, will be based on previous court decisions and how the Supreme Court has defined hate, said Arif Virani, parliamentary secretary to Justice Minister David Lametti.
But the government has also not ruled out introducing a form of a controversial hate speech law that was widely criticized over free speech rights. The law, Section 13 of the Canadian Human Rights Act, was repealed in 2013 after critics said it amounted to censorship.
Section 13 was “something we consulted on,” said Virani. “We heard feedback on both sides of the ledger. That’s something we’re examining.”
Christine Van Geyn, litigation director of the Canadian Constitution Foundation, an organization devoted to defending constitutional freedoms, said the previous civil remedy under Section 13 was “widely seen as highly politicized and discretionary.”
Bringing it back is not the way to tackle online hate, she said. She’s also concerned any new regulator created under the legislation would “use their discretion in a politicized way.”
“Expanding the administrative state and creating new regulators and tribunals with large discretionary authority isn’t the right approach,” Van Geyn wrote in an email.
The bill will include the creation of a new regulator who will be responsible for enforcing the new statutory definition of hate, including requiring online platforms to take down illegal content within 24 hours.
Heritage Minister Steven Guilbeault said he expects to table the legislation in weeks.
The bill will set out a legal framework for “illegal content” covering five categories: hate speech, terrorist content, content that incites violence, child sexual exploitative content and non-consensual sharing of intimate content. The new regulator will then “develop the how” of putting that framework into practice, Guilbeault said.
The new regulator will be tasked with enforcing transparency, including about the platforms’ algorithms. It will also be responsible for enforcement, with the power to levy significant fines.
Guilbeault’s office confirmed the bill won’t expand the definition of illegal content beyond what’s already in the Criminal Code.
A spokesman for Virani said the new definition of hate will be informed by past court cases including the Supreme Court’s Saskatchewan Human Rights Commission v Whatcott decision. Whatcott drew on case law setting out a list of “hallmarks of hatred,” which, according to University of Windsor professor Richard Moon, are indicators that give reason to regard speech as hateful.
“Among the hallmarks of hatred were things like identifying the group as subhuman or animalistic, or as unclean or carrying disease or as inclined to engage in violence or things of that nature,” Moon said.
Moon said that in the 1990 Supreme Court decision R. v Keegstra, the court “talked about hate speech as speech that expressed that detestation towards the group, the vilification of the group.”
Virani held consultations last year with about 75 different groups and individuals, which included asking whether the government should bring back Section 13.
© Ian MacAlpine/Postmedia/File “Expanding the administrative state and creating new regulators and tribunals with large discretionary authority isn’t the right approach,” says Christine Van Geyn of the Canadian Constitution Foundation.
Prior to 2013, the Canadian Human Rights Tribunal could issue cease-and-desist orders and impose fines up to $10,000 in response to complaints from individuals about matters likely to expose them “to hatred or contempt” for the reason of those individuals being “identifiable on the basis of a prohibited ground of discrimination.”
The Canadian Civil Liberties Association was among the groups that opposed Section 13 and cautioned the government not to bring it back, arguing human rights tribunals aren’t an appropriate mechanism for addressing hate speech. Cara Zwibel, the director of the CCLA’s fundamental freedoms program, said in an email that having the new regulator, instead of the Canadian Human Rights Tribunal, in charge of addressing online hate does “address some of the group’s concerns.”
But she said it “would not address all of the concerns and could give rise to new ones,” including questions about due process.
“There is a general concern with regulating hate speech which is that, despite the definition that courts have developed, the concept is fundamentally subjective. This is one of the concerns that is not addressed even if a new regulatory mechanism is used,” Zwibel said.
The CCLA also told the government that it was not necessary to develop a new definition of hate. Zwibel said in an interview that having hate content codified in legislation doesn’t allow that definition to “grow and evolve.”
The concern for the CCF’s Van Geyn is that any definition of hate “will be inherently subjective and prone to an expansive interpretation that could improperly capture all kinds of expression that the government has no business censoring.”
But Evan Balgord, the executive director of the Canadian Anti-Hate Network, said in an interview there is a need for a civil remedy because the current system, in which hate speech is dealt with through the Criminal Code, isn’t working. So few people end up charged with hate speech that the prospect of criminal charges doesn’t act as a deterrent, he said.
Balgord said the definition of hate found in previous court decisions offers a good basis for a regulator to be able to tackle the problem of online hate, as long as the regulator is able to keep up to date with constantly changing code words used by neo-Nazi and other hate groups.
“We need a good solution now, we don’t need a perfect solution in two years,” he said. Sticking to what the court has previously defined as hate allows the government to defend itself against criticism that the government is attacking free speech, Balgord said.
Virani acknowledged that when it comes to the definition, “there may be concerns that it is not broad enough for some people. There may be concerns that it is too broad for others.”
The role of the new regulator will be to give clarity on the new definition to the online companies and ensure they are taking appropriate action, Virani said.
The new legislation will also force online platforms to take down illegal content within 24 hours. Virani couldn’t specify whether the regulator will be the one determining what content has to be taken down within the 24 hours, or if that initial decision will be left to the companies, with the regulator responsible for complaints after the fact. He said it will be a combination of those scenarios, but there’s no “one straight answer to that question. It’s still being worked out.”
That 24-hour takedown period could also create a potential for a constitutional challenge. If platforms have to make the call about whether something is illegal, and they face penalties for failing to do so, that creates the incentive to err on the side of taking content down, said Vivek Krishnamurthy, director of the University of Ottawa’s Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (CIPPIC).
“And that can have an impact on free expression,” he said. “We know that some things are very clearly illegal, but there’s a lot of grey.”
A spokesperson for Google also flagged the same issue, saying the company remains “uneasy” about any law that “prioritizes speed where decisions require careful consideration, which in turn will likely lead to over-removal of legitimate expression and have a negative impact on diversity of voices.”
Balgord said it’s not feasible for the regulator to decide what is or is not hate content before the platforms have to take it down, given the sheer volume of hate content that is posted online. “My question is, have they been on the internet? If you’re going to hold a little adjudication every time there is hate speech, it’s impossible… there’s simply too much of it.”
The government didn’t hold public consultations while developing the legislation, but Guilbeault conducted his own in parallel to Virani’s. He said he consulted “far and wide,” speaking to platforms, academics, researchers and not-for-profit groups that deal with issues like child sexual exploitation, Canada’s democratic institutions, freedom of speech, and “people who are very concerned with how equity seeking groups are being targeted.”
Like Virani, Guilbeault wasn’t concerned about a constitutional challenge to his new legislation.
“The beauty of living in a democratic society” is that people have the ability to challenge laws using the justice system, he said.
“I think we are working really hard to ensure that we are very on solid legal ground. Is it something that could happen? Absolutely. But I’m not worried about it.”