A panel of government experts proposes to regulate private communications through “online harm” legislation


Many experts on the panel, hand-picked by the federal government to lay the groundwork for a future “online harms” bill, have stated that private communications should be included in the regulatory framework.

“Some experts emphasized that a high volume of harmful content, such as terrorist content and child pornography, is often shared through private communications rather than public forums, and that excluding these types of communications would leave much harmful content unaddressed,” reads a government summary of the discussions held by the expert panel.

In March, Heritage Minister Pablo Rodriguez announced that 12 experts would take part in discussions with officials from various agencies to advise on the drafting of an internet content control bill.

Ten sessions were held from April to June, with government-prepared summaries posted online.

Experts pointed out that terrorist content and child exploitation need to be countered in private communications, but in another session they identified “disinformation” as one of “the most imminent and harmful forms of malicious behaviour online.”

Despite calling on the government to tackle “disinformation,” experts said the issue would be difficult to define in law. They also said the government should not be the one to decide what is true or false.

Regarding the regulation of private communications, experts suggested that platforms use tools that “mitigate risks before they occur” or provide reporting mechanisms to address “harmful content.”

“Regulations would therefore not need to impose proactive monitoring obligations on platforms to monitor private communications in order to mitigate harm,” the panel said.

“Legal but harmful”

While freedom of expression is protected in Canada and existing law prohibits hateful expression, the panel considered ways to counter content that is legal but could be considered “harmful” online.

“Some experts argued that there is a need to balance Charter rights while also dealing with legal but harmful content,” reads one session summary.

“It was also stated that while legal but harmful content cannot legally be banned, it can be regulated by means other than removal.”

Some experts argued that the law should be left ambiguous to give platforms an incentive to “do more to comply” with content regulation, while others insisted this would give platforms too much latitude.

Despite the disagreements, Heritage Canada says there was consensus among the experts that a regulatory system is needed to tackle “harmful content” online.

Experts said the way governments communicate their efforts to regulate content is “important,” since “such a framework can contribute to, erode, or strengthen public confidence in the government and other democratic institutions.”

On platform compliance, experts said that “public shame or profit incentives” would be “key to a successful framework.”

Politicization

A previous analysis of the 12-member panel showed that most of its experts share the government’s ideology on issues such as COVID-19 measures, expanded vaccination mandates, the labelling of alternative viewpoints as “conspiracies,” and the recent freedom-themed protests.

Some experts warned during the discussions that any law introduced to regulate content should not be open to misuse by future governments.

The content they are seeking to regulate includes misleading political communications as well as false advertising and propaganda.

Discussion summaries are often infused with progressive jargon.

“Many experts stated that it is important to find a way to define harmful content in a manner that reflects lived experience and intersectionality,” said one summary.

“They explained that many online harms are exemplified by issues such as colonialism and misogyny, and that the regulatory framework needs to be attentive to these factors.”

Some experts expressed concern that platforms could crack down excessively, for instance by “labelling activist content, such as Black Lives Matter campaign material, as radical content.”

Commissioner

Another area of agreement among the experts was the need to create a digital safety commissioner.

They said the commissioner should have the authority to audit and inspect platforms, administer fines, and launch investigations.

However, there was disagreement over the extent of the powers that should be given to this new position. Some experts stated that “teeth” are needed to enforce compliance.

The idea of creating a digital safety commissioner was already proposed by the Liberals last year, and it features in a technical paper published by Heritage Canada in April.

Some experts said that, to complement the new commissioner’s position, a “cyber judge” is needed to determine the legality of content posted online, noting that platforms lack the legitimacy to make those decisions.

Noé Chartier

Noé Chartier is a Montreal-based reporter with The Epoch Times. Twitter: @NChartierET Gettr: @nchartieret
