Meta has outlined extensive plans to limit the spread of misinformation and bolster "election integrity" in the upcoming Australian federal election by expanding its fact-checking network.
According to Josh Machin, Head of Public Policy at Meta Australia, Facebook and Instagram's parent company has invested considerable resources in working with local groups to put together a comprehensive set of election integrity measures, the company said in a press release.
As part of the tech giant's plans, Meta has expanded its local third-party fact-checking network to include RMIT FactLab and Australian Associated Press (AAP), in addition to its existing partner Agence France-Presse.
Meta is also providing additional funding to expand the fact-checkers' capacity to rate and review content as the election approaches.
"We are also providing a one-off grant to all of our fact-checkers to boost their capacity for the election," he said. "Our fact-checkers work to reduce the spread of false information across Meta's services, because when they rate something as false we significantly reduce its distribution, so fewer people see it. We also notify people who try to share content that has been rated false and add a warning label with a link to the debunking article."
In comments obtained by ZDNet, Machin confirmed that claims made by politicians are not fact-checked.
"Politicians' speech is already highly scrutinized," he said.
"It is scrutinized not only by [journalists], but also by scholars, experts, and their political opponents, who are in a pretty good position to push back or show that they don't believe something is right if they think they have been mischaracterized."
In addition to removing and limiting the reach of "harmful or misleading election misinformation," Machin said Australians needed to be informed about how they could "make informed decisions about what to read, trust, and share."
To facilitate this, Meta has partnered with misinformation monitoring organization First Draft to develop a media literacy program across tech platforms, in which creators and influencers share tips for spotting "false news."
Meta is also working with AAP on its "Check The Fact" campaign, a series of short videos translated into Vietnamese, Simplified Chinese, and Arabic.
"These are the three largest non-English speaking communities in Australia, and we are very conscious of the risk of potential misinformation, particularly among Chinese-speaking communities," Machin said.
Meta enforces community standards for politicians
Meta's comments on the election come after Machin was grilled by United Australia Party leader Craig Kelly MP at a hearing of the Select Committee on Social Media and Online Safety on March 2.
Kelly, whose Facebook and Instagram pages were banned by Meta in 2021 for violating the company's misinformation policies over posts about the use of ivermectin and hydroxychloroquine to treat COVID-19, asked Machin whether Meta would commit to not interfering in the Australian election by blocking, shadow banning, or removing political parties from the platform.
"We consistently apply our policies and community standards to users on the platform, whether they are individuals or public figures," Machin said.
"If any piece of content violates our community standards then, yes, we will remove it. That is a really important part of what we do to protect the safety and integrity of the campaign."
Kelly also asked whether Meta considered that enforcing its community standards could be used to undermine or threaten Australia's democracy.
"If an elected candidate or someone from a registered political party makes a particular statement, and it somehow doesn't meet your community standards, or one of your fact-checkers disagrees with it, you're saying that you will actually block, censor, shadow ban, or deplatform that candidate or party. Isn't that direct foreign interference in the Australian election campaign?" Kelly asked.
Machin responded by clarifying what happens when a fact-checker tells Facebook that a piece of content is false.
"We label it with an interstitial, so the content is only visible if people deliberately choose to click through, and we take steps to reduce its distribution: we remove it from our recommendations and show it lower in people's feeds. But I want to be clear that we do not remove content at scale based on what fact-checkers determine to be false; we remove content that violates our community standards," Machin said.
Meta to take advantage of previous election experience
Meta also said in its press release that it will draw on its experience from more than 200 elections around the world to enhance cybersecurity and combat threats such as coordinated influence operations. The company has invested A$7 billion in safety and security, Machin said.
"We have a dedicated global team that identifies and addresses election threats, including signs of coordinated inauthentic behavior across our apps, and we are coordinating with the government's Electoral Integrity Assurance Taskforce and security agencies," he said. "We've also improved our AI so that we can more effectively detect and block fake accounts, which are often behind this activity."