3 Mar 2025

Meta ditches fact-checking program and adopts community-driven moderation

Meta’s shift to community-driven fact-checking sparks debate on misinformation control and regulatory scrutiny.

Earlier this year, Meta CEO Mark Zuckerberg announced the company would end its US fact-checking program, replacing it with a “Community Notes” system. This shift allows users to flag potentially misleading posts and provide additional context, aiming to empower the community to address misinformation directly on the platform.
With this relaxation of policy, Dunedin NZ has taken the opportunity, with a tongue-in-cheek post, to claim the title of most liveable place in New Zealand [and we think it should!].

Meta has stated that independent fact-checking units in Australia will continue their efforts. However, it noted that “before rolling out any changes to our fact-checking program outside the US, we will carefully consider our obligations in each country, including Australia.”

This change reflects a broader trend in social media content moderation, where platforms are turning to community-driven models to manage misinformation.

The Australian Communications and Media Authority (ACMA) and other regulatory bodies have been monitoring misinformation on digital platforms, with the federal government considering stronger intervention if voluntary measures prove insufficient. Considering Australia’s stance on combating digital misinformation, a move by Meta to change its approach here will likely face significant scrutiny.

Any shift in social media regulation and platform policies could directly impact your communication strategies, audience reach, and brand credibility. Here’s how to navigate these changes effectively:

Monitor regulatory developments
Keep up to date with announcements from ACMA and other authorities regarding changes to social media regulations in Australia. Understanding these shifts will help you proactively adjust your PR and communication strategies.

Strengthen internal fact-checking
Double down on your content verification processes to prevent misinformation from affecting your brand's reputation.

Leverage multiple platforms
Relying too heavily on any one platform is a risk should it change its terms and conditions or its algorithm, or should your account be blocked. Diversifying across multiple social media channels is a risk-reduction strategy.

Engage with trusted sources
Partner with reputable news organisations, industry experts and professional associations to reinforce brand messaging with credible, fact-based information.

Encourage audience fact-checking
Educate your audience on identifying misinformation and promote engagement with trusted sources. This helps build a more informed and resilient community around your brand.

Chris Hall, Chief Executive Officer, Primary Comms Group