Meta Ends Fact-Checking, Introduces Community-Based Moderation

Freedom of Speech on Social Media: A Shift Towards Open Discussion

Freedom of speech online has always been a priority for Meta. With billions of people expressing themselves, social media platforms have become a space for diverse opinions—both positive and controversial.

In 2019, Mark Zuckerberg warned that restricting speech, even with good intentions, often strengthens existing power structures instead of giving a voice to ordinary people. “Some people believe that more voices drive division rather than bring us together. I think that’s dangerous,” he said.

Over time, Meta implemented strict content moderation rules, but they went too far. Too much content was censored, many users were unfairly placed in “Facebook jail,” and responses to these mistakes were slow.

Now, Meta is shifting its approach—fewer restrictions and more room for open discussion. The platforms will return to their original goal: allowing people to express themselves freely.

Fact-Checking & Freedom of Speech Online: The Shift to Community Notes

When Meta launched its independent fact-checking program in 2016, the goal was not to determine what is true. Instead, the company entrusted external organizations to provide more information about viral hoaxes and help users evaluate what they read.

However, this approach did not work as expected, especially in the U.S. Experts, like everyone else, have their own biases, which influenced the selection and verification of content. Many legitimate political discussions were flagged and restricted, turning the program into a tool of censorship rather than a source of information.

As a result, Meta is changing its strategy and introducing Community Notes, similar to the system used on X. This new approach allows users to add context to posts based on a consensus of diverse perspectives, reducing the risk of biased evaluations.

Meta will not interfere with Community Notes or decide which ones appear. Users on Facebook, Instagram, and Threads can participate and contribute. The program will launch first in the U.S. and gradually expand.

Content Moderation Under Scrutiny

Over the years, Meta’s content moderation systems have become increasingly complex and overly restrictive. This excessive regulation has stifled legitimate political debates and censored harmless content.

In December 2024 alone, millions of posts were removed daily. Although this represents less than 1% of total content, Meta estimates that 10–20% of these removals may have been mistakes. To address this, Meta plans to increase transparency by regularly publishing statistics on incorrect content removals.

Meta is also lifting certain restrictions on topics such as immigration and gender identity, which are often at the center of public debate. Enforcement policies will be adjusted as well: automated systems will focus only on illegal and high-severity content, such as terrorism, child exploitation, drugs, and fraud. For less severe violations, enforcement will rely on user reports rather than automatic intervention.

To reduce errors in content moderation, Meta is introducing multi-step verification and using AI-powered second opinions before making enforcement decisions. The account recovery process is also being improved, including the testing of facial recognition technology for identity verification.

These changes aim to ensure greater freedom of speech and a fairer approach to content moderation across Meta’s platforms.

Political Content Based on User Preferences

Since 2021, Meta has limited the visibility of political and civic content based on user feedback. However, this approach was too rigid, so it is now being adjusted—allowing users to control how much political content they see.

Meta has tested personalized content delivery and will now treat posts from followed pages and people on Facebook like any other content. The ranking system will consider both explicit signals (such as likes) and implicit signals (such as time spent viewing a post). Based on these interactions, Meta will recommend more political content to those who are interested, while also providing additional control over content visibility.

These changes are part of Meta’s commitment to free expression, a principle Mark Zuckerberg championed in his Georgetown speech. The goal is to ensure that its platforms remain spaces for open discussion, allowing people to share their views without unnecessary restrictions.

[Infographic: changes to freedom of speech online policies. Source: Unsplash]


Sources: Meta Newsroom, NBC News

January 2025