Facebook’s Content Moderation Changes: Impact on LGBTQ+ People
On January 7, 2025, Meta CEO Mark Zuckerberg announced major changes to content moderation across Facebook, Instagram, and Threads. These updates include:
- Removing third-party fact-checkers
- Loosening restrictions on hate speech and misinformation
- Replacing fact-checkers with a “Community Notes” system
- Allowing more religious-based speech, even if it targets LGBTQ+ users
Meta says these changes are meant to support free speech, but LGBTQ+ advocacy groups warn they could increase hate speech and misinformation (The Verge).
Specific Changes Affecting LGBTQ+ Users
One of the most controversial changes allows users to describe LGBTQ+ people as “mentally ill” without violating platform policies. Previously, Meta classified such statements as hate speech.
- Meta no longer bans posts that claim LGBTQ+ identities are a mental disorder or abnormal.
- Hate speech policies have been relaxed, particularly for religious-based comments.
LGBTQ+ advocacy groups, including GLAAD, have condemned these changes, saying they make Meta’s platforms more hostile to LGBTQ+ users (Them.us).
The Role of “Community Notes” Instead of Fact-Checking
Meta has replaced its fact-checking teams with a Community Notes system, similar to the model used on X (formerly Twitter).
- Users can add context to posts they believe are misleading.
- Meta argues this reduces bias in moderation decisions.
- Critics say crowdsourced moderation can be unreliable and biased, especially against marginalized groups (Them.us).
Comparing Meta’s Policies to Other Platforms
| Platform | Fact-Checking | Hate Speech Rules |
|---|---|---|
| Facebook (Meta) | User-driven “Community Notes” | More relaxed |
| Twitter (X) | User-driven “Community Notes” | Moderate |
| YouTube | Internal fact-checking team | Moderate |
| TikTok | Strict internal review | Strictest |
Meta’s shift away from professional fact-checking sets it apart from most competitors. While platforms like YouTube and TikTok continue to enforce stricter internal moderation, Meta is moving toward user-driven oversight, following the model X adopted (Wall Street Journal).
Concerns From LGBTQ+ Advocacy Groups
LGBTQ+ organizations have expressed strong opposition to these changes.
- GLAAD warns that relaxed policies could increase harassment and hate speech.
- The removal of fact-checkers raises concerns about unchecked misinformation.
- Users can now openly claim LGBTQ+ identities are “mental illnesses” without consequence.
Advocates call for Meta to reinstate protections for marginalized communities (The Verge).
Legal and Safety Implications
Meta’s policy changes raise legal and ethical concerns about user safety.
- Possible violations of anti-discrimination laws in some countries.
- Increased risk of lawsuits over hate speech and misinformation.
- Potential for new regulations as lawmakers scrutinize these changes.
The shift to user-driven moderation means Meta assumes less direct responsibility for harmful content, but this approach may not hold up legally in all jurisdictions.
Best Practices for LGBTQ+ Content Moderation
To create a safer online environment, platforms should:
- Train moderators on LGBTQ+ issues and terminology
- Include LGBTQ+ voices in policy development
- Prohibit slurs and misgendering
- Protect coming-out stories and transition photos
- Work with LGBTQ+ organizations to audit policies regularly
Crowdsourced moderation systems like Community Notes can be risky when coordinated or biased user groups are able to control which notes appear. Transparency and accountability are essential safeguards.
Final Thoughts
Meta’s new policies represent a significant shift in how content is moderated on social media. While the company says these changes promote free speech, LGBTQ+ advocacy groups warn they could lead to more harassment and misinformation.
The long-term impact remains unclear, but LGBTQ+ users and allies should stay informed, use reporting tools, and push for stronger protections.
What do you think about these changes? Let us know in the comments.
Sources
- Meta Will Now Allow Users to Call LGBTQ+ People Mentally Ill – Them.us
- Facebook’s New Hate Speech Policies – The Verge
- Social Media Companies Are Pulling Back on Moderation – Wall Street Journal