Reclaiming Visibility, Surviving Biased Moderation

ContentChecker is a privacy-first, open-source AI tool that helps you scan and score your images before posting them, so you can understand how automated moderation systems might classify your content.


🚨 My friend lost her income overnight. Again.

A friend of mine—let’s call her Nyah—is a brilliant erotic performer and educator. Her content isn’t just provocative; it’s artful, healing, and rooted in deep knowledge of her body and desire. Her platform? A mix of personal erotic expression and public sex education—designed to remind people of the power of pleasure in a world that keeps trying to take it away. But every time she shares a new post—whether it’s an intimate boudoir image or a gentle PSA about pleasure—she’s gambling with her livelihood.

One morning, she logged in to discover another notice. Her platform was gone. No warning. No appeal. Her reach gutted, her visibility suppressed. Her income? Gone. All because an automated system decided her body—or her joy—was too much.

Nyah’s story is not unique. And that’s the problem.

In a digital world where creators are forced to walk a fine line between expression and erasure, modern platforms like Instagram, TikTok, and Facebook use aggressive AI-based moderation to police content. While this technology may seem neutral, it’s far from fair.

These algorithms regularly:

  • Mislabel queer, Black, and fat bodies as explicit—even when they’re clothed

  • Suppress erotic expression that exists outside the puritanical, patriarchal or heteronormative gaze

  • Punish sex workers and educators, even when they follow platform guidelines

  • Destroy livelihoods with no recourse, no transparency, no justice

In Weapons of Math Destruction, Cathy O’Neil explains how opaque, unaccountable algorithms can destroy lives by quietly encoding systemic racism, fatphobia, ableism, and anti-sex-worker stigma into everyday tech.

These aren’t just glitches. These are structural injustices disguised as neutral systems.

⚠️ Classification Is Never Neutral

“When your content is suppressed, your voice is erased.”

Whether you’re posting art, education, activism, or erotic self-expression, automated moderation systems decide if you’re visible—or if you’re silenced. They erase bodies, ideas, and labor they don’t understand. And the odds? Stacked against you.

Because your body doesn’t fit Eurocentric standards.
Because pleasure is politicized.
Because someone coded your truth as “too much.”

That’s why I built ContentChecker—a privacy-first, open-source AI tool that helps you scan and score your images before posting them, so you can understand how automated moderation systems might classify your content.

Powered by the Yahoo Open-NSFW model, ContentChecker returns a score from 0.0 to 1.0:

  • < 0.2 – Likely Safe for Work

  • > 0.8 – High risk of NSFW flagging

  • 0.2–0.8 – Uncertain zone; use your own discretion

🔐 All analysis happens locally on your device. No uploads. No surveillance. Your images stay yours.
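As a rough sketch of how those score ranges translate into a verdict (the function name, error handling, and exact wording below are illustrative, not ContentChecker’s actual API; in a real pipeline the score would come from a local Open-NSFW implementation such as the community `opennsfw2` package):

```python
def interpret_score(score: float) -> str:
    """Map an Open-NSFW-style probability (0.0-1.0) to a human-readable verdict.

    The thresholds mirror the ranges listed above: below 0.2 is likely safe,
    above 0.8 is high risk, and everything in between is uncertain.
    This is a hypothetical sketch, not ContentChecker's actual interface.
    """
    if not 0.0 <= score <= 1.0:
        raise ValueError(f"score must be in [0.0, 1.0], got {score}")
    if score < 0.2:
        return "Likely Safe for Work"
    if score > 0.8:
        return "High risk of NSFW flagging"
    return "Uncertain zone; use your own discretion"


if __name__ == "__main__":
    # Example: check a few hypothetical model outputs before posting.
    for s in (0.05, 0.50, 0.93):
        print(f"{s:.2f} -> {interpret_score(s)}")
```

Note that the boundary values 0.2 and 0.8 fall into the uncertain zone, since the safe and high-risk buckets are defined with strict inequalities.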

Nyah’s story isn’t about a glitch in the system— it is the system. As Virginia Eubanks outlines in Automating Inequality, algorithmic tools are often deployed in ways that systematically strip power from the poor, the marginalized, and the “undesirable.”

In the digital world, erotic creators like Nyah live under algorithmic redlining—cut off from tools, income, and connection by models that treat bodies and desires as threats.

ContentChecker is a proof of concept for something different.

✅ Helps creators stay one step ahead of the algorithm
✅ Gives you clarity and peace of mind when you post
✅ Exposes how your content might be misread by content moderation systems
✅ Empowers creators to make informed decisions before posting
✅ Helps avoid shadowbans, deplatforming, and income loss
✅ Protects your privacy by running entirely on your device

ContentChecker offers clarity before catastrophe.

ContentChecker isn’t just about protecting your next post. It’s a call to build better systems—ones that reflect us, not erase us. This is not about playing by the rules. It’s about understanding the rules, seeing the game, and reclaiming power in spaces designed to erase you. This isn’t a fix for broken systems. It’s a survival tool, a stopgap, and a proof of concept for something bigger.

ContentChecker is only the beginning. Imagine tools that:

  • Learn from your edge cases
  • Allow community-contributed corrections
  • Empower creators instead of policing them
  • Let communities crowdsource misclassifications to improve accuracy and inclusivity
  • Let creators train their own moderation models based on their content and values

And imagine a world where:

  • Tech is built with marginalized people—not just on top of them
  • Erotic labor is treated as legitimate work, not a policy violation

🚨 Your Post Shouldn’t Cost You Your Livelihood 🚨

🔗 Try It, Fork It, Share It

This is an open-source tool for creators, coders, and communities looking to survive—and resist—the algorithm.

Use it. Tweak it. Build something better.

Because your body, your voice, your truth and your labor deserve more than silence.
Because no one should have to choose between survival and self-expression. Because pleasure is not a problem.
Because bodies are not dangerous.
Because your livelihood should never hinge on the whims of a black-box algorithm.

github repo
link to demo
got a problem?

🧠 Further Reading / Context

  • Algorithms of Oppression by Safiya Umoja Noble

  • Weapons of Math Destruction by Cathy O’Neil

  • Automating Inequality by Virginia Eubanks


P.S.
I’m still looking for work!

If you know anyone seeking a software engineer with a deep love for innovation, tinkering, and purpose-driven technology—please connect us.

I believe code is more than logic—it’s magic. A language that, when spoken fluently, can shape communities, build futures, and bring ideas to life.

My work is:

  • Rooted in logic
  • Fueled by passion
  • Always connected to community

I build with intention. I innovate with care. I’m deeply grounded in the impact of my craft.

Code is a tool, but people are the purpose.
This is my story of how I fell in love with my craft.

