Every dating app, social platform, and UGC marketplace faces the same ugly problem: people upload things they shouldn't.
And every engineering team has to solve it.
We just launched PixelAPI's NSFW Content Moderation API — and at $0.0005 per image, it's half the price of AWS Rekognition and Google Content Safety.
Here's what we learned building it.
The Market Reality
The big cloud players charge $0.001 per image for content moderation. AWS. Google. Microsoft. They're all roughly in the same ballpark.
That sounds cheap until you're processing 10 million images a month — and suddenly you're looking at $10,000 in moderation bills.
For startups and indie developers, that's a significant chunk of your infrastructure budget. For bigger platforms, it's table stakes — but even they want to optimize.
We already had the GPU infrastructure for PixelAPI's image editing API. Adding an NSFW classifier was a natural extension.
What We Built
The API is straightforward:
```bash
curl -X POST https://api.pixelapi.dev/v1/moderation/classify \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -F "image_urls=https://example.com/photo.jpg"
```
Response:
```json
{
  "moderation_id": "596200d8-a1cf-4e96-883b-4c22d0ad45d2",
  "credits_used": 1,
  "results": [{
    "label": "safe",
    "nsfw_score": 0.0003,
    "safe_score": 0.9997,
    "confidence": 0.9997
  }]
}
```
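Consuming that response is just JSON parsing. A minimal Python sketch using the sample response above (the `is_safe` helper and its default threshold are our own illustration, not part of the API):

```python
import json

# Sample response from the classify endpoint (copied from above).
raw = """
{
  "moderation_id": "596200d8-a1cf-4e96-883b-4c22d0ad45d2",
  "credits_used": 1,
  "results": [{
    "label": "safe",
    "nsfw_score": 0.0003,
    "safe_score": 0.9997,
    "confidence": 0.9997
  }]
}
"""

def is_safe(response_json: str, threshold: float = 0.5) -> bool:
    """Return True when every result scores below the NSFW threshold."""
    data = json.loads(response_json)
    return all(r["nsfw_score"] < threshold for r in data["results"])

print(is_safe(raw))  # True
```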
Three things we're proud of:
1. It's fast. GPU-powered classification — 50ms per image on our RTX 6000 Ada setup. Not the 2-5 seconds you'd get with a CPU-only model.
2. It returns actual scores. Not just "flagged: true/false." You get nsfw_score (0.0 to 1.0), safe_score, and an overall confidence. You decide your threshold — some apps want to auto-block at 0.5, others want to flag at 0.1 for human review.
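That threshold flexibility is easy to encode. A hedged Python sketch, where the `block_at` and `review_at` cutoffs are illustrative, matching the 0.5 and 0.1 examples above:

```python
def moderation_action(nsfw_score: float,
                      block_at: float = 0.5,
                      review_at: float = 0.1) -> str:
    """Map a raw nsfw_score to an action: auto-block above block_at,
    queue for human review above review_at, otherwise allow."""
    if nsfw_score >= block_at:
        return "block"
    if nsfw_score >= review_at:
        return "review"
    return "allow"

print(moderation_action(0.0003))  # allow
print(moderation_action(0.3))     # review
print(moderation_action(0.92))    # block
```

Tune the two cutoffs independently: a stricter `review_at` costs you human-review time, not false auto-blocks.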
3. One API for images and video. Most competitors charge separately for video frame analysis. We let you sample frames from a video and check each one — same price, same API.
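If you handle the frame extraction yourself, evenly spaced indices are usually enough. A small Python sketch (the helper is hypothetical; the API just accepts whatever frames you extract as ordinary images):

```python
def sample_frame_indices(total_frames: int, samples: int) -> list[int]:
    """Pick evenly spaced frame indices to send to the image classifier."""
    if samples >= total_frames:
        return list(range(total_frames))
    step = total_frames / samples
    return [int(i * step) for i in range(samples)]

print(sample_frame_indices(300, 6))  # [0, 50, 100, 150, 200, 250]
```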
The Hardest Part Wasn't the Model
The Falconsai NSFW model (ViT-based, 86M parameters) was already cached on our GPU machines. Loading it and running inference was the easy part.
The hard part was everything else:
- Credit management — charging the right amount, refunding on failure, handling timeouts
- Queue management — what happens when 100 apps hit the API simultaneously
- Result polling — the model processes fast, but network latency and queue wait means the synchronous response needs a polling fallback
- Worker priority — our GPU machines run image gen, video gen, 3D modeling, NSFW classification, AND background tasks. We had to build a priority stack so revenue-generating jobs always get GPU time before things like test renders
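The polling fallback mentioned above can be as simple as a bounded retry loop with backoff. A Python sketch of the shape of that loop (the `fetch` callable stands in for a status request on the `moderation_id`; the real status endpoint isn't shown here):

```python
import time

def poll_result(fetch, max_wait: float = 10.0, base_delay: float = 0.25):
    """Poll fetch() until it returns a finished result or we time out."""
    deadline = time.monotonic() + max_wait
    delay = base_delay
    while time.monotonic() < deadline:
        result = fetch()
        if result is not None:
            return result
        time.sleep(delay)
        delay = min(delay * 2, 2.0)  # exponential backoff, capped at 2s
    raise TimeoutError("moderation result not ready in time")

# Simulated fetch that succeeds on the third call:
calls = {"n": 0}
def fake_fetch():
    calls["n"] += 1
    return {"label": "safe"} if calls["n"] >= 3 else None

print(poll_result(fake_fetch))  # {'label': 'safe'}
```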
The NSFW classifier itself was 5% of the work. The infrastructure around it was 95%.
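A priority stack like the one described above can be sketched with a heap. The tiers here are our own illustration; PixelAPI's actual tiering isn't published beyond "paid jobs before test renders":

```python
import heapq
import itertools

# Lower number = higher priority. Illustrative tiers only.
PRIORITY = {"image_gen": 0, "video_gen": 0, "nsfw_classify": 1,
            "background": 2, "test_render": 3}

class GpuQueue:
    """Priority queue: revenue-generating jobs dequeue before test renders."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tiebreak within a tier

    def submit(self, job_type: str, payload):
        heapq.heappush(self._heap,
                       (PRIORITY[job_type], next(self._counter), payload))

    def next_job(self):
        return heapq.heappop(self._heap)[2]

q = GpuQueue()
q.submit("test_render", "render-1")
q.submit("nsfw_classify", "img-42")
q.submit("image_gen", "gen-7")
print(q.next_job())  # gen-7
```

The counter matters: without it, two jobs in the same tier would fall through to comparing payloads, and within a tier you want plain FIFO ordering anyway.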
Pricing: Why $0.0005?
Our rule for everything at PixelAPI: exactly half the price of the cheapest mainstream competitor. Not more, not less.
AWS charges $0.001 per image. We charge $0.0005.
That puts us at:
- $500/month for 1 million images
- $5,000/month for 10 million images
Compare that to AWS at $1,000-$10,000 for the same volume.
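The arithmetic is easy to check:

```python
def monthly_cost(images: int, per_image: float) -> float:
    """Linear per-image pricing: no tiers, no minimums."""
    return images * per_image

for volume in (1_000_000, 10_000_000):
    ours = monthly_cost(volume, 0.0005)  # PixelAPI
    aws = monthly_cost(volume, 0.001)    # AWS list price
    print(f"{volume:>10,} images: ${ours:,.0f} vs ${aws:,.0f}")
```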
Is AWS "better"? Their moderation model is probably trained on more data and covers more edge cases. But for 95% of use cases — dating apps, community platforms, marketplaces — our accuracy is indistinguishable from "good enough." And at half the price, you can afford to be more aggressive with your thresholds.
Who It's For
If you're building:
- A dating app where users upload profile photos
- A social platform with user-generated content
- A marketplace where sellers list items with photos
- A community platform with image uploads
- An ad network that serves visual creatives
…this is for you.
You don't need AWS if you're not running AWS for everything else. You don't need Google Cloud if you're not already in their ecosystem. You need a simple API key, a few lines of code, and a price that doesn't make you flinch.
What's Next
Video moderation (built-in frame sampling) is on the roadmap. We'll sample frames from a video file and return aggregated NSFW scores across all frames — no extra work on your end.
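The aggregation could take several forms. One plausible sketch (an assumption on our part; the roadmap doesn't specify the formula) reports the peak and mean NSFW score across sampled frames, since a single explicit frame should be enough to flag the whole video:

```python
def aggregate_frame_scores(nsfw_scores: list[float]) -> dict:
    """Summarize per-frame NSFW scores for a whole video.
    max catches a single bad frame; mean reflects overall content."""
    return {
        "max_nsfw": max(nsfw_scores),
        "mean_nsfw": sum(nsfw_scores) / len(nsfw_scores),
    }

print(aggregate_frame_scores([0.001, 0.002, 0.87, 0.003]))
```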
We're also working on custom thresholds per category (violence vs. adult content vs. gore) for apps that need granular control.
The API is live now. You can read the docs at pixelapi.dev/moderation-api.html and get started with a free API key at pixelapi.dev/app.
If you hit limits or need enterprise pricing, talk to us.