DEV Community

AI Engine

Posted on • Originally published at ai-engine.net

How to Blur NSFW Content in Images with Python

Some platforms need to blur NSFW content instead of removing it — dating apps, art communities, and news sites blur flagged images and let users opt in. Here's how to detect and blur in Python.

What You'll Build

A Python script that:

  1. Sends an image to the NSFW detection API
  2. Checks if any label exceeds your confidence threshold
  3. Applies a Gaussian blur if flagged
  4. Saves the blurred version

Prerequisites

pip install requests Pillow

Complete Script

import requests
from PIL import Image, ImageFilter
from io import BytesIO
from pathlib import Path

NSFW_API_URL = "https://nsfw-detect3.p.rapidapi.com/nsfw-detect"
HEADERS = {
    "x-rapidapi-host": "nsfw-detect3.p.rapidapi.com",
    "x-rapidapi-key": "YOUR_API_KEY",
    "Content-Type": "application/x-www-form-urlencoded",
}

BLUR_CATEGORIES = {"Explicit Nudity", "Suggestive", "Violence", "Visually Disturbing"}
CONFIDENCE_THRESHOLD = 75
BLUR_RADIUS = 40


def moderate_and_blur(image_url: str, output_dir: str = ".") -> dict:
    # Step 1: Detect
    response = requests.post(
        NSFW_API_URL, headers=HEADERS, data={"url": image_url}, timeout=30
    )
    response.raise_for_status()
    labels = response.json()["body"]["ModerationLabels"]

    flagged = [
        l for l in labels
        if l["Name"] in BLUR_CATEGORIES and l["Confidence"] > CONFIDENCE_THRESHOLD
    ]

    if not flagged:
        return {"action": "safe", "path": None}

    # Step 2: Download and blur
    image_response = requests.get(image_url, timeout=30)
    image_response.raise_for_status()
    # Convert to RGB: PNGs with an alpha channel can't be saved as JPEG
    img = Image.open(BytesIO(image_response.content)).convert("RGB")
    blurred = img.filter(ImageFilter.GaussianBlur(radius=BLUR_RADIUS))

    output_path = Path(output_dir) / "blurred_output.jpg"
    blurred.save(output_path, "JPEG", quality=85)

    return {
        "action": "blurred",
        "labels": [f"{l['Name']} ({l['Confidence']:.0f}%)" for l in flagged],
        "path": str(output_path),
    }


result = moderate_and_blur("https://example.com/user-upload.jpg")
if result["action"] == "blurred":
    print(f"Blurred. Flagged: {', '.join(result['labels'])}")
else:
    print("Safe — no blur needed")

Use Cases

  • Dating apps — Blur explicit photos by default, let users reveal
  • Content feeds — Reddit-style NSFW tags with blur overlay
  • News platforms — Auto-blur graphic content with content warnings

Tips

  • Threshold: A 75% threshold balances false positives and negatives — lower it for children's platforms (fewer misses), raise it for art communities (fewer false alarms)
  • Blur radius: Use 50 for Explicit Nudity, 20 for Suggestive — different intensities for different severity
  • Performance: Generate blurred version once at upload time, cache both versions
  • Pipeline: Process moderation asynchronously — don't block the upload response
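The second tip — different blur intensities per severity — can be sketched as a lookup table keyed by label name. The category names mirror the API's labels; the specific radii here are illustrative assumptions:

```python
from PIL import Image, ImageFilter

# Illustrative per-category radii: harsher blur for more severe labels
SEVERITY_RADII = {
    "Explicit Nudity": 50,
    "Visually Disturbing": 50,
    "Violence": 35,
    "Suggestive": 20,
}


def radius_for(flagged: list[dict]) -> int:
    """Pick the strongest blur any flagged category calls for (default 40)."""
    return max((SEVERITY_RADII.get(l["Name"], 40) for l in flagged), default=40)


def blur_by_severity(img: Image.Image, flagged: list[dict]) -> Image.Image:
    return img.filter(ImageFilter.GaussianBlur(radius=radius_for(flagged)))
```

Dropping `radius_for(flagged)` into the `moderate_and_blur` function in place of the fixed `BLUR_RADIUS` constant gives severity-aware blurring with one line changed.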

👉 Read the full tutorial with more use cases and integration patterns
