# NSFW

## Overview
The NSFW Task detects explicit or unsafe content in images and videos.
It analyzes the visual content and returns confidence scores for several categories such as nudity, sexual content, violence, and gore.
When a task completes, it creates an Intelligence file with `kind: "nsfw"` and a `.json` output containing the detection results.
## Example Output
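A representative `.json` output might look like the following sketch; the values are illustrative, and the fields correspond to the File Structure table below.

```json
{
  "id": "file_abc123",
  "object": "intelligence",
  "kind": "nsfw",
  "detected": true,
  "nudity": 0.91,
  "sexual": 0.74,
  "violence": 0.03,
  "gore": 0.01,
  "confidence": 0.88,
  "created": "2024-01-15T10:30:00Z",
  "updated": "2024-01-15T10:30:12Z"
}
```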
Each field represents the confidence level (0–1) for that specific category.
A higher value means stronger detection likelihood.
## Creating an NSFW Task
You can create an NSFW detection task for any image or video file using the ittybit SDK or a direct API request.
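As a sketch, a direct API request could look like this; the endpoint path, payload fields, and the `ITTYBIT_API_KEY` environment variable are assumptions rather than confirmed API details.

```typescript
// Build the request body for an NSFW task on a given file.
// The payload shape is an assumption based on common task patterns.
function buildNsfwTaskPayload(fileId: string): { kind: string; file_id: string } {
  return { kind: "nsfw", file_id: fileId };
}

// Create the task via a direct API request (endpoint path assumed).
async function createNsfwTask(fileId: string): Promise<unknown> {
  const res = await fetch("https://api.ittybit.com/tasks", {
    method: "POST",
    headers: {
      // Assumed bearer-token auth scheme; check your API credentials setup.
      Authorization: `Bearer ${process.env.ITTYBIT_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildNsfwTaskPayload(fileId)),
  });
  return res.json();
}
```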
## Webhook Example
When the task completes, ittybit will send a `POST` request to your `webhook_url` with the results.
You can use this to automatically flag, moderate, or remove content.
This example mirrors the production-ready implementation from the *Check every Supabase upload for NSFW content* guide.
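As a sketch of the flag-or-remove logic, the helper below inspects the category scores from a webhook payload; the `NsfwResult` shape and the `0.7` threshold are illustrative assumptions, not ittybit requirements.

```typescript
// Minimal shape of the scores we read from the webhook body (assumed).
interface NsfwResult {
  nudity: number;
  sexual: number;
  violence: number;
  gore: number;
}

// Returns true when any category score meets or exceeds the threshold.
// 0.7 is an illustrative default; tune it for your moderation policy.
function shouldFlag(result: NsfwResult, threshold = 0.7): boolean {
  return [result.nudity, result.sexual, result.violence, result.gore].some(
    (score) => score >= threshold
  );
}
```

In your webhook endpoint, parse the `POST` body and pass the scores to `shouldFlag` to decide whether to flag, blur, or remove the media.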
## File Structure
| Property | Type | Description |
|---|---|---|
| `id` | string | Unique file ID for the Intelligence file. |
| `object` | string | Always `"intelligence"`. |
| `kind` | string | Always `"nsfw"`. |
| `detected` | boolean | Whether any unsafe content was detected. |
| `nudity` | number | Confidence score (0–1) for nudity detection. |
| `sexual` | number | Confidence score (0–1) for sexual activity or context. |
| `violence` | number | Confidence score (0–1) for violent content. |
| `gore` | number | Confidence score (0–1) for gore or graphic imagery. |
| `confidence` | number | Overall confidence score for the detection result. |
| `created` / `updated` | string (ISO 8601) | Timestamps for creation and last update. |
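For reference, the table above can be expressed as a TypeScript type, together with a minimal runtime check; the type guard is an illustrative helper, not part of any SDK.

```typescript
// Type mirroring the documented Intelligence file structure.
interface NsfwIntelligence {
  id: string;
  object: "intelligence";
  kind: "nsfw";
  detected: boolean;
  nudity: number;
  sexual: number;
  violence: number;
  gore: number;
  confidence: number;
  created: string; // ISO 8601
  updated: string; // ISO 8601
}

// Minimal runtime check for a parsed webhook or API response body.
function isNsfwIntelligence(value: any): value is NsfwIntelligence {
  return (
    value?.object === "intelligence" &&
    value?.kind === "nsfw" &&
    typeof value?.detected === "boolean" &&
    ["nudity", "sexual", "violence", "gore", "confidence"].every(
      (key) => typeof value?.[key] === "number"
    )
  );
}
```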
## Supported Inputs
NSFW tasks work with both image and video sources:
- Image: `.jpg`, `.jpeg`, `.png`, `.webp`
- Video: `.mp4`, `.mov`, `.webm`
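If you validate uploads before creating a task, a small helper like this can gate unsupported files; the extension list comes from the docs above, while the helper itself is illustrative.

```typescript
// Extensions accepted by NSFW tasks, per the list above.
const SUPPORTED_EXTENSIONS = new Set([
  ".jpg", ".jpeg", ".png", ".webp", // images
  ".mp4", ".mov", ".webm",          // videos
]);

// Returns true when the filename ends in a supported extension.
function isSupportedInput(filename: string): boolean {
  const dot = filename.lastIndexOf(".");
  return dot !== -1 && SUPPORTED_EXTENSIONS.has(filename.slice(dot).toLowerCase());
}
```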
## Common Use Cases
- User-generated content moderation
- Automatic content filtering before publishing
- Flagging or blurring unsafe media
- Age-restricted platform compliance
## Example Workflow Automation
You can combine NSFW detection with an automation workflow to process all new uploads automatically:
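A minimal automation definition might look like the following sketch; the `trigger` event name and `workflow` shape are assumptions modeled on typical ittybit automation patterns, not confirmed fields.

```json
{
  "name": "nsfw-check-on-upload",
  "trigger": { "event": "media.created" },
  "workflow": [
    { "kind": "nsfw" }
  ]
}
```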
This automation will run an NSFW task on every newly created media file.