NSFW Scraper

Deprecated

Pricing

$2.50 / 1,000 results

Developed by Datastorm

Maintained by Community

Check images and text for NSFW (not safe for work) content. Input can be an image URL, a phrase, or a dataset. Scan text for offensive, sexually explicit, self-harm, violent, racist, or hateful content. Detect nudity, sexual activity, pornography, violence, gore, and similar content in images.

Rating: 0.0 (0 reviews)

Total users: 6

Monthly users: 4

Last modified: 2 years ago

You can access the NSFW Scraper programmatically from your own applications by using the Apify API, and you can choose your preferred programming language below. To use the Apify API, you'll need an Apify account and your API token, which you can find under Integrations settings in the Apify Console.

$ echo '{
    "items": [
        "https://firebasestorage.googleapis.com/v0/b/kongo-ln.appspot.com/o/girl%20sexy%20hot%20sun%20sea%20beach.jpg?alt=media&token=7ec36226-6a7a-4241-99a3-5bde4bbde2c6",
        "https://firebasestorage.googleapis.com/v0/b/kongo-ln.appspot.com/o/emma%20watson%20in%20bikini%20on%20the%20beach.jpg?alt=media&token=73af0ea9-c80e-4aa8-be7c-7e79917e6559",
        "https://firebasestorage.googleapis.com/v0/b/kongo-ln.appspot.com/o/chris%20evans%20as%20peaky%20blinders%20higly%20detailed.jpg?alt=media&token=e8167234-c864-4df3-9590-2a932edb0a4a"
    ],
    "datasets": [
        {
            "id": "serTsUnXrLOVXjPtd",
            "fields": [
                "value"
            ]
        }
    ]
}' | apify call datastorm/nsfw-scraper --silent --output-dataset

NSFW Scraper API through CLI

The Apify CLI is the official tool that allows you to use NSFW Scraper locally, providing convenience functions and automatic retries on errors.

Install the Apify CLI

$ npm i -g apify-cli
$ apify login
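
Per the description above, the actor also checks plain text. Below is a minimal sketch of a text-only run, assuming phrases are accepted in the same "items" array as image URLs (the exact input schema is not documented on this page):

# Assumption: plain phrases go in the same "items" array as image URLs.
$ echo '{
    "items": [
        "an example phrase to screen for hate speech"
    ]
}' | apify call datastorm/nsfw-scraper --silent --output-dataset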

Other API clients include:
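
Apify publishes official apify-client libraries for JavaScript and Python, and any actor can also be called through the plain REST API. As a minimal sketch using curl (the input is assumed to match the example above; APIFY_TOKEN stands in for your API token, and the run-sync-get-dataset-items endpoint runs the actor and returns its dataset items):

# Assumption: APIFY_TOKEN is set to your Apify API token; the image URL is a placeholder.
$ curl -X POST \
    "https://api.apify.com/v2/acts/datastorm~nsfw-scraper/run-sync-get-dataset-items?token=$APIFY_TOKEN" \
    -H 'Content-Type: application/json' \
    -d '{ "items": ["https://example.com/image.jpg"] }'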