Let’s face it: nudity is an inevitable part of the internet. Spend enough time online and you’re bound to come across some raunchy imagery (whether you intend to or not!). If you’re building an app or website, it’s important to filter out any “not safe for work” (NSFW) images or video that may be uploaded. After all, no one likes a surprise!
Enter the Clarifai API! This tool uses deep-learning technology to recognize categories, objects and tags from images and videos.
Clarifai API Overview
The API is free for up to 5,000 calls a month. You can use it in two main ways: general or custom image tagging. With general image tagging, you rely on Clarifai’s artificial intelligence to recognize items using Clarifai’s pre-built public models (e.g. General, Food, Wedding, Travel, Color, NSFW, and face detection with age/gender/ethnicity). Custom image tagging means that you create your own model (say, with images of your company logo or product) and train Clarifai’s AI to recognize that specific image.
Today, we’ll be using Clarifai’s public NSFW model, which recognizes nudity in images. With this functionality, you could:
- Filter out NSFW image submissions for your Instagram app
- Protect your comments’ section from naughty spammers
- Limit your app’s Giphy searches to G-rated material
While you can connect to the Clarifai API directly, today we’ll use our free tool, RapidAPI. Why? The live coding feature means that you can start making calls to the API right away from within the browser. Use RapidAPI’s Clarifai Public Models package to access Clarifai’s existing models and RapidAPI’s Clarifai machine learning package to build a custom model.
Related: Top Machine Learning APIs.
This tutorial will use the NSFW model in the Clarifai Public Models package.
How To Use The Clarifai API
By the end of this tutorial, you’ll be able to upload your own images, call the API, and see whether they pass the NSFW test.
Step 1. Get the Clarifai Access Token and Credentials
Clarifai requires an access token before you start making calls to their API. No worries! It’s easy (and more importantly, free) to get one. Here’s how:
- Go to Clarifai’s developer page
- Sign up for an account
- Click the Create Application button (or head to the Developer Dashboard and click “Create a New Application”)
- Copy and save your client_id and client_secret
- Press the Generate Access Token button
Voila! You should now have your client_id, client_secret, and Access Token for the Clarifai API.
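If you’d rather script the token step than click the dashboard button, the Python sketch below shows one way it might look. It assumes Clarifai exposes an OAuth-style client-credentials token endpoint at https://api.clarifai.com/v2/token; treat that URL (and the response field name) as assumptions to verify against Clarifai’s docs.

```python
import requests

CLIENT_ID = "your_client_id"          # copied from the Clarifai dashboard
CLIENT_SECRET = "your_client_secret"  # keep this out of source control!

# Assumption: Clarifai offers an OAuth2 client-credentials token endpoint at /v2/token.
resp = requests.post(
    "https://api.clarifai.com/v2/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "client_credentials"},
)
resp.raise_for_status()

# Assumption: the token comes back in an "access_token" field.
access_token = resp.json()["access_token"]
print("Access token:", access_token)
```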
Step 2. Call the API from RapidAPI
Next, head over to RapidAPI.com to run the API and start testing images! Here’s an overview of what you’ll need to do.
- Visit the Clarifai Public Models package page on RapidAPI
- Select the analyzeImageNSFW endpoint
- Enter your client_id and client_secret from the previous step
- Click “Test” to make the call
- Check the JSON response to see how probable it is that the image is NSFW
- Sign in to RapidAPI (for free) to export the code snippet in your preferred language and connect your app directly to the API (see the sketch below for what the raw request looks like)
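RapidAPI’s exported snippets wrap all of this for you, but if you’re curious what the raw request to Clarifai might look like, here’s a minimal Python sketch. The predict URL (/v2/models/{model_id}/outputs), the placeholder NSFW_MODEL_ID, and the exact shape of the request body are assumptions to double-check against Clarifai’s documentation.

```python
import requests

ACCESS_TOKEN = "your_access_token"   # from Step 1
NSFW_MODEL_ID = "nsfw-model-id"      # placeholder: copy the real ID from Clarifai's public model listing
IMAGE_URL = "https://example.com/questionable-lamps.jpg"  # any publicly reachable image

# Assumption: the v2 predict endpoint accepts image URLs under an "inputs" list.
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{NSFW_MODEL_ID}/outputs",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# The NSFW model returns two concepts, roughly "sfw" and "nsfw", each with a probability.
concepts = resp.json()["outputs"][0]["data"]["concepts"]
for concept in concepts:
    print(concept["name"], concept["value"])
```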
That’s the overview, but now, for some fun. Let’s try out the Clarifai API with an example.
Example: The Questionable Lamps Photo
The Clarifai API is pretty good at distinguishing safe-for-work content from the raunchier stuff, even when it’s hard to tell at first glance.
Take this light shade photo, for instance.
Those shadows really make it look like…well, you know what it looks like. Let’s see if Clarifai’s API will be fooled!
First stop, RapidAPI! We’ll go to RapidAPI.com and find the analyzeImageNSFW endpoint on the Clarifai Public Models API package page.
Next, we’ll put in our Clarifai credentials and upload the image (or use a link).
What was the final verdict on the risqué lamps? Clarifai said with 97% certainty that the image is, in fact, safe for work. We couldn’t fool them this time!
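In code, that verdict is just a probability you can threshold. Here’s a tiny sketch that builds on the concepts list from the earlier snippet, using an arbitrary 0.85 cutoff you’d tune for your own app:

```python
# 'concepts' is the list returned by the predict call above,
# e.g. [{"name": "sfw", "value": 0.97}, {"name": "nsfw", "value": 0.03}]
NSFW_THRESHOLD = 0.85  # arbitrary cutoff; adjust for your app's tolerance

scores = {c["name"]: c["value"] for c in concepts}

if scores.get("nsfw", 0.0) >= NSFW_THRESHOLD:
    print("Rejecting upload: image flagged as NSFW.")
else:
    print(f"Image looks safe for work ({scores.get('sfw', 0.0):.0%} confidence).")
```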
Go ahead and try it out. You can either play around with Clarifai’s API on the RapidAPI website or export the code directly into your app. Let us know what you think.