"\n\n\n \n \n \n 403 Forbidden<\/title>\n \n<\/head>\n \n\n \n Server Error<\/h1>\n 403<\/div>\n Forbidden<\/h2>\n
You do not have permission to access this document.<\/p>\n
\n That's what you can do<\/p>\n
Here is an example of a black and white image containing nudity. Neural Machine understands what is depicted in the image regardless of skin tone.
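As a rough illustration of how such an image might be submitted for classification, the sketch below sends an image URL to the API and reads back a rating and scores. The endpoint path, parameter names, credential handling, and response fields are assumptions for illustration only; the actual request format is covered in the "Available endpoints" and "Examples" sections.

```python
import requests

# Minimal sketch: classify an image by URL.
# NOTE: the endpoint, parameter names, and response fields below are
# hypothetical placeholders, not the documented API schema.
API_KEY = "your-api-key"                          # hypothetical credential
ENDPOINT = "https://api.webit.ai/nsfw/classify"   # hypothetical endpoint

response = requests.get(
    ENDPOINT,
    params={"image_url": "https://example.com/photo.jpg"},
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
response.raise_for_status()
result = response.json()

# The response is assumed to carry a rating ("neutral", "moderate", ...)
# and per-class scores such as a nudity probability.
print(result.get("rating"), result.get("scores"))
```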
"\n\n\n \n \n \n 403 Forbidden<\/title>\n \n<\/head>\n \n\n \n Server Error<\/h1>\n 403<\/div>\n Forbidden<\/h2>\n
You do not have permission to access this document.<\/p>\n
\n That's what you can do<\/p>\n
Below is an example of a picture that could trigger a false positive for nudity. Despite that, Neural Machine performs correctly and identifies a completely safe image, assigning a "neutral" rating.
"\n\n\n \n \n \n 403 Forbidden<\/title>\n \n<\/head>\n \n\n \n Server Error<\/h1>\n 403<\/div>\n Forbidden<\/h2>\n
You do not have permission to access this document.<\/p>\n
\n That's what you can do<\/p>\n
Below is an image of a shirtless person. It is marked as safe, but the score variance indicates that the subject is not completely covered, so a "moderate" rating is assigned. Such scores and ratings can be used to implement customized moderation based on thresholds and flags.
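A minimal sketch of that kind of threshold-and-flag logic follows. The field names ("rating", "scores", "nudity") and the threshold values are illustrative assumptions, not the API's documented schema.

```python
# Sketch of threshold-based moderation built on a classification result.
# Field names and thresholds are assumptions for illustration.
def moderate(result: dict, nudity_threshold: float = 0.7) -> str:
    rating = result.get("rating", "neutral")
    nudity_score = result.get("scores", {}).get("nudity", 0.0)

    if rating == "unsafe" or nudity_score >= nudity_threshold:
        return "reject"            # block outright
    if rating == "moderate":
        return "flag_for_review"   # e.g. shirtless subject, partial coverage
    return "accept"                # "neutral"/safe content passes through

# Example: a safe-but-moderate result is routed to human review.
print(moderate({"rating": "moderate", "scores": {"nudity": 0.35}}))
```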
"\n\n\n \n \n \n 403 Forbidden<\/title>\n \n<\/head>\n \n\n \n Server Error<\/h1>\n 403<\/div>\n Forbidden<\/h2>\n
You do not have permission to access this document.<\/p>\n
\n That's what you can do<\/p>\n
Here is a picture depicting a lower body area, with a person wearing a slightly uplifted bikini bottom. There is also sand in the frame, but this does not prevent Neural Machine from understanding the scene. As in the previous case, the image is marked as safe, but a "moderate" rating is assigned, together with the corresponding scores. For more information about ratings, check the complete "Examples" section.
"\n\n\n \n \n \n 403 Forbidden<\/title>\n \n<\/head>\n \n\n \n Server Error<\/h1>\n 403<\/div>\n Forbidden<\/h2>\n
You do not have permission to access this document.<\/p>\n
\n That's what you can do<\/p>\n
Below is an example of a drawn picture containing nudity. Webit Machine detects that the image type is a "drawing" and also detects the nudity. Image type classification can be used to moderate content drawn by users (e.g. in creative communities) differently from photos.
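One way to act on that distinction is sketched below: drawings and photos follow different policies for the same rating. The "type" field, its values, and the policy itself are illustrative assumptions rather than the API's documented behavior.

```python
# Sketch of image-type-aware moderation, e.g. a creative community that hides
# unsafe drawings behind a warning but rejects unsafe photos outright.
# The "type" and "rating" fields and their values are assumptions.
def policy(result: dict) -> str:
    image_type = result.get("type", "photo")   # assumed values: "photo", "drawing"
    rating = result.get("rating", "neutral")

    if rating == "unsafe":
        return "hide_behind_warning" if image_type == "drawing" else "reject"
    return "accept"

print(policy({"type": "drawing", "rating": "unsafe"}))  # -> hide_behind_warning
print(policy({"type": "photo", "rating": "unsafe"}))    # -> reject
```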
Here is another example of a drawn picture, this one containing sheer but safe clothing. Webit Machine detects the "drawing" type correctly and classifies the image as "safe".
"\n\n\n \n \n \n 403 Forbidden<\/title>\n \n<\/head>\n \n\n \n Server Error<\/h1>\n 403<\/div>\n Forbidden<\/h2>\n
You do not have permission to access this document.<\/p>\n
\n That's what you can do<\/p>\n
More examples with additional details are available in the "Examples" documentation section.
Available endpoints
The NSFW Classification API has two major endpoints: "Trending" and "Search". Examples, roles, and parameter explanations are documented extensively in the paragraphs that follow.
Actively used by 63+ customers since October 2020.