LOS GATOS, Calif., May 25, 2017 /PRNewswire/ -- Nventify, Inc. (www.nventify.com), the software development company behind Imagizer Media Engine, today announced the launch of its new "Not Safe For Work" image detection filter, which classifies images as NSFW or SFW.
The NSFW filter works in real time, detecting pornographic content in under one second. While many NSFW detection services rely on human moderators for image filtering, Imagizer Media Engine uses an algorithmic approach that significantly speeds up NSFW detection and image processing, and the filter can be launched from imagizer.com within minutes.
Using the "Not Safe For Work" filter, developers can:
Scan images for nudity in their app or website
Label images as child safe or potentially not safe
Maintain appropriate app ratings in app stores
Alert their content moderation team to images that violate their Terms of Service
The API docs can be accessed here: http://docs.imagizer.com/#image-recognition
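For developers evaluating the filter, the request pattern might look like the following sketch. The host, path handling, and the `detect` parameter name are illustrative assumptions, not the documented interface; consult the API docs linked above for the actual parameters and response format.

```python
from urllib.parse import urlencode

# Hypothetical sketch: construct an Imagizer-style request URL that asks
# the engine to classify an image as NSFW or SFW. The host and the
# "detect" parameter are assumptions for illustration only -- see the
# official API docs for the real interface.
def build_nsfw_check_url(host: str, image_path: str) -> str:
    params = urlencode({"detect": "nsfw"})  # assumed parameter name
    return f"https://{host}/{image_path.lstrip('/')}?{params}"

url = build_nsfw_check_url("demo.imagizer.com", "/photos/upload.jpg")
print(url)
```

The returned classification could then be used to route flagged images to a moderation queue rather than publishing them directly.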
About Imagizer Media Engine:
Imagizer Media Engine is a cloud component that accelerates media delivery to mobile apps and web pages by dynamically manipulating images in real time to fit each device's screen resolution. Imagizer is compatible with all common image formats, such as JPG, PNG, WEBP, and GIF. For more info, visit: http://www.nventify.com
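The dynamic-manipulation model described above can be sketched as follows: the client requests one source image with device-specific dimensions encoded as query parameters, and the engine returns a resized variant. The parameter names ("width", "quality") and host below are assumptions for illustration, not the documented API.

```python
from urllib.parse import urlencode

# Hypothetical sketch of Imagizer-style dynamic resizing: the same source
# image is requested at different sizes for different devices. Parameter
# names are assumed for illustration; check the API docs for real ones.
def responsive_image_url(host: str, path: str, width: int, quality: int = 80) -> str:
    params = urlencode({"width": width, "quality": quality})
    return f"https://{host}/{path.lstrip('/')}?{params}"

# One source image, two device targets:
phone_url = responsive_image_url("demo.imagizer.com", "hero.jpg", 640)
tablet_url = responsive_image_url("demo.imagizer.com", "hero.jpg", 1280, quality=90)
print(phone_url)
print(tablet_url)
```

Because each variant is derived on the fly, the publisher stores a single high-resolution original instead of pre-generating a copy per screen size.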
Kristina Tyshchenko, Nventify, 408-612-3954, email@example.com
To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/nventify-announces-not-safe-for-work-image-filter-300463541.html