Photo Guardian
Introduction
Photo Guardian is a fully automated content moderation service that detects nudity and other objectionable content in images. Customers who want a quick setup can configure Photo Guardian to automatically censor content (or reject images entirely), while developers with more advanced needs can retrieve machine-readable data with bounding boxes around objectionable content.
To learn more about how to integrate your application or service with Photo Guardian, click the "Documentation" button at the top of the page.
Pricing
Photo Guardian is organized around user-configured "targets": endpoints that perform a specific action (analyze, filter, or censor) on each image you submit. Each of your services or applications will use one or more targets to automate content moderation, and each action consumes a fixed number of credits (see the sketch after the list below).
Analyze - 1 credit
Locates objectionable content within an image
Returns JSON describing the image regions (bounding boxes) that contain objectionable content
Filter - 1 credit
Allows or denies photos as a whole depending on their content
Returns JSON with a boolean indicating whether the image is safe
Censor - 2 credits
Automatically censors objectionable content in photos
Returns JSON with the censored image encoded in base64
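As a rough illustration of how a target might be called, here is a minimal Python sketch. The endpoint URL, authentication header, and response field names below are assumptions for illustration only; consult the Documentation for the actual API.

    import base64
    import requests

    API_KEY = "your-api-key"  # hypothetical: the actual auth scheme is in the Documentation
    # Hypothetical per-target URL; real targets may be addressed differently.
    TARGET_URL = "https://api.photoguardian.example/targets/my-target"

    with open("upload.jpg", "rb") as f:
        image_bytes = f.read()

    # Submit the image; the target's configured action (analyze, filter,
    # or censor) determines the shape of the JSON that comes back.
    resp = requests.post(
        TARGET_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"image": ("upload.jpg", image_bytes, "image/jpeg")},
    )
    resp.raise_for_status()
    result = resp.json()

    if "regions" in result:       # analyze: bounding boxes of flagged content
        print(result["regions"])
    elif "safe" in result:        # filter: whole-image allow/deny verdict
        print("allowed" if result["safe"] else "rejected")
    elif "image" in result:       # censor: censored image, base64-encoded
        with open("censored.jpg", "wb") as out:
            out.write(base64.b64decode(result["image"]))

Whichever action a target is configured for, the request shape stays the same; only the JSON that comes back differs.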
The current cost for 10,000 credits is $15.
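At that rate, the per-image cost of each action works out as follows (illustrative arithmetic only):

    # $15 buys 10,000 credits, so one credit costs $0.0015.
    price_per_credit = 15 / 10_000
    print(f"Analyze or Filter (1 credit): ${1 * price_per_credit:.4f} per image")  # $0.0015
    print(f"Censor (2 credits): ${2 * price_per_credit:.4f} per image")            # $0.0030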
Capabilities
Photo Guardian's custom model can detect a range of content classes. Classes that are generally considered not-safe-for-work are enabled by default, and the exact classes considered by each target can be configured individually.
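A per-target class configuration might look something like the following sketch. The endpoint, payload shape, and class names here are placeholders, not Photo Guardian's actual schema; see the Documentation for the real configuration API.

    import requests

    # Hypothetical configuration payload; "example_class_a/b" are
    # placeholder class names, not Photo Guardian's real class list.
    config = {
        "action": "analyze",
        "classes": {
            "example_class_a": True,   # enabled for this target
            "example_class_b": False,  # disabled for this target
        },
    }

    resp = requests.put(
        "https://api.photoguardian.example/targets/my-target/config",  # hypothetical endpoint
        headers={"Authorization": "Bearer your-api-key"},
        json=config,
    )
    resp.raise_for_status()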