Image Moderation API

Leverage machine learning to moderate your user-generated content and keep your community safe.

What is the Image Moderation API?

Gigadrive Network's Image Moderation API uses machine learning to moderate images. It can detect adult content, suggestive content, and more.

You can use this HTTP API to help protect your application's community from harmful content through a simple API call.

Common use cases for the Image Moderation API include:

  • Moderate user-generated content. Use the Image Moderation API to moderate user-generated content, such as profile pictures, posts, and more.
  • Protect your community. Use the Image Moderation API to protect your community from harmful content, such as adult content, suggestive content, and more.

Some of the benefits of using the Image Moderation API include:

  • Easy to use. The Image Moderation API is easy to use and can be integrated into your application in minutes.
  • Highly accurate. The Image Moderation API detects adult content, suggestive content, and more with high accuracy.
  • Fast. The Image Moderation API moderates images in real time.
  • Scalable. The Image Moderation API is highly scalable and can moderate millions of images per day.

These are just some of the benefits of using the Image Moderation API. If you plan to allow user-generated content in your application, the Image Moderation API is a must-have.

How to use the Image Moderation API

Integrating the Image Moderation API takes only a few minutes: send the URI of an image to the classification endpoint described below, then act on the category scores it returns.


Categories

Images are classified into categories depending on their content. The following table lists the categories the API can assign.

Identifier  Name                     Description
C-100       Neutral                  Images that contain no adult content. This is the default category.
C-101       Drawing                  Drawn images that contain no adult content.
C-200       Suggestive               Images that contain suggestive content. This includes partial nudity, but not full nudity (e.g. underwear, swimsuits, etc.).
C-300       Drawn Adult Content      Drawn images that contain adult content, such as sexual acts, partial nudity, and full nudity.
C-301       Realistic Adult Content  Realistic-looking images that contain adult content, such as sexual acts, partial nudity, and full nudity.
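A client will usually want to map these identifiers to something it can act on. The sketch below (in Python) shows one way to do that; the per-category "withhold" flag is an application-side assumption mirroring the API's `withholdingRecommended` field, not part of the API itself.

```python
# Lookup table for the category identifiers above.
# The boolean "withhold" flag is an application-side assumption.
CATEGORIES = {
    "C-100": ("Neutral", False),
    "C-101": ("Drawing", False),
    "C-200": ("Suggestive", True),
    "C-300": ("Drawn Adult Content", True),
    "C-301": ("Realistic Adult Content", True),
}

def describe(identifier: str) -> str:
    """Return a short human-readable label for a category identifier."""
    name, withhold = CATEGORIES[identifier]
    return f"{identifier} ({name}){' [withhold]' if withhold else ''}"

print(describe("C-100"))  # C-100 (Neutral)
print(describe("C-301"))  # C-301 (Realistic Adult Content) [withhold]
```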


POST /image-moderation/classification

Create an image classification

Required API Key permission: image-moderation:classify

This endpoint classifies an image into one of multiple categories. For each category, the API returns a score between 0 and 1, indicating the probability that the image belongs to that category. Scores closer to 1 indicate higher probability.

Required attributes

  • uri (string): The full location of the image to classify.

Request

POST
/image-moderation/classification
curl https://api.gigadrive.network/image-moderation/classification \
  -H "Content-Type: application/json" \
  -d '{"uri": "https://example.com/image.jpg"}'

Response (HTTP 200)

{
    "categories": {
        "C-100": 0.9860316514968872,
        "C-200": 0.009048506617546082,
        "C-101": 0.004611073527485132,
        "C-301": 0.00028566402033902705,
        "C-300": 0.000023128204702516086
    },
    "withholdingRecommended": false
}
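A minimal sketch of how a client might interpret this response, using the example payload above. The 0.5 threshold and the decision rule are application-side assumptions, not part of the API.

```python
import json

# The example response from above, parsed client-side.
response_body = """
{
    "categories": {
        "C-100": 0.9860316514968872,
        "C-200": 0.009048506617546082,
        "C-101": 0.004611073527485132,
        "C-301": 0.00028566402033902705,
        "C-300": 0.000023128204702516086
    },
    "withholdingRecommended": false
}
"""

data = json.loads(response_body)

# Highest-scoring category (here C-100, Neutral, at ~0.99).
top_category, top_score = max(data["categories"].items(), key=lambda kv: kv[1])

# Client-side policy sketch: hide the image if the API recommends
# withholding it, or if either adult-content category exceeds a
# threshold chosen by the application.
ADULT_THRESHOLD = 0.5  # assumption; tune for your use case
should_hide = data["withholdingRecommended"] or any(
    data["categories"].get(cid, 0.0) > ADULT_THRESHOLD for cid in ("C-300", "C-301")
)

print(top_category, should_hide)  # C-100 False
```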