GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation
Description
Don't change the field names. The names are used as sparse feature labels in client projects.
Attributes
This module has the following attributes (case-insensitive ascending order):

- childScore (type: number(), default: nil) - The probability that the youngest person in the image is a child.
- csaiScore (type: float(), default: nil) - This score correlates with potential child abuse. Google confidential!
- csamA1Score (type: number(), default: nil) - Experimental score. Do not use. Google confidential!
- csamAgeIndeterminateScore (type: number(), default: nil) - Experimental score. Do not use. Google confidential!
- iuInappropriateScore (type: number(), default: nil) - The probability that an image is inappropriate for Images Universal, according to this policy: go/iupolicy.
- medicalScore (type: number(), default: nil)
- pedoScore (type: number(), default: nil)
- pornScore (type: float(), default: nil)
- racyScore (type: number(), default: nil) - This score is related to an image being sexually suggestive.
- semanticSexualizationScore (type: number(), default: nil) - This score is related to racy/sexual images; the scores have semantic meaning from 0 to 1.
- spoofScore (type: number(), default: nil)
- version (type: String.t, default: nil)
- violenceScore (type: number(), default: nil)
- ytPornScore (type: number(), default: nil) - Deprecated, use porn_score instead. The most recent model version does not produce this anymore.
Type
@type t() :: %GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation{
childScore: number() | nil,
csaiScore: float() | nil,
csamA1Score: number() | nil,
csamAgeIndeterminateScore: number() | nil,
iuInappropriateScore: number() | nil,
medicalScore: number() | nil,
pedoScore: number() | nil,
pornScore: float() | nil,
racyScore: number() | nil,
semanticSexualizationScore: number() | nil,
spoofScore: number() | nil,
version: String.t() | nil,
violenceScore: number() | nil,
ytPornScore: number() | nil
}
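For illustration, a minimal sketch of building and reading this struct in Elixir; the scores and version string below are made-up placeholders, not outputs of any real model. All fields default to nil, so only the fields of interest need to be set:

alias GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation

# Placeholder annotation; values are invented for illustration only.
annotation = %ImageSafesearchContentBrainPornAnnotation{
  pornScore: 0.01,
  racyScore: 0.42,
  childScore: 0.0,
  version: "example-model-version"
}

# Fields are ordinary struct fields, so they can be read with dot access
# or destructured with pattern matching.
%ImageSafesearchContentBrainPornAnnotation{racyScore: racy} = annotation
racy
# => 0.42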
Function
decode(value, options)
@spec decode(struct(), keyword()) :: struct()
Unwrap a decoded JSON object into its complex fields.
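A hedged usage sketch of decode/2, assuming the usual google_api_* pattern in which the struct is first produced by JSON deserialization and decode/2 then unwraps any nested model fields; since every field of this module is a scalar, the call effectively returns its input unchanged. The field values are placeholders:

alias GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentBrainPornAnnotation

# A struct as it might arrive straight from JSON deserialization (placeholder values).
raw = %ImageSafesearchContentBrainPornAnnotation{
  pornScore: 0.12,
  racyScore: 0.34,
  version: "example-version"
}

# Unwrap complex fields per @spec decode(struct(), keyword()) :: struct().
annotation = ImageSafesearchContentBrainPornAnnotation.decode(raw, [])
annotation.racyScore
# => 0.34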