GoogleApi.ContentWarehouse.V1.Model.PornFlagData

Description

A protocol buffer that stores the URL, referer, and porn flag for a URL, plus an optional image score. Next available tag id: 51.

Attributes

This module has the following attributes (case-insensitive ascending order):

  1. debugInfo (type: list(GoogleApi.ContentWarehouse.V1.Model.ImagePornDebugInfo), default: nil)
    - DebugInfo stores debug information from the overall classifier. This allows, for instance, updating counters related to blacklisting without re-running the full classifier.
  2. finalOffensiveScore (type: number(), default: nil)
    - Final offensive score, based on image salient terms and the vulgar and offensive scores from image OCR.
  3. finalViolenceScore (type: number(), default: nil)
    - Final violence score, based on several image signals (brain pixel score, co-clicked images violence score, Navboost queries score, etc.).
  4. finalViolenceScoreVersion (type: String.t, default: nil)
    - A string indicating the version of the SafeSearch classifier used to compute final_violence_score.
  5. internalSignals (type: GoogleApi.ContentWarehouse.V1.Model.SafesearchInternalImageSignals, default: nil)
    - A proto that stores SafeSearch internal signals that are not exported to clients. SafeSearch team does not provide any guarantees about the presence or the semantics of these signals in the future.
  6. numberFaces (type: integer(), default: nil)
    - Number of faces detected in the image.
  7. ocrAnnotation (type: GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentOCRAnnotation, default: nil)
    - Information about image OCR text. For details see image/safesearch/content/public/ocr_annotation.proto.
  8. offensiveSymbolDetection (type: GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentOffensiveSymbolDetection, default: nil)
    - QuimbyCongas-based detection of offensive symbols in the image (currently swastika and Nazi yellow badge).
  9. photodnaHash (type: String.t, default: nil)
    - Binary version of the PhotoDNA hash (144 bytes long). If not set (has_photodna_hash() == false), the hash was not computed; if empty (has_photodna_hash() == true && photodna_hash() == ""), the computation failed (it cannot be computed for images smaller than 50 x 50). See the sketch after this list for how the three states can be distinguished.
  10. pornWithHighConfidence (type: boolean(), default: nil)
    - This field is set to true when we are pretty confident that the image is porn (with higher precision than the img_porn_moderate restrict). In particular, it means that the image might be demoted for non-porn queries when SafeSearch is Off.
  11. qbstOffensiveScore (type: number(), default: nil)
    - QBST-based image offensive score, derived from Navboost data.
  12. qbstSpoofScore (type: number(), default: nil)
    - QBST-based image spoof score, derived from Navboost data and unrelated to the pixel-based score in PornAnnotation.
  13. queryStats (type: GoogleApi.ContentWarehouse.V1.Model.ClassifierPornQueryStats, default: nil)
    - Query statistics from Navboost logs. For more details see classifier/porn/proto/image_porn_classifier_signals.proto.
  14. queryTextViolenceScore (type: number(), default: nil)
    - Aggregated Navboost query violence score.
  15. referer (type: String.t, default: nil)
    - URL of the referring page.
  16. referrerCounts (type: GoogleApi.ContentWarehouse.V1.Model.ClassifierPornReferrerCounts, default: nil)
    - Information about referrers and their porn classification. For details see classifier/porn/proto/image_porn_classifier_signals.proto.
  17. semanticSexualizationScore (type: number(), default: nil)
    - Starburst-based score predicting sexualization level of the image.
  18. url (type: String.t, default: nil)
    - URL of the image.
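
For illustration, here is a minimal sketch with entirely hypothetical field values, showing how the struct might be built and how the three photodnaHash states from attribute 9 can be distinguished:

alias GoogleApi.ContentWarehouse.V1.Model.PornFlagData

# All field values below are hypothetical examples.
flag_data = %PornFlagData{
  url: "https://example.com/images/photo.jpg",
  referer: "https://example.com/gallery.html",
  pornWithHighConfidence: false,
  photodnaHash: nil
}

# Distinguish the three photodnaHash states (attribute 9 above).
case flag_data.photodnaHash do
  nil -> :not_computed        # hash was never computed
  "" -> :computation_failed   # e.g. image smaller than 50 x 50
  hash -> {:ok, hash}         # 144-byte binary PhotoDNA hash
end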

Type

@type t() :: %GoogleApi.ContentWarehouse.V1.Model.PornFlagData{
  debugInfo: [GoogleApi.ContentWarehouse.V1.Model.ImagePornDebugInfo.t()] | nil,
  finalOffensiveScore: number() | nil,
  finalViolenceScore: number() | nil,
  finalViolenceScoreVersion: String.t() | nil,
  internalSignals: GoogleApi.ContentWarehouse.V1.Model.SafesearchInternalImageSignals.t() | nil,
  numberFaces: integer() | nil,
  ocrAnnotation: GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentOCRAnnotation.t() | nil,
  offensiveSymbolDetection: GoogleApi.ContentWarehouse.V1.Model.ImageSafesearchContentOffensiveSymbolDetection.t() | nil,
  photodnaHash: String.t() | nil,
  pornWithHighConfidence: boolean() | nil,
  qbstOffensiveScore: number() | nil,
  qbstSpoofScore: number() | nil,
  queryStats: GoogleApi.ContentWarehouse.V1.Model.ClassifierPornQueryStats.t() | nil,
  queryTextViolenceScore: number() | nil,
  referer: String.t() | nil,
  referrerCounts: GoogleApi.ContentWarehouse.V1.Model.ClassifierPornReferrerCounts.t() | nil,
  semanticSexualizationScore: number() | nil,
  url: String.t() | nil
}
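
A sketch only, assuming a hypothetical caller module, of how the t() type can be referenced in a consumer's own typespec:

defmodule MyApp.SafeSearchFilter do
  alias GoogleApi.ContentWarehouse.V1.Model.PornFlagData

  # Hypothetical helper: treat an image as high-confidence porn only
  # when the pornWithHighConfidence flag is explicitly set to true.
  @spec high_confidence_porn?(PornFlagData.t()) :: boolean()
  def high_confidence_porn?(%PornFlagData{pornWithHighConfidence: true}), do: true
  def high_confidence_porn?(%PornFlagData{}), do: false
end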

Function

decode(value, options)

@spec decode(struct(), keyword()) :: struct()

Unwrap a decoded JSON object into its complex fields.
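
A usage sketch: models generated by the google_api_* libraries implement the Poison decoder protocol, so a JSON payload (hypothetical here) is typically unwrapped into this struct via Poison, which in turn calls decode/2 on nested model fields:

json = ~s({"url": "https://example.com/images/photo.jpg", "finalViolenceScore": 0.12})

flag_data =
  Poison.decode!(json, as: %GoogleApi.ContentWarehouse.V1.Model.PornFlagData{})

flag_data.finalViolenceScore
#=> 0.12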

Data sourced from HexDocs: GoogleApi.ContentWarehouse.V1.Model.PornFlagData