NAME

gcloud alpha ml vision detect-safe-search - detect explicit content in an image

SYNOPSIS

gcloud alpha ml vision detect-safe-search IMAGE_PATH [--model-version=MODEL_VERSION; default="builtin/stable"] [GCLOUD_WIDE_FLAG ...]

DESCRIPTION

(ALPHA) Safe Search Detection detects adult, violent, medical, and spoof content in an image.

EXAMPLES

To detect adult, violent, medical, and spoof content in the image 'gs://my_bucket/input_file', run:

$ gcloud alpha ml vision detect-safe-search gs://my_bucket/input_file
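
The annotation result is printed to standard output. To get machine-readable output, the gcloud-wide --format flag (listed under GCLOUD WIDE FLAGS below) can be combined with the command, for example:

$ gcloud alpha ml vision detect-safe-search gs://my_bucket/input_file --format=json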

POSITIONAL ARGUMENTS

IMAGE_PATH

Path to the image to be analyzed. This can be either a local path or a URL. If you provide a local file, the contents will be sent directly to Google Cloud Vision. If you provide a URL, it must be in Google Cloud Storage format (gs://bucket/object) or an HTTP URL (http://... or https://...).
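
For example, to analyze an image stored on the local machine (here a hypothetical path ./image.jpg), pass the path directly and the file contents are uploaded with the request:

$ gcloud alpha ml vision detect-safe-search ./image.jpg  # ./image.jpg is a hypothetical local path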

FLAGS

--model-version=MODEL_VERSION; default="builtin/stable"

Model version to use for the feature. MODEL_VERSION must be one of: builtin/latest, builtin/stable.
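
For example, to run the detection with the latest built-in model instead of the stable default:

$ gcloud alpha ml vision detect-safe-search gs://my_bucket/input_file --model-version=builtin/latest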

GCLOUD WIDE FLAGS

These flags are available to all commands: --access-token-file, --account, --billing-project, --configuration, --flags-file, --flatten, --format, --help, --impersonate-service-account, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity.

Run $ gcloud help for details.

API REFERENCE

This command uses the vision/v1 API. The full documentation for this API can be found at: https://cloud.google.com/vision/
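
Under the hood, the command corresponds to the vision/v1 images:annotate method with the SAFE_SEARCH_DETECTION feature. A roughly equivalent raw request (a sketch, not the command's exact implementation, assuming an active gcloud credential so that 'gcloud auth print-access-token' returns a valid token) is:

$ curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    https://vision.googleapis.com/v1/images:annotate \
    -d '{"requests": [{"image": {"source": {"imageUri": "gs://my_bucket/input_file"}}, "features": [{"type": "SAFE_SEARCH_DETECTION"}]}]}'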

NOTES

This command is currently in alpha and might change without notice. If this command fails with API permission errors despite specifying the correct project, you might be trying to access an API with an invitation-only early access allowlist. These variants are also available:

$ gcloud ml vision detect-safe-search

$ gcloud beta ml vision detect-safe-search