gcloud alpha ml vision detect-safe-search - detect explicit content in an image
gcloud alpha ml vision detect-safe-search IMAGE_PATH [--model-version=MODEL_VERSION; default="builtin/stable"] [GCLOUD_WIDE_FLAG ...]
(ALPHA) Safe Search Detection detects adult, violent, medical, and spoof content in an image.
To detect adult, violent, medical, and spoof content in the image 'gs://my_bucket/input_file':
$ gcloud alpha ml vision detect-safe-search gs://my_bucket/input_file
- IMAGE_PATH
Path to the image to be analyzed. This can be either a local path or a URL. If you provide a local file, its contents are sent directly to Google Cloud Vision. If you provide a URL, it must be a Google Cloud Storage URI (gs://bucket/object) or an HTTP(S) URL (http://... or https://...).
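For example, a local file can be analyzed the same way; the path below is purely illustrative:
$ gcloud alpha ml vision detect-safe-search ./photos/image.jpg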
- --model-version=MODEL_VERSION; default="builtin/stable"
Model version to use for the feature. MODEL_VERSION must be one of: builtin/latest, builtin/stable.
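For example, to use the newer model instead of the stable default (reusing the illustrative bucket and object from the example above):
$ gcloud alpha ml vision detect-safe-search gs://my_bucket/input_file --model-version=builtin/latest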
These flags are available to all commands: --access-token-file, --account, --billing-project, --configuration, --flags-file, --flatten, --format, --help, --impersonate-service-account, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity.
Run $ gcloud help for details.
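For example, the --format and --project flags listed above can be combined with this command to emit the response as JSON for scripting; the project ID below is illustrative:
$ gcloud alpha ml vision detect-safe-search gs://my_bucket/input_file --format=json --project=my-project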
This command uses the vision/v1 API. The full documentation for this API can be found at: https://cloud.google.com/vision/
This command is currently in alpha and might change without notice. If this command fails with API permission errors despite specifying the correct project, you might be trying to access an API with an invitation-only early access allowlist.
These variants are also available:
$ gcloud ml vision detect-safe-search
$ gcloud beta ml vision detect-safe-search