gcloud ml vision detect-safe-search - detect explicit content in an image
gcloud ml vision detect-safe-search IMAGE_PATH [GCLOUD_WIDE_FLAG ...]
Safe Search Detection detects adult, violent, medical, and spoof content in an image.
To detect adult, violent, medical, and spoof content in the image 'gs://my_bucket/input_file', run:
$ gcloud ml vision detect-safe-search gs://my_bucket/input_file
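The command prints the vision/v1 JSON response. As a minimal sketch of post-processing that output, the snippet below extracts the adult-content likelihood with python3; the `RESPONSE` value here is a hand-written illustration of the response shape (a `responses` array whose entries carry a `safeSearchAnnotation` with `adult`, `spoof`, `medical`, `violence`, and `racy` likelihoods), not real command output:

```shell
# Illustrative sample of the JSON this command returns; in practice you
# would capture it with:
#   RESPONSE="$(gcloud ml vision detect-safe-search gs://my_bucket/input_file --format=json)"
RESPONSE='{"responses":[{"safeSearchAnnotation":{"adult":"VERY_UNLIKELY","spoof":"UNLIKELY","medical":"UNLIKELY","violence":"VERY_UNLIKELY","racy":"UNLIKELY"}}]}'

# Pull out a single likelihood field from the first response.
echo "$RESPONSE" | python3 -c 'import json,sys; print(json.load(sys.stdin)["responses"][0]["safeSearchAnnotation"]["adult"])'
```

Likelihood values range over UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, and VERY_LIKELY, so a script can gate on them after extraction.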
- IMAGE_PATH
Path to the image to be analyzed. This can be either a local path or a URL. If you provide a local file, its contents will be sent directly to Google Cloud Vision. If you provide a URL, it must be in Google Cloud Storage format (gs://bucket/object) or an HTTP URL (http://... or https://...).
These flags are available to all commands: --access-token-file, --account, --billing-project, --configuration, --flags-file, --flatten, --format, --help, --impersonate-service-account, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity.
Run $ gcloud help for details.
This command uses the vision/v1 API. The full documentation for this API can be found at: https://cloud.google.com/vision/
These variants are also available:
$ gcloud alpha ml vision detect-safe-search
$ gcloud beta ml vision detect-safe-search