gcloud alpha dataproc workflow-templates add-job spark - add a Spark job to the workflow template
gcloud alpha dataproc workflow-templates add-job spark --step-id=STEP_ID (--workflow-template=WORKFLOW_TEMPLATE : --region=REGION) [--archives=[ARCHIVE,...]] [--driver-log-levels=[PACKAGE=LEVEL,...]] [--files=[FILE,...]] [--jars=[JAR,...]] [--labels=[KEY=VALUE,...]] [--properties=[PROPERTY=VALUE,...]] [--properties-file=PROPERTIES_FILE] [--start-after=STEP_ID,[STEP_ID,...]] [--class=MAIN_CLASS --jar=MAIN_JAR] [GCLOUD_WIDE_FLAG ...] [-- JOB_ARGS ...]
(ALPHA) Add a Spark job to the workflow template.
To add a Spark job with files 'file1' and 'file2' to the workflow template 'my-workflow-template' in region 'us-central1' with step ID 'my-step-id', run:
$ gcloud alpha dataproc workflow-templates add-job spark \
    --step-id=my-step-id --files="file1,file2" \
    --workflow-template=my-workflow-template --region=us-central1
- [-- JOB_ARGS ...]
Arguments to pass to the driver.
The '--' argument must be specified between gcloud specific args on the left and JOB_ARGS on the right.
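For example, a minimal sketch of passing a driver argument after the '--' separator; the main class, jar path, and argument value below are illustrative placeholders, not values taken from this reference:

$ gcloud alpha dataproc workflow-templates add-job spark \
    --step-id=compute-pi \
    --workflow-template=my-workflow-template --region=us-central1 \
    --class=org.apache.spark.examples.SparkPi \
    --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
    -- 1000

Here everything after '--' (the single value 1000) is forwarded to the Spark driver rather than parsed as a gcloud flag.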
- --step-id=STEP_ID
The step ID of the job in the workflow template.
- Template resource - The name of the workflow template to add the job to. The
arguments in this group can be used to specify the attributes of this resource. (NOTE) Some attributes are not given arguments in this group but can be set in other ways. To set the project attribute:
- provide the argument --workflow-template on the command line with a fully specified name;
- provide the argument --project on the command line;
- set the property core/project.
This must be specified.
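For example, the project attribute can be supplied through a fully specified template name; this sketch assumes the usual Dataproc resource format projects/PROJECT/regions/REGION/workflowTemplates/TEMPLATE, and the project and template names are placeholders:

$ gcloud alpha dataproc workflow-templates add-job spark \
    --step-id=my-step-id \
    --workflow-template=projects/my-project/regions/us-central1/workflowTemplates/my-workflow-template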
- --workflow-template=WORKFLOW_TEMPLATE
ID of the template or fully qualified identifier for the template. To set the template attribute:
- provide the argument --workflow-template on the command line.
This flag argument must be specified if any of the other arguments in this group are specified.
- --region=REGION
Dataproc region for the template. Each Dataproc region constitutes an independent resource namespace constrained to deploying instances into Compute Engine zones inside the region. Overrides the default dataproc/region property value for this command invocation. To set the region attribute:
- provide the argument --workflow-template on the command line with a fully specified name;
- provide the argument --region on the command line;
- set the property dataproc/region.
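For example, the region can be set once as a property so that later invocations may omit --region; the region value below is a placeholder:

$ gcloud config set dataproc/region us-central1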
- --archives=[ARCHIVE,...]
Comma separated list of archives to be extracted into the working directory of each executor. Must be one of the following file formats: .zip, .tar, .tar.gz, or .tgz.
- --driver-log-levels=[PACKAGE=LEVEL,...]
List of package to log4j log level pairs to configure driver logging. For example: root=FATAL,com.example=INFO
- --files=[FILE,...]
Comma separated list of files to be placed in the working directory of both the app master and executors.
- --jars=[JAR,...]
Comma separated list of jar files to be provided to the executor and driver classpaths.
- --labels=[KEY=VALUE,...]
List of label KEY=VALUE pairs to add.
Keys must start with a lowercase character and contain only hyphens (-), underscores (_), lowercase characters, and numbers. Values must contain only hyphens (-), underscores (_), lowercase characters, and numbers.
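For example, a pair of hypothetical labels that satisfy these naming rules:

--labels=environment=dev,team=data-eng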
- --properties=[PROPERTY=VALUE,...]
List of key value pairs to configure Spark. For a list of available properties, see: https://spark.apache.org/docs/latest/configuration.html#available-properties.
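For example, a sketch setting two standard Spark properties (the values are illustrative, not recommendations):

--properties=spark.executor.memory=4g,spark.sql.shuffle.partitions=200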
- --properties-file=PROPERTIES_FILE
Path to a local file or a file in a Cloud Storage bucket containing configuration properties for the job. The client machine running this command must have read permission to the file.
Specify properties in the form of property=value in the text file. For example:
# Properties to set for the job:
key1=value1
key2=value2
# Comment out properties not used.
# key3=value3
If a property is set in both --properties and --properties-file, the value defined in --properties takes precedence.
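For example, assuming such a properties file has been uploaded to a Cloud Storage bucket (the bucket and file names are placeholders):

--properties-file=gs://my-bucket/spark-job.properties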
- --start-after=STEP_ID,[STEP_ID,...]
(Optional) List of step IDs to start this job after.
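For example, assuming the template already contains steps with IDs 'ingest-data' and 'prepare-tables' (placeholder names), the following makes this job run only after both complete:

--start-after=ingest-data,prepare-tables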
- --class=MAIN_CLASS
The class containing the main method of the driver. Must be in a provided jar or a jar that is already on the classpath.
- --jar=MAIN_JAR
The HCFS URI of the jar file containing the driver.
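For example, a sketch that supplies the driver through --jar instead of --class; the step ID, template name, and jar URI are placeholders:

$ gcloud alpha dataproc workflow-templates add-job spark \
    --step-id=run-analysis \
    --workflow-template=my-workflow-template --region=us-central1 \
    --jar=gs://my-bucket/jars/my-spark-app.jar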
These flags are available to all commands: --access-token-file, --account, --billing-project, --configuration, --flags-file, --flatten, --format, --help, --impersonate-service-account, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity.
Run $ gcloud help for details.
This command is currently in alpha and might change without notice. If this command fails with API permission errors despite specifying the correct project, you might be trying to access an API with an invitation-only early access allowlist. These variants are also available:
$ gcloud dataproc workflow-templates add-job spark
$ gcloud beta dataproc workflow-templates add-job spark