Creating and updating Apache Spark jobs using the CLI

The following example demonstrates how to create and update a Spark job in Cloudera Data Engineering (CDE) using the command line interface (CLI).

Make sure that you have downloaded the CLI client. For more information, see Using the Cloudera Data Engineering command line interface.
  1. Run the cde job create command as follows:
    cde job create --application-file <path_to_application_jar> --class <application_class> [--default-variable name=value] --name <job_name> --num-executors <num_executors> --type spark
    To see the full command syntax and supported options, run cde job create --help.
    You can use [--default-variable] flags to replace strings in job values. Currently, the supported fields are:
    • Spark application name
    • Spark arguments
    • Spark configurations
    For a variable flag name=value, any substring {{{name}}} in the value of a supported field is replaced with value. Default variables can be overridden with the [--variable] flag during the job run, as shown in the example below.
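    For example, the following sketch creates a job whose Spark arguments contain a placeholder and then overrides it at run time. The job name, variable name, and file names are illustrative placeholders, and the --arg flag for passing application arguments is assumed to be available in your CLI version:
    # Create a job with a default value for the "env" variable;
    # {{{env}}} in the Spark arguments is replaced at run time.
    cde job create --application-file etl-app.jar --class com.example.ETLJob \
      --name etl-job --num-executors 4 --type spark \
      --default-variable env=staging --arg '{{{env}}}'
    # Run the job, overriding the default value with --variable.
    cde job run --name etl-job --variable env=production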
  2. Run cde job describe to verify that the job was created:
    cde job describe --name <job_name>
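    For example, assuming the etl-job created in the sketch above, the following prints the stored job configuration so you can confirm the values you passed to cde job create:
    cde job describe --name etl-job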
  3. If you want to update the job configuration, use the cde job update command.
    For example, to change the number of executors:
    cde job update --name test_job --num-executors 15
    To see the full command syntax and supported options, run cde job update --help.
  4. To verify the updated configuration, run cde job describe again:
    cde job describe --name <job_name>
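    For example, to confirm the change from step 3, check the executor count in the describe output. This sketch assumes the jq utility is installed and that the JSON output nests the setting under spark.numExecutors; the exact field path can vary by CLI version:
    # Should print 15 after the update in step 3.
    cde job describe --name test_job | jq '.spark.numExecutors'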