Linking a workload secret to Cloudera Data Engineering Spark job definitions using the API

Once you have created a workload secret, you can link it to the Cloudera Data Engineering (CDE) Spark job definitions that you created using the API.
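
The examples in the following steps authenticate with a CDE access token stored in the CDE_TOKEN environment variable. As a minimal sketch, assuming your virtual cluster exposes a Knox token endpoint (the host and path below are placeholders, not confirmed values; copy the actual token endpoint documented for your environment), the token can be obtained from your CDP workload credentials with curl and jq:

    # Assumption: <token-endpoint-host> and the path below are illustrative only;
    # use the token endpoint documented for your virtual cluster.
    export CDE_TOKEN=$(curl -s -u <workload-user> \
      'https://<token-endpoint-host>/gateway/authtkn/knoxtoken/api/v1/token' \
      | jq -r '.access_token')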

  1. Use one or more workload credentials in a new Spark job. For example, workload-cred-1 and workload-cred-2 are used in the job below:
    curl -k -X "POST" 'https://<dex-vc-host>/dex/api/v1/jobs' \
      -H 'Accept: application/json' \
      -H 'Connection: keep-alive' \
      -H 'Content-Type: application/json' \
      -H "Authorization: Bearer ${CDE_TOKEN}" \
      --data '{
        "name": "test-workload-job",
        "spark": {
            "className": "org.apache.spark.examples.SparkPi",
            "file": "spark-examples_2.11-2.4.8.7.2.12.0-195.jar"
        },
        "mounts": [
            {
                "resourceName": "spark-jar",
                "dirPrefix": ""
            }
        ],
        "type": "spark",
        "workloadCredentials": ["workload-cred-1", "workload-cred-2"]
      }'
    
  2. Use one or more workload credentials in an existing Spark job. For example, workload-cred-1 and workload-cred-2 are used in the job below (a verification example follows these steps):
    curl -k -X "PATCH" 'https://<dex-vc-host>/dex/api/v1/jobs/test-workload-job' \
      -H 'Accept: application/json' \
      -H 'Connection: keep-alive' \
      -H 'Content-Type: application/json' \
      -H "Authorization: Bearer ${CDE_TOKEN}" \
      --data '{
            "workloadCredentials": ["workload-cred-1", "workload-cred-2"]
    }'
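
To confirm that the credentials are linked, you can retrieve the job definition and inspect its workloadCredentials field. The following is a sketch that assumes the single-job GET endpoint at /dex/api/v1/jobs/<job-name> returns the workloadCredentials field as submitted:

    # Assumption: the returned job definition mirrors the fields set above.
    curl -k -X "GET" 'https://<dex-vc-host>/dex/api/v1/jobs/test-workload-job' \
      -H 'Accept: application/json' \
      -H "Authorization: Bearer ${CDE_TOKEN}" \
      | jq '.workloadCredentials'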