Learn about how to create Airflow jobs using Cloudera Data Engineering
(CDE).
An Airflow job in Cloudera Data Engineering consists of an Airflow DAG file and
various optional resources. Jobs can be run on demand or scheduled.
In the Cloudera Data Platform (CDP) management console, click the
Data Engineering tile, and then click
Overview.
In the CDE Services column, select the service that
contains the virtual cluster that you want to create a job for.
In the Virtual Clusters column, locate the virtual
cluster that you want to use and click the View Jobs
icon.
In the left navigation menu, click Jobs.
Click the Create Job button.
Provide the Job Details:
Select Airflow for the job type. The available
fields on the user interface update automatically.
Specify the Name.
Click the File option and select the way you
want to provide the DAG file. Using the File
option, you can do the following:
Upload the DAG file to a new resource.
Use the DAG file from a previously created resource.
Upload any other resources required for the job.
For more information about the Editor, see Creating an Airflow DAG using the
Pipeline UI.
If you do not want to run the job immediately, click the Create and
Run drop-down menu and then click Create .
Otherwise, click Create and Run to run the job
immediately.
note
If you select the Create and Run
option from the Create and Run
drop-down menu, the DAG must not be paused when it is created: set the
is_paused_upon_creation argument to
False in the DAG file.
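A minimal sketch of a DAG file that satisfies this requirement is shown below. The DAG ID, schedule, and task are placeholder values for illustration, not names from the product documentation; only the is_paused_upon_creation setting is required by the note above.

```python
# Minimal example DAG file for a CDE Airflow job (placeholder names).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_cde_dag",            # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    # Required when using the Create and Run option:
    # the DAG must start in an unpaused state.
    is_paused_upon_creation=False,
) as dag:
    # A trivial placeholder task.
    hello = BashOperator(
        task_id="say_hello",
        bash_command="echo 'hello from CDE'",
    )
```

Upload a file like this as the DAG file for the job, either to a new resource or from a previously created one.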