Automating data pipelines using Apache Airflow in Cloudera Data Engineering

Cloudera Data Engineering (CDE) enables you to automate a workflow or data pipeline using Apache Airflow Python DAG files. Each CDE virtual cluster includes an embedded instance of Apache Airflow. You can also use CDE with your own Airflow deployment. CDE on CDP Private Cloud currently supports only the CDE job run operator.

The following instructions are for using the Airflow service provided with each CDE virtual cluster. For instructions on using your own Airflow deployment, see Using the Cloudera provider for Apache Airflow.

  1. Create an Airflow DAG file in Python. Import the CDE operator and define the tasks and dependencies.
    Here is a complete DAG file:
    from datetime import datetime, timedelta, timezone
    from airflow import DAG
    from cloudera.airflow.providers.operators.cde import CdeRunJobOperator
    
    
    default_args = {
        'owner': 'psherman',
        'retry_delay': timedelta(seconds=5),
        'depends_on_past': False,
        'start_date': datetime(2024, 2, 10, tzinfo=timezone.utc),
    }
    
    example_dag = DAG(
        'airflow-pipeline-demo',
        default_args=default_args, 
        schedule_interval='@daily', 
        catchup=False, 
        is_paused_upon_creation=False
    )
    
    ingest_step1 = CdeRunJobOperator(
        connection_id='cde-vc01-dev',
        task_id='ingest',
        retries=3,
        dag=example_dag,
        job_name='etl-ingest-job'
    )
    
    prep_step2 = CdeRunJobOperator(
        task_id='data_prep',
        dag=example_dag,
        job_name='insurance-claims-job'
    )
    
    
    ingest_step1 >> prep_step2
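
    The same pipeline can also be written with Airflow's with DAG(...) context manager, in which case tasks defined inside the block are attached to the DAG automatically and do not need an explicit dag argument. The following is a minimal sketch of that style, using the same connection and job names as above:
    from datetime import datetime, timedelta, timezone
    from airflow import DAG
    from cloudera.airflow.providers.operators.cde import CdeRunJobOperator

    with DAG(
        'airflow-pipeline-demo',
        default_args={
            'owner': 'psherman',
            'retry_delay': timedelta(seconds=5),
            'depends_on_past': False,
            'start_date': datetime(2024, 2, 10, tzinfo=timezone.utc),
        },
        schedule_interval='@daily',
        catchup=False,
        is_paused_upon_creation=False
    ) as dag:
        # Tasks instantiated inside the context manager are added to this DAG
        # automatically, so no dag argument is required.
        ingest_step1 = CdeRunJobOperator(
            connection_id='cde-vc01-dev',
            task_id='ingest',
            retries=3,
            job_name='etl-ingest-job'
        )
        prep_step2 = CdeRunJobOperator(
            task_id='data_prep',
            job_name='insurance-claims-job'
        )
        ingest_step1 >> prep_step2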
    

    Here are some examples of things you can define in the DAG file:

    CDE job run operator
    Use CdeRunJobOperator to specify a CDE job to run. This job must already exist in the virtual cluster specified by the connection_id. If no connection_id is specified, CDE looks for the job in the virtual cluster where the Airflow job runs.
    from cloudera.airflow.providers.operators.cde import CdeRunJobOperator
    ...
    ingest_step1 = CdeRunJobOperator(
        connection_id='cde-vc01-dev',
        task_id='ingest',
        retries=3,
        dag=example_dag,
        job_name='etl-ingest-job'
    )
    
    Email Alerts
    Add the following parameters to the DAG default_args to send email alerts for job failures, missed service-level agreements (SLAs), or both.
    'email_on_failure': True,
    'email': 'abc@example.com',
    'email_on_retry': True,
    'sla': timedelta(seconds=30)                
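    For example, the default_args dictionary from the DAG file above could be extended with these alerting parameters (the email address and SLA value here are placeholders):
    from datetime import datetime, timedelta, timezone

    default_args = {
        'owner': 'psherman',
        'retry_delay': timedelta(seconds=5),
        'depends_on_past': False,
        'start_date': datetime(2024, 2, 10, tzinfo=timezone.utc),
        # Alerting: notify by email when a task fails or is retried, and flag
        # any task instance that misses the 30-second SLA.
        'email_on_failure': True,
        'email': 'abc@example.com',
        'email_on_retry': True,
        'sla': timedelta(seconds=30)
    }
    Email delivery also depends on the SMTP configuration available to the Airflow instance.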
    Task dependencies
    After you have defined the tasks, specify the dependencies as follows:
    ingest_step1 >> prep_step2
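    Dependencies can also be expressed with lists to fan out from one task to several. For example, assuming a hypothetical third task named validate_step3 defined with CdeRunJobOperator in the same DAG, the following makes both downstream tasks wait for the ingest task:
    # Hypothetical fan-out: prep_step2 and validate_step3 both run after ingest_step1.
    ingest_step1 >> [prep_step2, validate_step3]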

    For more information on task dependencies, see Task Dependencies in the Apache Airflow documentation.

    For a tutorial on creating Apache Airflow DAG files, see the Apache Airflow documentation.
  2. Create a CDE job.
    1. In the Cloudera Data Platform (CDP) console, click the Data Engineering tile. The CDE Home page displays.
    2. In the left navigation menu, click Jobs. The Jobs page is displayed.
    3. Click Create Job. The Job Details page is displayed.
    4. Select the Airflow job type.
    5. Name: Provide a name for the job.
    6. DAG File: Select an existing DAG file, upload a DAG file to an existing resource, or create a new resource and upload the DAG file to it.
      1. Select from Resource: Click Select from Resource to select a DAG file from an existing resource.
      2. Upload: Click Upload to upload a DAG file to an existing resource or to a new resource that you can create by selecting Create a resource from the Select a Resource dropdown list. Specify the resource name and upload the DAG file to it.
  3. Click Create and Run to create the job and run it immediately, or click the dropdown button and select Create to create the job without running it.