Deploying the Azure Event Hub to ADLS ReadyFlow

Learn how to use the Deployment wizard to deploy the Azure Event Hub to ADLS ReadyFlow using the information you collected in the prerequisites checklist.

The CDF Catalog is where you manage the flow definition lifecycle, from initial import, to versioning, to deploying a flow definition.

  1. In DataFlow, from the left navigation pane, click Catalog.
    Flow definitions available for you to deploy are displayed, one definition per row.
  2. Launch the Deployment wizard.
    1. Click the row to display the flow definition details and versions.
    2. Click a row representing a flow definition version to display flow definition version details and the Deploy button.
    3. Click Deploy to launch the Deployment wizard.
  3. Select the environment to which you want to deploy this version of your flow definition, and click Continue.
  4. In the Overview, give your flow deployment a unique name.

    You can use this name to distinguish between different versions of a flow definition, flow definitions deployed to different environments, and so on.

  5. In NiFi Configuration:
    1. Select a NiFi Runtime Version for your flow deployment. Cloudera recommends that you always use the latest available version, if possible.
    2. Autostart Behavior is on by default, so your flow starts automatically after a successful deployment. Clear the selection if you do not want the flow to start automatically.
  6. In Parameters, specify parameter values such as connection strings and usernames, and upload files such as truststores.

    For parameters specific to this ReadyFlow, see the configuration parameters table below.

  7. Specify your Sizing & Scaling configurations.
    NiFi node sizing
    • You can adjust the size of your cluster from Extra Small to Large.
    Number of NiFi nodes
    • You can set whether you want the cluster to scale automatically depending on resource demands. When you enable autoscaling, the minimum number of NiFi nodes is used as the initial size and the workload scales up or down depending on resource demands.
    • You can set the number of nodes from 1 to 32.
  8. In Key Performance Indicators, you can set up your metrics system with specific KPIs to track the performance of a deployed flow. You can also define when and how you want to receive alerts about your KPI tracking.

    See Working with KPIs for more information about the KPIs available and how you can monitor them.

  9. Review the summary of the information you provided in the Deployment wizard and make any necessary edits by clicking Previous. When you are finished, complete your flow deployment by clicking Deploy.

Once you click Deploy, you are redirected to the Alerts tab in the Flow Deployment Detail view, where you can track the progress of your deployment.

For the Azure Event Hub to ADLS ReadyFlow, the following parameters are required. Use the information you collected in the Meeting the prerequisites section.

Azure Event Hub to ADLS ReadyFlow configuration parameters
ADLS File System
  Specify the name of the ADLS data container you want to write to. The full path will be constructed from: abfs://#{ADLS File System}@#{ADLS Storage Account}.dfs.core.windows.net/#{ADLS Path}/${Kafka.topic}
ADLS Path
  Specify the path within the ADLS data container where you want to write to, without any leading characters. The full path will be constructed from: abfs://#{ADLS File System}@#{ADLS Storage Account}.dfs.core.windows.net/#{ADLS Path}/${Kafka.topic}
ADLS Storage Account
  Specify the storage account name you want to write to. The full ADLS data container path will be constructed from: abfs://#{ADLS File System}@#{ADLS Storage Account}.dfs.core.windows.net/#{ADLS Path}/${Kafka.topic}
CDP Workload User
  Specify the CDP machine user or workload username that you want to use to authenticate to Kafka and the object store. Ensure this user has the appropriate access rights in Ranger for the Kafka topic and in Ranger or IDBroker for object store access.
CDP Workload User Password
  Specify the password of the CDP machine user or workload user you are using to authenticate against Kafka and the object store.
CSV Delimiter
  If your source data is CSV, specify the delimiter here.
Data Output Format
  Specify the desired format for your output data. With this ReadyFlow, you can select from CSV, JSON, and AVRO.
Event Hub Access Policy Name
  Specify the Access Policy Name that this flow should use. The full path for the event hub endpoint will be constructed from: sb://#{Event Hub Namespace}.#{Event Hub Service Bus Endpoint}/;SharedAccessKeyName=#{Event Hub Access Policy Name};SharedAccessKey=#{Event Hub Access Primary Key}
Event Hub Access Primary Key
  Specify the Primary Key that allows clients to use the Access Policy that you provided earlier. The full path for the event hub endpoint will be constructed from: sb://#{Event Hub Namespace}.#{Event Hub Service Bus Endpoint}/;SharedAccessKeyName=#{Event Hub Access Policy Name};SharedAccessKey=#{Event Hub Access Primary Key}
Event Hub Consumer Group
  Specify the Event Hub Consumer Group you want to use with this flow. Any consumer group other than $Default needs to be created in Event Hub first.
Event Hub Instance Name
  Specify the Event Hub Instance Name inside the Event Hub Namespace you want to use.
Event Hub Namespace
  Specify the Event Hub Namespace which contains the Event Hub instance you want to use. The full path for the event hub endpoint will be constructed from: sb://#{Event Hub Namespace}.#{Event Hub Service Bus Endpoint}/;SharedAccessKeyName=#{Event Hub Access Policy Name};SharedAccessKey=#{Event Hub Access Primary Key}
Event Hub Partitions Count
  Specify the number of partitions that the Event Hub has. Only this number of partitions is consumed, so if the number of partitions changes, update this value accordingly; otherwise, some messages may not be consumed.
Event Hub Service Bus Endpoint
  Specify the Event Hub Service Bus Endpoint. The default value is .servicebus.windows.net. The full path for the event hub endpoint will be constructed from: sb://#{Event Hub Namespace}.#{Event Hub Service Bus Endpoint}/;SharedAccessKeyName=#{Event Hub Access Policy Name};SharedAccessKey=#{Event Hub Access Primary Key}
Filter Rule
  Specify the filter rule, expressed in SQL, to filter streaming events for the destination object store. Records matching the filter are written to the destination object store. The default value forwards all records.
Schema Text
  Specify the Avro-formatted schema to be used for the source event hub data.
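As a quick illustration, the way the ADLS and Event Hub parameter values combine into the destination path and the event hub endpoint can be sketched as follows. All sample values (storage account, namespace, access policy, key) are hypothetical placeholders, and the actual substitution is performed by the deployed flow, not by you.

```python
# Illustrative sketch only: shows how the ReadyFlow's parameter templates
# resolve. All values below are hypothetical placeholders.
params = {
    "ADLS File System": "data",
    "ADLS Storage Account": "mystorageaccount",
    "ADLS Path": "events",
    "Event Hub Namespace": "my-eventhub-ns",
    # The default Service Bus Endpoint value already includes the leading dot.
    "Event Hub Service Bus Endpoint": ".servicebus.windows.net",
    "Event Hub Access Policy Name": "my-access-policy",
    "Event Hub Access Primary Key": "<primary-key>",
}

# abfs://#{ADLS File System}@#{ADLS Storage Account}.dfs.core.windows.net/#{ADLS Path}/${Kafka.topic}
# ${Kafka.topic} is resolved per record at runtime, so it is kept literal here.
adls_path = (
    f"abfs://{params['ADLS File System']}@{params['ADLS Storage Account']}"
    f".dfs.core.windows.net/{params['ADLS Path']}/" + "${Kafka.topic}"
)

# sb://#{Event Hub Namespace}#{Event Hub Service Bus Endpoint}/;SharedAccessKeyName=...;SharedAccessKey=...
endpoint = (
    f"sb://{params['Event Hub Namespace']}{params['Event Hub Service Bus Endpoint']}/"
    f";SharedAccessKeyName={params['Event Hub Access Policy Name']}"
    f";SharedAccessKey={params['Event Hub Access Primary Key']}"
)

print(adls_path)
print(endpoint)
```

With these placeholder values, the flow would write to abfs://data@mystorageaccount.dfs.core.windows.net/events/ and connect to the my-eventhub-ns.servicebus.windows.net endpoint.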