Deploying the Confluent Cloud to S3/ADLS ReadyFlow

Learn how to use the Deployment wizard to deploy the Confluent Cloud to S3/ADLS ReadyFlow, using the information you collected in the prerequisites checklist.

The CDF Catalog is where you manage the flow definition lifecycle, from initial import, to versioning, to deploying a flow definition.

  1. In DataFlow, from the left navigation pane, click Catalog.
    Flow definitions available for you to deploy are displayed, one definition per row.
  2. Launch the Deployment wizard.
    1. Click the row of the flow definition you want to deploy to display its details and versions.
    2. Click a row representing a flow definition version to display flow definition version details and the Deploy button.
    3. Click Deploy to launch the Deployment wizard.
  3. Select the environment to which you want to deploy this version of your flow definition, and click Continue.
  4. In the Overview, give your flow deployment a unique name.

    You can use this name to distinguish between different versions of a flow definition, flow definitions deployed to different environments, and so on.

  5. In NiFi Configuration:
    1. Select a NiFi Runtime Version for your flow deployment. Cloudera recommends that you always use the latest available version, if possible.
    2. Autostart Behavior is selected by default, so your flow starts automatically after a successful deployment. Clear the selection if you do not want the flow to start automatically.
  6. In Parameters, specify parameter values such as connection strings and usernames, and upload files such as truststores.

    For parameters specific to this ReadyFlow, see the configuration parameters table (Table 1) below.

  7. Specify your Sizing & Scaling configurations.
    NiFi node sizing
      You can adjust the size of your cluster, from Extra Small to Large.
    Number of NiFi nodes
    • You can set whether you want your cluster to scale automatically depending on resource demands. When you enable autoscaling, the minimum number of NiFi nodes is used for the initial cluster size and the cluster scales up or down depending on resource demands.
    • You can set the number of nodes from 1 to 32.
  8. In Key Performance Indicators, you can set up your metrics system with specific KPIs to track the performance of a deployed flow. You can also define when and how you want to receive alerts about the tracked KPI metrics.

    See Working with KPIs for more information about the KPIs available and how you can monitor them.

  9. Review the summary of the information you provided in the Deployment wizard and make any necessary edits by clicking Previous. When you are finished, complete your flow deployment by clicking Deploy.

Once you click Deploy, you are redirected to the Alerts tab in the detail view of the deployment, where you can track its progress.

The following table lists the configuration parameters of the Confluent Cloud to S3/ADLS data flow. You collected this information in the Meeting the prerequisites step.

Table 1. Confluent Cloud to S3/ADLS ReadyFlow configuration parameters

CDP Workload User
  Specify the CDP machine user or workload username that you want to use to authenticate to the object stores. Ensure this user has the appropriate access rights to the object store locations in Ranger or IDBroker.

CDP Workload User Password
  Specify the password of the CDP machine user or workload user you are using to authenticate against the object stores (via IDBroker).

CSV Delimiter
  If your source data is CSV, specify the delimiter here.
Data Input Format
  Specify the format of your input data. You can use "CSV", "JSON" or "AVRO" with this ReadyFlow.
  Example values:
  • CSV
  • JSON
  • AVRO

Data Output Format
  Specify the desired format for your output data. You can use "CSV", "JSON" or "AVRO" with this ReadyFlow.
Destination S3 or ADLS Path
  Specify the name of the destination S3 or ADLS path you want to write to. Make sure that the path starts with "/".

Destination S3 or ADLS Storage Location
  Specify the name of the destination S3 bucket or ADLS container you want to write to.
  For S3, enter a value in the form: s3a://[Destination S3 Bucket]
  For ADLS, enter a value in the form: abfs://[Destination ADLS File System]@[Destination ADLS Storage Account].dfs.core.windows.net
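  For example, with a hypothetical S3 bucket named cdf-target-bucket, or a hypothetical ADLS file system named data in the storage account cdfstore (both names are illustrative, not values from this guide), the values would be:
  s3a://cdf-target-bucket
  abfs://data@cdfstore.dfs.core.windows.net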
Filter Rule
  Specify the filter rule expressed in SQL to filter streaming events for the destination object store. Records matching the filter are written to the destination. The default value forwards all records.
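  A minimal sketch of such a rule, assuming the flow evaluates it using Apache NiFi's QueryRecord SQL syntax, where incoming records are exposed as a table named FLOWFILE (the eventType field below is a hypothetical record field, not one defined by this ReadyFlow):
  -- forward all records (default behavior)
  SELECT * FROM FLOWFILE
  -- hypothetical variant: keep only purchase events
  SELECT * FROM FLOWFILE WHERE eventType = 'purchase'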

Kafka Broker Endpoint
  Specify the Kafka bootstrap server.
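  For Confluent Cloud, the bootstrap server typically takes the form pkc-xxxxx.<region>.<cloud provider>.confluent.cloud:9092 (the host shown here is a placeholder); copy the exact value from the cluster settings in the Confluent Cloud console.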

Kafka Client API Key
  Specify the API Key to connect to the Kafka cluster.

Kafka Client API Secret
  Specify the API Secret to connect to the Kafka cluster.

Kafka Consumer Group Id
  Specify the name of the consumer group used for the source topic you are consuming from.

Kafka Schema Name
  Specify the schema name to be looked up in the Confluent Schema Registry for the Kafka source topic.

Kafka Source Topic
  Specify the name of the topic that you want to read from.

Schema Registry Client Key
  Specify the API Key to connect to the Confluent Schema Registry.

Schema Registry Client Secret
  Specify the API Secret to connect to the Confluent Schema Registry.

Schema Registry Endpoint
  Specify the Schema Registry API endpoint.
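  For Confluent Cloud, the Schema Registry endpoint typically takes the form https://psrc-xxxxx.<region>.<cloud provider>.confluent.cloud (the host shown here is a placeholder); copy the exact URL from the Schema Registry settings in the Confluent Cloud console.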