Developing a dataflow for Stateless NiFi

Learn about the recommended process of building a dataflow that you can deploy with the Stateless NiFi Sink or Source connectors. This process involves designing and parameterizing a dataflow within a Process Group, and then downloading the dataflow as a flow definition.

The general steps for building a dataflow are identical for source and sink connector flows. For ease of understanding, example dataflows for a simple MQTT source connector and a simple MQTT sink connector are provided without going into detail (which processors to use, which parameters and properties to set, which relationships to define, and so on).

  1. Access the NiFi UI.
  2. Create a new Process Group.
  3. Right-click the Process Group and select Configure.
  4. Create a new Parameter Context and assign it to the Process Group:
    1. Select Process Group Parameter Context > Create new parameter context...
    2. Enter a name for the Parameter Context.
    3. Click Apply.
  5. Click Apply and then click OK.
  6. Close the configuration dialog.
  7. Double-click the Process Group you created.
  8. Design and parameterize your dataflow.
    For example, you can build a dataflow that you deploy as a source connector. A dataflow for a simple MQTT source connector consists of a ConsumeMQTT Processor connected to an output port, where the output port represents a Kafka topic.

    Alternatively, you can build a dataflow to deploy as a sink connector. A dataflow for a simple MQTT sink connector consists of an input port (representing a Kafka topic), a PublishMQTT Processor, and two output ports.

    In this example, a Failure output port is required so that if the MQTT destination is not available, the session can be rolled back and the Kafka message can be declined. You specify the Failure port in the configuration of the connector when deploying it through SMM (a configuration sketch is provided after this procedure).

    When designing and building your dataflow, ensure that you update your Parameter Context and reference the parameters in your components. For example, a parameter named MQTT Broker URI is referenced in a processor property value as #{MQTT Broker URI}.

    For more information on building dataflows, defining parameters, and referencing parameters, see the Flow Management library.

  9. Exit the Process Group.
  10. Right-click the Process Group and select Download flow definition.
The dataflow is downloaded as a flow definition (JSON file).
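The following is a minimal sketch of how the downloaded flow definition and the Failure port might appear in the configuration of a Stateless NiFi Sink connector, expressed here as a Python dictionary of connector properties. The property names (flow.snapshot, input.port, failure.ports, topics), the connector class name, and all values are assumptions for illustration only; confirm the exact property names in the connector reference when you deploy.

```python
# Hypothetical Stateless NiFi Sink connector configuration for the MQTT sink
# example. All property names and values are illustrative assumptions.
mqtt_sink_config = {
    "connector.class": "org.apache.nifi.kafka.connect.StatelessNiFiSinkConnector",  # assumed class name
    "tasks.max": "1",
    "topics": "my-kafka-topic",                       # Kafka topic the sink connector consumes from
    "flow.snapshot": "/path/to/mqtt-sink-flow.json",  # the flow definition downloaded in the previous step
    "input.port": "Input from Kafka",                 # name of the input port in the Process Group
    "failure.ports": "Failure",                       # output port whose FlowFiles roll back the session
}
```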
You can now deploy the dataflow as a Kafka Connect connector using the Stateless NiFi Source or Stateless NiFi Sink connector. Continue with Deploying a dataflow using stateless NiFi.
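As a rough illustration of where the flow definition, output port, Kafka topic, and parameter values end up, the following sketch submits a hypothetical MQTT source connector configuration to the Kafka Connect REST API. The endpoint, connector class name, property names (flow.snapshot, output.port, topic.name, parameter.<name>), and all values are assumptions; whether you deploy through SMM or directly against Kafka Connect, the configuration carries the same information. See Deploying a dataflow using stateless NiFi for the authoritative steps.

```python
import json
import requests  # third-party HTTP client, assumed to be installed

# Hypothetical values throughout: Connect endpoint, connector class name,
# property names, parameter name, and topic are for illustration only.
with open("/path/to/mqtt-source-flow.json") as flow_file:
    flow_snapshot = flow_file.read()

source_config = {
    "connector.class": "org.apache.nifi.kafka.connect.StatelessNiFiSourceConnector",
    "tasks.max": "1",
    "flow.snapshot": flow_snapshot,                         # inline contents of the downloaded flow definition
    "output.port": "Output to Kafka",                       # output port of the Process Group feeding Kafka
    "topic.name": "my-kafka-topic",                         # Kafka topic the source connector writes to
    "parameter.MQTT Broker URI": "tcp://mqtt-broker:1883",  # overrides a parameter from the Parameter Context
}

# Register the connector with a Kafka Connect worker.
response = requests.post(
    "http://localhost:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps({"name": "mqtt-source", "config": source_config}),
)
response.raise_for_status()
```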