Start the data flow

When your flow is ready, you can begin ingesting data into your Azure Data Lake Storage folders. Learn how to start the ADLS ingest data flow.

  1. Select all the data flow components you want to start.
  2. Click the Start icon in the Actions toolbar.
    Alternatively, right-click a single component and select Start from the context menu. Once the flow is running, data is read from Kafka and written to your predefined ADLS folder.
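The steps above can also be performed programmatically through the NiFi REST API. The sketch below builds the request that starts every component in a process group, equivalent to selecting all components and clicking Start; the NiFi URL and process group ID are placeholders you would replace with your own values.

```python
import json

# Hypothetical values: replace with your NiFi host and the ID of the
# process group that contains your ADLS ingest flow.
NIFI_URL = "https://nifi.example.com:8443"
PROCESS_GROUP_ID = "root"

def build_start_request(nifi_url, pg_id):
    """Build the NiFi REST API request that starts all components in a
    process group (PUT /nifi-api/flow/process-groups/{id})."""
    return {
        "method": "PUT",
        "url": f"{nifi_url}/nifi-api/flow/process-groups/{pg_id}",
        "headers": {"Content-Type": "application/json"},
        # state RUNNING starts the components; STOPPED would stop them
        "body": json.dumps({"id": pg_id, "state": "RUNNING"}),
    }

request = build_start_request(NIFI_URL, PROCESS_GROUP_ID)
# Send it with any HTTP client, for example:
#   requests.put(request["url"], headers=request["headers"], data=request["body"])
```

If your NiFi instance is secured, the request must also carry authentication (for example a bearer token obtained from the access endpoint).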

This example data flow was created using the PutHDFS processor to write to ADLS.
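PutHDFS reaches ADLS through the Hadoop ABFS connector, which it picks up from the files referenced in its Hadoop Configuration Resources property. A minimal `core-site.xml` fragment might look like the following; the file system, account name, and key are placeholders for your own values.

```xml
<!-- Hypothetical core-site.xml entries referenced by PutHDFS via its
     Hadoop Configuration Resources property. Replace FILESYSTEM,
     ACCOUNTNAME, and the key with your own values. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>abfs://FILESYSTEM@ACCOUNTNAME.dfs.core.windows.net</value>
  </property>
  <property>
    <name>fs.azure.account.key.ACCOUNTNAME.dfs.core.windows.net</name>
    <value>YOUR_STORAGE_ACCOUNT_KEY</value>
  </property>
</configuration>
```

Storage account keys in plain configuration files are shown here only for brevity; in production, prefer a managed identity or a credential provider.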

It is useful to verify that data is moving through the flow you have created without any errors.
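One way to verify the flow end to end is to confirm that files are landing in the target ADLS folder, for example with the Azure CLI. The file system, account, and path below are placeholders for your own values.

```shell
# Hypothetical names: replace FILESYSTEM, ACCOUNTNAME, and ingest-folder
# with your ADLS Gen2 file system (container), storage account, and
# the target folder of your data flow.
az storage fs file list \
    --file-system FILESYSTEM \
    --account-name ACCOUNTNAME \
    --path ingest-folder \
    --output table
```

If new files appear here and no bulletins are shown on the processors in the NiFi canvas, the flow is running cleanly.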