When your flow is ready, you can begin ingesting data into Azure Data Lake Storage
folders. Learn how to start your ADLS ingest data flow.
To start your data flow, select all the data flow components you want to
start, then click the Start icon in the Actions
toolbar.
Alternatively, right-click a single component and choose
Start from the context menu. Data should now be read from Kafka and
written to your predefined ADLS folder.
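Besides the UI, components can also be started programmatically through the NiFi REST API by setting a process group's state to RUNNING. The sketch below builds such a request; the host URL and process-group ID are hypothetical placeholders for your environment, and authentication (e.g. a bearer token) is omitted.

```python
"""Minimal sketch: start a NiFi process group via the NiFi REST API.

The base URL and process-group ID below are placeholders, not values
from this guide; adapt them to your cluster.
"""
import json
import urllib.request


def build_start_request(base_url: str, pg_id: str) -> urllib.request.Request:
    """Build the PUT request that sets a process group's state to RUNNING."""
    payload = json.dumps({"id": pg_id, "state": "RUNNING"}).encode()
    return urllib.request.Request(
        url=f"{base_url}/nifi-api/flow/process-groups/{pg_id}",
        data=payload,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )


if __name__ == "__main__":
    # Hypothetical endpoint and process-group ID; replace with your own.
    req = build_start_request("https://nifi.example.com:8443", "root")
    print(req.get_method(), req.full_url)
```

Sending the request (for example with `urllib.request.urlopen`) starts every component inside the process group, which mirrors selecting all components and clicking Start in the UI.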
Depending on how you built your flow, this example data flow uses either the PutHDFS
processor or the PutAzureDataLakeStorage processor to write to ADLS.
It is useful to verify that data is flowing through the flow you have
created.
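One way to verify this without watching the canvas is to inspect the process group's status, which NiFi exposes via the REST API. The helper below is a sketch that assumes a status response shaped like NiFi's process-group status DTO (with an `aggregateSnapshot` containing `flowFilesOut`); field names should be confirmed against your NiFi version.

```python
"""Sketch: check whether a NiFi process group has emitted any FlowFiles.

Assumes the JSON returned by GET /nifi-api/flow/process-groups/{id}/status,
whose aggregate snapshot includes a flowFilesOut counter.
"""


def flow_has_output(status: dict) -> bool:
    """Return True if the process group has emitted at least one FlowFile."""
    snapshot = status["processGroupStatus"]["aggregateSnapshot"]
    return snapshot.get("flowFilesOut", 0) > 0


if __name__ == "__main__":
    # Example status fragment, as it might come back from the API.
    sample = {"processGroupStatus": {"aggregateSnapshot": {"flowFilesOut": 12}}}
    print(flow_has_output(sample))
```

A nonzero `flowFilesOut` indicates that data has left the group, i.e. records read from Kafka have been handed off toward your ADLS folder.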