Creating a NiFi Flow to Stream Events to HCP
This section provides instructions to create a flow to capture events from the new data source and push them into HCP.
Drag the first icon on the toolbar (the processor icon) to your workspace.
NiFi displays the Add Processor dialog box.
Select the TailFile type of processor and click Add.
NiFi displays a new TailFile processor.
Right-click the processor icon and select Configure to display the Configure Processor dialog box.
Add another processor by dragging the Processor icon to the main window.
Select the PutKafka type of processor and click Add.
Right-click the processor and select Configure.
In the Settings tab, change the name to Stream to Metron, and then select the relationship check boxes for failure and success. In the Properties tab, set the following three properties:
Known Brokers: $KAFKA_HOST:6667
Topic Name: $DATAPROCESSOR
Client Name: nifi-$DATAPROCESSOR
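The PutKafka properties above assume that the target Kafka topic already exists. If it does not, you can create it from the command line. This is a sketch only: the kafka-topics.sh path is the usual HDP location, and $ZOOKEEPER_HOST and the example value of $DATAPROCESSOR are assumptions; substitute values for your cluster.

```shell
# Create the Kafka topic that the Stream to Metron (PutKafka) processor
# will write to. Paths and host variables are illustrative.
DATAPROCESSOR=squid                        # example data source name (assumption)
ZOOKEEPER="$ZOOKEEPER_HOST:2181"           # assumed ZooKeeper quorum

/usr/hdp/current/kafka-broker/bin/kafka-topics.sh \
    --zookeeper "$ZOOKEEPER" \
    --create \
    --topic "$DATAPROCESSOR" \
    --partitions 1 \
    --replication-factor 1
```

The Client Name property then follows the nifi-$DATAPROCESSOR convention shown above.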
Create a connection by dragging the arrow from the TailFile (Ingest $DATAPROCESSOR Events) processor to the Stream to Metron processor.
NiFi displays a Create Connection dialog box.
Click Add to accept the default settings for the connection.
Press the Shift key and draw a box around both processors to select the entire flow, and then click the Start button (green arrow) in the Operate panel.
You should see all of the processor icons turn into green arrows.
Generate some data using the new data processor client.
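If your data source does not produce events on its own, you can simulate them by appending lines to the file that the TailFile processor is watching. The file path and the log line format below are assumptions for illustration; use whatever path you configured in the TailFile processor.

```shell
# Append a test event to the tailed file so TailFile picks it up.
# Path and message format are illustrative (a squid-style access log line).
TAILED_FILE=/var/log/squid/access.log      # assumption: file configured in TailFile

echo "$(date +%s) 161 127.0.0.1 TCP_MISS/200 103701 GET http://example.com/ - DIRECT/- text/html" \
    >> "$TAILED_FILE"
```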
You should see metrics on the processor of data being pushed into Metron.
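You can also confirm that events are reaching Kafka before checking Storm. A hedged sketch using the Kafka console consumer: the script path is the usual HDP location, and depending on your Kafka version the consumer may take --zookeeper rather than --bootstrap-server.

```shell
# Read a few messages from the topic the PutKafka processor writes to.
# Broker host and port follow the Known Brokers setting above; adjust as needed.
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh \
    --bootstrap-server "$KAFKA_HOST:6667" \
    --topic "$DATAPROCESSOR" \
    --from-beginning \
    --max-messages 10
```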
Look at the Storm UI for the parser topology and you should see tuples coming in.
After about five minutes, you should see a new Elasticsearch index called $DATAPROCESSOR_index* in the Elasticsearch Admin UI.
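As an alternative to the Admin UI, you can check for the new index from the command line. $ES_HOST is an assumed variable for your Elasticsearch master node; substitute the real hostname.

```shell
# List indices and filter for the one created by the parser topology.
curl -s "http://$ES_HOST:9200/_cat/indices?v" | grep "${DATAPROCESSOR}_index"
```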
For more information about creating a NiFi data flow, see the NiFi documentation.