Integrating Kafka and Schema Registry Using NiFi Processors
If you are using an Ambari-managed HDF cluster with Schema Registry, NiFi, and Kafka installed, you can use NiFi Processors to integrate Schema Registry with Kafka.
- Integrate NiFi with Schema Registry (a sketch of the Schema Registry Controller Service appears after these steps).
- Build your NiFi dataflow.
- At the point in your dataflow where you want to either consume from a Kafka topic or publish to a Kafka topic, add one of the following two Processors:
  - ConsumeKafkaRecord_0_10
  - PublishKafkaRecord_0_10
- Configure your Kafka Processor with the following information (an example configuration follows these steps):
  - Kafka Brokers – Provide a comma-separated list of the Kafka Brokers you want to use in your dataflow.
  - Topic Name – Provide the name of the Kafka topic to which you want to publish or from which you want to consume data.
  - Record Reader – Provide the Controller Service you want to use to read incoming FlowFile records.
  - Record Writer – Provide the Controller Service you want to use to serialize record data before sending it to Kafka.
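
For reference, the first step usually comes down to adding a Schema Registry Controller Service in NiFi that your Record Readers and Record Writers can point to. The sketch below is minimal and illustrative: the HortonworksSchemaRegistry service and its URL property are shown as they commonly appear in NiFi, but the host name and port are assumptions for a typical HDF install rather than values taken from this procedure.

    Controller Service  : HortonworksSchemaRegistry
    Schema Registry URL : http://registry-host:7788/api/v1    (assumed host; 7788 is a common HDF Schema Registry port)

The Record Reader and Record Writer Controller Services configured in the last step can then reference this service so that record schemas are looked up in Schema Registry.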
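
Similarly, a PublishKafkaRecord_0_10 processor set up for this flow might look like the sketch below. Only the four property names come from the steps above; the broker hosts, port, topic name, and Controller Service choices are illustrative assumptions.

    Kafka Brokers : kafka-host1:6667,kafka-host2:6667    (assumed hosts; 6667 is a common broker port on Ambari-managed clusters)
    Topic Name    : truck-events                          (hypothetical topic)
    Record Reader : AvroReader                            (Controller Service that reads the incoming FlowFile records)
    Record Writer : AvroRecordSetWriter                   (Controller Service that serializes the records sent to Kafka)

A ConsumeKafkaRecord_0_10 processor takes the same four kinds of values; there, the Record Reader deserializes the messages read from the topic and the Record Writer serializes the resulting FlowFile records.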