If your cluster has Schema Registry, NiFi, and Kafka, you can use NiFi processors to
integrate Schema Registry with Kafka. First integrate NiFi with Schema Registry, build the
NiFi dataflow, and then add and configure the necessary Kafka processors.
- Integrate NiFi with Schema Registry.
- Build your NiFi dataflow.
- At the point in your dataflow where you want to consume from or publish to a Kafka topic, add one of the following processors (the first sketch after this list shows the same step performed through the NiFi REST API):
  - ConsumeKafkaRecord_0_10
  - PublishKafkaRecord_0_10
- Configure your Kafka processor with the following information (see the second sketch after this list):
  - Kafka Brokers – Provide a comma-separated list of the Kafka brokers you want to use in your dataflow.
  - Topic Name – Provide the name of the Kafka topic to which you want to publish or from which you want to consume data.
  - Record Reader – Provide the Controller Service you want to use to read incoming FlowFile records.
  - Record Writer – Provide the Controller Service you want to use to serialize record data before sending it to Kafka.