ReadyFlow: Kafka to Kafka
You can use the Kafka to Kafka ReadyFlow to move your data from one Kafka topic to another.
This ReadyFlow consumes data from a source Kafka topic and writes it to the specified destination topic. Failed Kafka write operations are retried automatically to handle transient issues. Define a KPI on the failure_WriteToKafka connection to monitor failed write operations.
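The retry-then-route behavior described above can be sketched as follows. This is an illustrative model, not the ReadyFlow's actual NiFi implementation: an in-memory list stands in for a Kafka topic, the retry budget is an assumption, and the `failure_connection` list plays the role of the failure_WriteToKafka connection.

```python
MAX_RETRIES = 3  # assumed retry budget, for illustration only

class FlakyTopic:
    """In-memory stand-in for a destination Kafka topic whose first
    few writes fail with a transient error."""
    def __init__(self, failures_before_success):
        self.failures_left = failures_before_success
        self.records = []

    def write(self, record):
        if self.failures_left > 0:
            self.failures_left -= 1
            raise ConnectionError("transient broker error")
        self.records.append(record)

def write_with_retry(topic, record, failure_connection, retries=MAX_RETRIES):
    """Retry transient write failures; once retries are exhausted,
    route the record to the failure connection for monitoring."""
    for _ in range(retries + 1):
        try:
            topic.write(record)
            return True
        except ConnectionError:
            continue
    failure_connection.append(record)
    return False

# Two transient failures: the write succeeds within the retry budget.
dest = FlakyTopic(failures_before_success=2)
failed = []
print(write_with_retry(dest, {"id": 1}, failed))  # True; record delivered

# Persistent failure: the record is routed to the failure connection,
# where a KPI can alert on it.
dest = FlakyTopic(failures_before_success=10)
failed = []
print(write_with_retry(dest, {"id": 2}, failed))  # False; record in `failed`
```

Monitoring the failure connection rather than silently dropping records is what makes the KPI useful: the queue depth directly measures writes that could not be delivered.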
Moving data between Kafka topics
You can use the Kafka to Kafka ReadyFlow to move your data between two Kafka topics, while applying a schema to the data in Cloudera DataFlow (CDF).
This use case walks you through the steps of deploying a Kafka to Kafka ReadyFlow to move data from one Kafka topic to another.
Your ReadyFlow can consume JSON, CSV, or Avro data from the source Kafka topic and write to the destination Kafka topic in any of these formats. The data flow parses the schema by looking up the schema name in the CDP Schema Registry.
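The schema lookup and parsing step can be sketched as below. This is a simplified model, not the CDP Schema Registry client: the `registry` dictionary, the `customer` schema name, and the field types are assumptions for the example, and Avro is omitted since decoding it requires a non-stdlib library.

```python
import csv
import io
import json

# Stand-in for the Schema Registry: schema name -> ordered field spec.
# In the real flow, the schema name is used to look up the schema in
# the CDP Schema Registry.
registry = {
    "customer": [("id", int), ("name", str)],
}

def parse_record(schema_name, payload, fmt):
    """Parse one JSON or CSV payload against the named schema,
    coercing each field to its declared type."""
    schema = registry[schema_name]  # lookup by schema name
    if fmt == "json":
        raw = json.loads(payload)
    elif fmt == "csv":
        values = next(csv.reader(io.StringIO(payload)))
        raw = {field: value for (field, _), value in zip(schema, values)}
    else:
        raise ValueError(f"unsupported format: {fmt}")
    return {field: ftype(raw[field]) for field, ftype in schema}

# JSON and CSV payloads parse to the same typed record.
print(parse_record("customer", '{"id": "7", "name": "Ada"}', "json"))
print(parse_record("customer", "7,Ada", "csv"))
# Both yield {'id': 7, 'name': 'Ada'}
```

Normalizing both input formats through one schema is what lets the flow read in one format and write in another without per-format downstream logic.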