ReadyFlow: Kafka to S3 Avro
You can use the Kafka to S3 Avro ReadyFlow to move your data from a Kafka topic to an Amazon S3 bucket.
This ReadyFlow consumes JSON, CSV, or Avro data from a source Kafka topic and merges the events into Avro files before writing the data to S3. A file is written whenever the accumulated data reaches 100 MB or five minutes have passed, whichever occurs first; no single file exceeds 1 GB. You can specify the topic you want to read from as well as the target S3 bucket and path. Failed S3 write operations are retried automatically to handle transient issues. Define a KPI on the failure_WriteToS3 connection to monitor failed write operations.
| ReadyFlow details | |
| --- | --- |
| Source | Kafka topic |
| Source Format | JSON, CSV, Avro |
| Destination | Amazon S3 |
| Destination Format | Avro |
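The ReadyFlow itself is a managed NiFi flow, but the size-or-time flush behavior described above is easy to illustrate in code. The following is a minimal sketch, assuming the `confluent-kafka`, `fastavro`, and `boto3` libraries; the topic name, bucket, prefix, record schema, and broker address are all placeholders, not values from the ReadyFlow.

```python
import io
import time

import boto3                                  # assumed S3 client library
from confluent_kafka import Consumer         # assumed Kafka client library
from fastavro import parse_schema, writer    # assumed Avro serializer

# Hypothetical names for illustration only.
TOPIC = "source-topic"
BUCKET = "target-bucket"
PREFIX = "kafka-to-s3-avro/"
FLUSH_BYTES = 100 * 1024 * 1024   # flush once ~100 MB has accumulated...
FLUSH_SECONDS = 300               # ...or once five minutes have passed
MAX_BYTES = 1024 ** 3             # hard cap: never exceed 1 GB per file

# Toy schema standing in for the real record schema.
SCHEMA = parse_schema({
    "type": "record", "name": "Event",
    "fields": [{"name": "payload", "type": "bytes"}],
})

s3 = boto3.client("s3")
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "kafka-to-s3-avro",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])

records, batch_bytes, batch_start = [], 0, time.monotonic()

def flush():
    """Merge buffered events into one Avro file and upload it,
    retrying a few times to ride out transient S3 failures."""
    global records, batch_bytes, batch_start
    if records:
        buf = io.BytesIO()
        writer(buf, SCHEMA, records)
        key = f"{PREFIX}{int(time.time())}.avro"
        for attempt in range(5):
            try:
                s3.put_object(Bucket=BUCKET, Key=key, Body=buf.getvalue())
                break
            except Exception:
                time.sleep(2 ** attempt)   # exponential backoff between retries
    records, batch_bytes, batch_start = [], 0, time.monotonic()

while True:
    msg = consumer.poll(timeout=1.0)
    if msg is not None and msg.error() is None:
        records.append({"payload": msg.value()})
        batch_bytes += len(msg.value())
    elapsed = time.monotonic() - batch_start
    if batch_bytes >= FLUSH_BYTES or elapsed >= FLUSH_SECONDS or batch_bytes >= MAX_BYTES:
        flush()
```

In the actual ReadyFlow, monitoring of failed writes is handled by defining a KPI on the failure_WriteToS3 connection rather than in application code; the retry loop here only stands in for that behavior.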
Moving data to object stores
Cloud environments offer many ways to store data, but the easiest option is to use object stores. Object stores are robust, cost-effective storage solutions with multiple levels of durability and availability. You can include them in your data pipeline, both as an intermediate step and as an end state. Object stores are accessible to many tools and connecting systems, and you have a variety of options to control access.