List of required configuration parameters for the Kafka to Cloudera Operational Database ReadyFlow

When deploying the Kafka to Cloudera Operational Database ReadyFlow, you have to provide the following parameters. Use the information you collected in Prerequisites.

Table 1. Kafka to Cloudera Operational Database ReadyFlow configuration parameters
Parameter Name: Description
CDP Workload User: Specify the Cloudera machine user or workload user name that you want to use to authenticate to Kafka. Ensure this user has the appropriate access rights in Ranger for the source Kafka topic and the target table in Cloudera Operational Database.
CDP Workload User Password: Specify the password of the Cloudera machine user or workload user that you want to use to authenticate to Kafka.
CDPEnvironment: Use this parameter to upload the hbase-site.xml file of your target HBase cluster. Cloudera DataFlow also uses this parameter to auto-populate the Flow Deployment with additional Hadoop configuration files required to interact with HBase.
COD Column Family Name: Specify the column family to use when inserting data into Cloudera Operational Database.
COD Row Identifier Field Name: Specify the name of the record field whose value should be used as the row ID for the given record.
COD Table Name: Specify the target table name in Cloudera Operational Database.
CSV Delimiter: If your source data is CSV, specify the delimiter here.
Data Input Format: Specify the format of your source data. You can use CSV, JSON, or AVRO with this ReadyFlow.
Kafka Broker Endpoint: Specify the Kafka bootstrap servers string as a comma-separated list.
Kafka Consumer Group ID: Specify the ID of the consumer group used for the source topic you are consuming from.
Kafka Source Topic: Specify the name of the topic that you want to read from.
Schema Name: Specify the schema name to be looked up in the Schema Registry.
Schema Registry Hostname: Specify the hostname of the Schema Registry you want to connect to. This must be the direct hostname of the Schema Registry itself, not the Knox endpoint.
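
The following sketch gathers one hypothetical set of values for these parameters into a Python dictionary so you can review them side by side before entering them in the deployment wizard. Every hostname, topic, table, and credential shown is a placeholder, not a value from this documentation; substitute the details you collected in Prerequisites.

```python
# Illustrative only: example values for the Kafka to Cloudera Operational Database
# ReadyFlow parameters from Table 1. All values are placeholders.
import json

readyflow_parameters = {
    "CDP Workload User": "srv_kafka_to_cod",            # placeholder machine user
    "CDP Workload User Password": "********",           # never hard-code real credentials
    "CDPEnvironment": "hbase-site.xml",                 # file uploaded for the target HBase cluster
    "COD Column Family Name": "cf1",                    # placeholder column family
    "COD Row Identifier Field Name": "customer_id",     # record field whose value becomes the row ID
    "COD Table Name": "customers",                      # placeholder target table
    "CSV Delimiter": ",",                               # only relevant when the source data is CSV
    "Data Input Format": "JSON",                        # one of CSV, JSON, or AVRO
    "Kafka Broker Endpoint": "broker1.example.com:9093,broker2.example.com:9093",
    "Kafka Consumer Group ID": "kafka-to-cod-readyflow",
    "Kafka Source Topic": "customer-events",
    "Schema Name": "customer-events",
    "Schema Registry Hostname": "schemaregistry1.example.com",
}

# Print the parameters as JSON for a final review before deployment.
print(json.dumps(readyflow_parameters, indent=2))
```

This is only a checklist aid; how Cloudera DataFlow actually consumes the values (through the deployment wizard or the CDP CLI) is unchanged by the sketch.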