Adding Kafka as Data Provider

To create Kafka tables in SQL Stream Builder (SSB), you need to register Kafka as a Data Provider using the Streaming SQL Console.

  • Make sure that you have a Kafka service running on your cluster.
  • Make sure that you have the required permissions set in Ranger.
  1. Navigate to the Streaming SQL Console.
    1. Navigate to Management Console > Environments, and select the environment where you have created your cluster.
    2. Select the Streaming Analytics cluster from the list of Data Hub clusters.
    3. Select Streaming SQL Console from the list of services.
    The Streaming SQL Console opens in a new window.
  2. Click Data Providers from the main menu.
  3. Click Register Kafka Provider.
    The Add Kafka Provider window appears.
  4. Add a Name to your Kafka provider.
  5. Add the broker hostname(s) to Brokers.
    You need to copy the Kafka broker hostname(s) from Cloudera Manager.
    1. Go to your cluster in Cloudera Manager.
    2. Click Kafka from the list of services.
    3. Click Instances.
    4. Copy the hostname of the Kafka broker(s) you want to use.
    5. Go back to the Add Kafka Provider page.
    6. Paste the broker hostname to the Brokers field.
    7. Append the Kafka port to the hostname(s). The default port is 9092 for plaintext and 9093 for TLS connections.
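    For example, with two brokers, the Brokers field would contain a comma-separated list of host:port pairs similar to the following sketch (the hostnames are hypothetical placeholders, not values from this procedure):

    ```
    kafka-broker1.example.com:9092,kafka-broker2.example.com:9092
    ```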
  6. Select the Connection Protocol.
    The connection protocol must match the protocol configured for the Kafka cluster in Cloudera Manager.

    You can choose from the following protocols:

    • Select Plaintext, and click Save Changes.
    • Select SSL, and click Save Changes.
    • Select SASL/SSL, and choose a SASL Mechanism:
      • Select Kerberos, and provide the Kafka Truststore location. Click Save Changes.
      • Select Plain, and provide the SASL username and password. Click Save Changes.
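    As a reference point, these protocol choices correspond to the standard Kafka client security settings. A minimal sketch of the equivalent client properties for the SASL/SSL with Kerberos case (the truststore path is a hypothetical example):

    ```
    security.protocol=SASL_SSL
    sasl.mechanism=GSSAPI
    ssl.truststore.location=/path/to/kafka-truststore.jks
    ```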
You have registered Kafka as a data provider, and you can now add Kafka as a table in your SQL queries. Existing Kafka topics can be selected when adding Kafka as a table.
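Once the provider is registered, a Kafka-backed table can be defined in SSB with Flink SQL. A minimal sketch, assuming a hypothetical topic, schema, and broker hostname (none of these names come from this procedure):

```sql
-- Hypothetical table over an existing Kafka topic; the column names,
-- topic, broker address, and format below are illustrative examples.
CREATE TABLE orders (
  order_id STRING,
  amount   DOUBLE,
  ts       TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'kafka-broker1.example.com:9092',
  'format' = 'json'
);
```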