Creating Kafka tables using Templates

The built-in templates allow you to create tables simply and easily: importing a template fills the SQL window with a CREATE TABLE statement that includes detailed descriptions of its properties, so you only need to fill in the values.

You can create tables directly from the SQL window on the Console page by using the pre-defined connector templates.

When using the predefined templates, you have the following options for the Kafka table:
CDP Kafka
Uses the Kafka service that is registered automatically in Data Providers and runs on the same cluster as the SQL Stream Builder service. You can choose between JSON, Avro, and CSV data formats.
Kafka
Connects to a Kafka service that is not hosted on your cluster. You can choose between JSON, Avro, CSV, and raw data formats.
Upsert Kafka
Connects to a Kafka service in upsert mode. When the table is used as a source, the connector produces a changelog stream, where each data record represents an update or delete event: the value in a data record is interpreted as an update of the last value for the same key. When the table is used as a sink, the connector consumes a changelog stream and writes insert and update_after data as normal Kafka message values; a null value is interpreted as a delete event for the corresponding key.
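To illustrate upsert mode, the resulting statement follows the shape of a Flink SQL CREATE TABLE with the upsert-kafka connector, roughly as sketched below. The table name, columns, topic, and broker address are placeholder assumptions for the example, not values produced by the template:

```sql
-- Sketch of an upsert-mode Kafka table; all names and addresses are placeholders.
CREATE TABLE orders_by_id (
  order_id STRING,
  amount   DOUBLE,
  -- Upsert mode requires a primary key; it determines the Kafka message key.
  PRIMARY KEY (order_id) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'orders',                                -- placeholder topic
  'properties.bootstrap.servers' = 'broker-1:9092',  -- placeholder broker
  'key.format' = 'json',
  'value.format' = 'json'
);
```

Records written to the `orders` topic with the same `order_id` key overwrite each other, and a message with a null value deletes that key.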
You can access and import the Templates from Streaming SQL Console:
  1. Navigate to the Streaming SQL Console.
    1. Go to your cluster in Cloudera Manager.
    2. Click on SQL Stream Builder from the list of Services.
    3. Click on the SQLStreamBuilder Console.
    The Streaming SQL Console opens in a new window.
  2. Click Create Job or select a previous job on the Getting Started page.

    You are redirected to the Console page.

  3. Click Templates in the SQL Editor.
  4. Select the template you want to use.

    The template is imported to the SQL window.

  5. Customize the fields of the template.
  6. Click Execute.

    The table is created based on the selected template and appears next to the SQL Editor.
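Customizing the template in step 5 typically amounts to replacing the placeholder property values in the imported CREATE TABLE statement. A filled-out regular Kafka template might end up looking roughly like this sketch, where the table name, columns, topic, broker, and consumer group are illustrative assumptions:

```sql
-- Sketch of a customized Kafka table template; all names are placeholders.
CREATE TABLE user_events (
  user_id    STRING,
  event_type STRING,
  event_time TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'user-events',                           -- placeholder topic
  'properties.bootstrap.servers' = 'broker-1:9092',  -- placeholder broker
  'properties.group.id' = 'ssb-example-group',       -- placeholder consumer group
  'scan.startup.mode' = 'earliest-offset',           -- read the topic from the beginning
  'format' = 'json'
);
```

After clicking Execute, the table becomes available for SELECT and INSERT statements in subsequent jobs.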