Creating Kafka tables using Templates
You can create tables directly from the SQL window on the Console page by using the pre-defined connector templates. The built-in templates make table creation simple: the imported CREATE TABLE statement appears in the SQL window with a detailed description of each property that you need to fill out.
- CDP Kafka
- Automatically uses the Kafka service that is registered in the Data Providers and runs on the same cluster as the SQL Stream Builder service. You can choose between JSON, Avro, and CSV data types.
- Kafka
- Use this template when connecting to a Kafka service that is not hosted in your cluster. You can choose between JSON, Avro, CSV, and raw data types.
- Upsert Kafka
- Connects to a Kafka service in upsert mode. When the table is used as a source, the connector produces a changelog stream in which each data record represents an update or delete event: the value in a data record is interpreted as an update of the last value for the same key. When the table is used as a sink, the connector can consume a changelog stream, writing insert and update_after data as normal Kafka message values, and representing delete data as Kafka messages with null values (tombstones).
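As an illustration of the upsert semantics described above, an Upsert Kafka template typically expands to a CREATE TABLE statement similar to the following sketch. The table name, columns, topic, and broker address are hypothetical placeholders, not values taken from this document:

```sql
-- Hedged sketch of an Upsert Kafka table; all names and
-- connector option values are illustrative placeholders.
CREATE TABLE `user_scores` (
  `user_id` STRING,
  `score`   BIGINT,
  -- the primary key is the upsert key: records with the same
  -- key are interpreted as updates of the last value
  PRIMARY KEY (`user_id`) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'scores',
  'properties.bootstrap.servers' = 'broker:9092',
  'key.format' = 'json',
  'value.format' = 'json'
);
```

With this definition, writing a row with an existing `user_id` updates that key's value in the topic, and a delete is emitted as a message with a null value.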
- Navigate to the Streaming SQL Console.
- Select the environment where you have created your cluster.
- Select the Streaming Analytics cluster from the list of Data Hub clusters.
- Select Streaming SQL Console from the list of services.
- Select Console from the main menu.
- Click Templates under the SQL window.
- Select the template you want to use.
The template is imported to the SQL window.
- Customize the fields of the template.
- Click Execute.
The table is created based on the selected template. You can review the created table on the Tables tab.
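After the steps above, the customized template produces a statement similar to the following sketch, assuming a JSON-formatted topic read with the Kafka template. The table name, columns, topic, and broker address are hypothetical placeholders:

```sql
-- Hedged sketch of a customized Kafka template; names and
-- addresses are illustrative placeholders.
CREATE TABLE `orders` (
  `order_id` STRING,
  `amount`   DOUBLE,
  `ts`       TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'broker:9092',
  'properties.group.id' = 'ssb-orders',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json'
);
```

The fields in the WITH clause correspond to the properties described in the template; customizing them and clicking Execute creates the table.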