Creating Kafka tables using the wizard
After registering a Kafka data source, you can use the Kafka table wizard in Streaming SQL Console to create a Kafka table.
You can also create Kafka tables using one of the Kafka templates. For more information, see the Kafka connectors and Using connectors with templates sections.
- Make sure that you have registered Kafka as a Data Source.
- Make sure that you have created topics in Kafka.
- Make sure there is generated data in the Kafka topic.
- Make sure that you have the right permissions set in Ranger.
Navigate to the Streaming SQL Console.
- Navigate to , and select the environment where you have created your cluster.
- Select the Streaming Analytics cluster from the list of Data Hub clusters.
Select Streaming SQL Console from the list of services.
The Streaming SQL Console opens in a new window.
Open a project from the Projects page of Streaming SQL Console.
- Select an already existing project from the list by clicking the Open button or Switch button.
- Create a new project by clicking the New Project button.
- Import a project by clicking the Import button.
After you open, create, or import a project, you are redirected to the Explorer view of the project.
- Click next to Virtual Tables.
The Kafka Table window appears.
Provide a Table Name.
- Select a registered Kafka provider as Kafka cluster.
Select the Data format.
- You can select JSON as the data format.
- You can select AVRO as the data format.
Select a Kafka topic from the list.
Add a customized schema to Schema Definition, or click Detect Schema to read a sample of the JSON messages and automatically infer the schema.
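Detect Schema works by sampling messages from the topic and mapping the JSON value types to SQL column types. As a rough illustration of the idea only (this is not the console's actual implementation, and the type mapping shown is an assumption), a minimal sketch:

```python
import json

def infer_type(value):
    """Map one sampled JSON value to a SQL column type (simplified)."""
    if isinstance(value, bool):   # check bool before int: bool is a subclass of int
        return "BOOLEAN"
    if isinstance(value, int):
        return "BIGINT"
    if isinstance(value, float):
        return "DOUBLE"
    return "STRING"

def infer_schema(sample_message: str) -> dict:
    """Return a column-name -> SQL-type mapping from one sampled JSON message."""
    record = json.loads(sample_message)
    return {name: infer_type(value) for name, value in record.items()}

# Hypothetical sample message from the topic:
sample = '{"order_id": 42, "amount": 19.99, "paid": true, "customer": "alice"}'
print(infer_schema(sample))
# → {'order_id': 'BIGINT', 'amount': 'DOUBLE', 'paid': 'BOOLEAN', 'customer': 'STRING'}
```

Because inference is based on a sample, review the detected schema and correct any column whose sampled value was not representative (for example, a nullable field that happened to be present).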
Customize your Kafka Table with the following options:
For more information about how to configure the Kafka table, see the Configuring Kafka tables section.
- Configure the Event Time if you do not want to use the default Kafka Timestamps.
- Configure an Input Transform on the Data tab.
- Configure any Kafka properties required on the Properties tab.
- Select a policy for deserialization errors on the Deserialization tab.
- Click Create and Review.
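The wizard ultimately generates a Flink SQL CREATE TABLE statement from your selections. A hedged sketch of what such a statement can look like, where the table name, columns, topic, and broker address are placeholders and the exact connector options depend on your cluster and chosen data format:

```sql
-- Illustrative only: names, topic, and broker address are placeholders.
CREATE TABLE orders (
  order_id   BIGINT,
  amount     DOUBLE,
  customer   STRING,
  -- Event Time step: use the Kafka record timestamp and define a watermark
  event_time TIMESTAMP(3) METADATA FROM 'timestamp',
  WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = '<broker-host>:9092',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json',
  -- Deserialization step: one possible policy is to skip malformed messages
  'json.ignore-parse-errors' = 'true'
)
```

Clicking Create and Review lets you inspect the generated statement before the table is created, so any of these options can still be adjusted by hand.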