Configuring Kafka tables
A user-defined Kafka table can be configured with a schema, event time attributes, input
transformations, and other Kafka-specific properties using either the Kafka wizard or
DDL.
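For reference, the following is a minimal DDL sketch of a Kafka table that declares a schema, an event time column with a watermark, and common Kafka connector properties, assuming the standard Flink Kafka connector options. The table, topic, column names, and broker address are placeholders to be replaced with values from your environment.

  CREATE TABLE orders (
    order_id   STRING,
    amount     DOUBLE,
    order_time TIMESTAMP(3),
    -- event time: declare a watermark on the timestamp column
    WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
  ) WITH (
    'connector' = 'kafka',
    'topic' = 'orders',                               -- placeholder topic
    'properties.bootstrap.servers' = 'broker1:9092',  -- placeholder broker
    'properties.group.id' = 'ssb-orders',
    'scan.startup.mode' = 'earliest-offset',
    'format' = 'json'
  );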
Schema tab: When using the Add Kafka table wizard on the Streaming SQL Console, you can configure the schema under the Schema tab.

Event Time tab: When using the Add Kafka table wizard on the Streaming SQL Console, you can configure the event time under the Event Time tab.

Data Transformations tab: When using the Add Kafka table wizard on the Streaming SQL Console, you can apply input transformations under the Transformations tab. Input transformations can be used to clean or arrange the incoming data from the source using JavaScript functions.

Properties tab: When using the Add Kafka table wizard on the Streaming SQL Console, you can configure the properties under the Properties tab.

Deserialization tab: When creating a Kafka table, you can configure how errors caused by schema mismatch are handled, using either DDL or the Kafka wizard (see the first sketch after this list).

Assigning Kafka keys in streaming queries: Based on the Sticky Partitioning strategy of Kafka, when null-keyed events are sent to a topic, they are randomly distributed in smaller batches within the partitions (see the second sketch after this list).

Performance & Scalability: The Kafka and SQL Stream Builder integration enables you to use Kafka-specific syntax to customize your SQL queries based on your deployment and use case.
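As an illustration of handling deserialization errors in DDL, the sketch below uses the standard Flink JSON format option json.ignore-parse-errors; the exact property names surfaced by the Kafka wizard's Deserialization tab may differ, and the table, topic, and broker address are placeholders.

  CREATE TABLE sensor_readings (
    sensor_id STRING,
    reading   DOUBLE
  ) WITH (
    'connector' = 'kafka',
    'topic' = 'sensors',                              -- placeholder topic
    'properties.bootstrap.servers' = 'broker1:9092',  -- placeholder broker
    'format' = 'json',
    -- skip records that fail to deserialize instead of failing the job
    'json.ignore-parse-errors' = 'true'
  );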
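To avoid the null-key behavior described above when writing results back to Kafka, a key can be declared on the sink table. The sketch below uses the Flink Kafka connector's key.format and key.fields options as an example; the table, topic, and column names are placeholders.

  CREATE TABLE keyed_output (
    user_id STRING,
    clicks  BIGINT
  ) WITH (
    'connector' = 'kafka',
    'topic' = 'clicks-by-user',                       -- placeholder topic
    'properties.bootstrap.servers' = 'broker1:9092',  -- placeholder broker
    -- write user_id into the Kafka record key so the partitioner hashes it
    -- instead of applying sticky partitioning to null-keyed records
    'key.format' = 'raw',
    'key.fields' = 'user_id',
    'value.format' = 'json'
  );

A streaming INSERT INTO keyed_output query then produces keyed records that are hashed to partitions by the default partitioner rather than batched by the sticky partitioner.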