Deserialization tab

When creating a Kafka table, you can configure how to handle errors due to schema mismatch using DDL or the Kafka wizard.

For every supported type of Kafka connector (local-kafka, kafka, or upsert), you can configure how a message that fails to deserialize is handled; such a failure can otherwise result in a job submission error. You can choose from the following configurations:

Fail
In this case an exception is thrown, and the job submission fails.
Ignore
In this case the error message is ignored without any logging, and the job submission is successful.
Ignore and Log
In this case the error message is ignored but logged, and the job submission is successful.
Save to DLQ
In this case the error message is ignored, but you can store it in a dead-letter queue (DLQ) Kafka topic.
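When the table is defined in DDL, these policies are set through the deserialization.failure.policy connector property. The following is a minimal sketch assuming a JSON-formatted topic; the table name, schema, topic, and broker address are placeholders:

```sql
-- Illustrative example: ignore messages that fail to deserialize, but log them.
CREATE TABLE orders (
  order_id BIGINT,
  amount DOUBLE
) WITH (
  'connector' = 'kafka',                          -- or the local-kafka / upsert variant
  'topic' = 'orders',                             -- placeholder topic name
  'properties.bootstrap.servers' = 'broker:9092', -- placeholder broker address
  'format' = 'json',
  'deserialization.failure.policy' = 'ignore_and_log'
);
```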

Using the Kafka wizard

When you create the Kafka table using the wizard on the Streaming SQL Console, you can configure the error handling with the following steps:
  1. Navigate to the Streaming SQL Console.
    1. Navigate to Management Console > Environments, and select the environment where you have created your cluster.
    2. Select the Streaming Analytics cluster from the list of Data Hub clusters.
    3. Select SQL Stream Builder from the list of services.
  2. Click the Compose tab.
  3. Click the Tables tab.
  4. Select Add tables > Apache Kafka.

    The Add Kafka table window appears.

  5. Select the Deserialization tab.
  6. Choose from the following policy options under Deserialization Policy:
    • Fail
    • Ignore
    • Ignore and Log
    • Save to DLQ

      If you choose the Save to DLQ option, you need to create a dedicated Kafka topic in which the error messages are stored. After selecting this option, you also need to select the created DLQ topic.

  7. Click Save Changes.

Using DDL

When you create the Kafka table using DDL on the Streaming SQL Console, you can configure the error handling through the deserialization.failure.policy property with the following steps:
  1. Navigate to the Streaming SQL Console.
    1. Navigate to Management Console > Environments, and select the environment where you have created your cluster.
    2. Select the Streaming Analytics cluster from the list of Data Hub clusters.
    3. Select SQL Stream Builder from the list of services.
  2. Click the Compose tab.
  3. Choose one of the Kafka template types from Templates.
  4. Select any type of data format.

    The predefined CREATE TABLE statement is imported to the SQL Window.

  5. Fill out the Kafka template based on your requirements.
  6. Search for the deserialization.failure.policy property.
  7. Provide the value for the error handling from the following options:
    1. 'error'
    2. 'ignore'
    3. 'ignore_and_log'
    4. 'dlq'

      If you choose the dlq option, you need to create a dedicated Kafka topic in which the error messages are stored. After selecting this option, you also need to provide the name of the created DLQ topic.

  8. Click Execute.
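As a sketch, a completed template using the dlq policy might look like the following. The table name, schema, topics, and broker address are placeholders, and the property key naming the DLQ topic (deserialization.failure.dlq.topic below) is an assumption for illustration only; use the key shown in your generated Kafka template:

```sql
-- Illustrative example: route messages that fail to deserialize to a DLQ topic.
CREATE TABLE orders (
  order_id BIGINT,
  amount DOUBLE
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',                             -- placeholder source topic
  'properties.bootstrap.servers' = 'broker:9092', -- placeholder broker address
  'format' = 'json',
  'deserialization.failure.policy' = 'dlq',
  -- Assumed property key for illustration; check your generated template.
  -- The DLQ topic must already exist.
  'deserialization.failure.dlq.topic' = 'orders_dlq'
);
```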