Creating Input Transforms

Input Transforms are a powerful way to clean, modify, and arrange data that is poorly organized, has a changing format, contains data that is not needed, or is otherwise hard to use. With the Input Transform feature of SQL Stream Builder, you can create a JavaScript function to transform the data after it has been consumed from a Kafka topic, and before you run SQL queries on the data.

You can use Input Transforms in the following situations:
  • The source is not in your control, for example, data feed from a third-party provider
  • The format is hard to change, for example, a legacy feed or feeds from other teams within your organization
  • The messages are inconsistent
  • The data from the sources does not have uniform keys, or has no keys at all (such as nested arrays), but is still in a valid JSON format
  • The schema you want does not match the incoming topic
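For instance, a transform can normalize messages with inconsistent keys before they reach SQL. The sketch below is a hypothetical example: the field names (`sensor_id`, `temp`, `temperature`) are invented for illustration, and because SQL Stream Builder injects the `record` variable at runtime, a sample record is defined here only to make the snippet runnable on its own.

```javascript
// Sample input standing in for the runtime-provided `record` variable.
// Some producers send "temp", others "temperature"; normalize to one key.
var record = { value: JSON.stringify({ sensor_id: 17, temp: 21.5 }) };

var out = JSON.parse(record.value);
if (out.temp !== undefined && out.temperature === undefined) {
  out.temperature = out.temp;   // rename the inconsistent key
  delete out.temp;
}
JSON.stringify(out);            // the last expression is emitted to the caller
```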
An Input Transform on a Kafka table has the following characteristics:
  • Only one transformation is allowed per source.
  • The transform takes record, a JSON-formatted string, as its input variable. The input is always named record.
  • The transform emits the output of its last line to the calling JVM. Any variable name can be used; in the following example, out is used to emit a JSON-formatted string.
A basic input transformation looks like this:
var out = JSON.parse(record.value);   // record is the input; parse the JSON-formatted string into an object
                                      // add more transformations if needed
JSON.stringify(out);                  // emit a JSON-formatted string of the object
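Extra transformations can drop unneeded fields or flatten nested structure so the output matches the table schema. The following sketch uses hypothetical field names (`payload`, `meta`, `debug`), and again defines a sample record in place of the runtime-provided `record` variable so the snippet runs on its own.

```javascript
// Sample input standing in for the runtime-provided `record` variable.
// The field names here are hypothetical.
var record = {
  value: JSON.stringify({
    payload: { id: 42, amount: 9.99 },
    meta: { source: "legacy-feed" },
    debug: "verbose trace data not needed downstream"
  })
};

var parsed = JSON.parse(record.value);

// Flatten the nested payload into top-level fields and keep only
// what the table schema expects; the debug field is dropped.
var out = {
  id: parsed.payload.id,
  amount: parsed.payload.amount,
  source: parsed.meta.source
};

JSON.stringify(out);   // emit the cleaned, flattened record
```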
  1. Navigate to the Streaming SQL Console.
    1. Navigate to Management Console > Environments, and select the environment where you have created your cluster.
    2. Select the Streaming Analytics cluster from the list of Data Hub clusters.
    3. Select Streaming SQL Console from the list of services.
    The Streaming SQL Console opens in a new window.
  2. Select Console from the left-hand menu.
  3. Select Tables.
    You can add the Input Transform to the Kafka table when you create the Kafka table:
    1. Choose Apache Kafka from the Add table drop-down.
    You can add the Input Transform to an already existing Kafka table:
    1. Select the edit button for the Kafka table you want to add a transformation to.
    The Kafka table wizard appears.
  4. Click Transformations.
    You have the following options to insert your Input Transform:
    1. Add your JavaScript transformation code to the Data Transformation box.
      Make sure the output of your transform matches the Schema definition detected or defined for the Kafka table.
    2. Click Install default template and schema.
      The Install default template and schema option fills the Data Transformation box with a template that you can use to create the Input Transform, and matches the schema with the format.
  5. Click Save changes.