Creating Input Transforms
Input Transforms are useful in the following situations:
- The source is not under your control, for example, a data feed from a third-party provider
- The format is hard to change, for example, a legacy feed, or feeds owned by other teams within your organization
- The messages are inconsistent
- The data from the sources has no uniform keys, or no keys at all (for example, nested arrays), but is still valid JSON
- The schema you want does not match the incoming topic
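As a hedged illustration of the "no uniform keys" case, the following sketch (the payloads and field names are hypothetical) shows the same logical event arriving in two shapes, and a small function normalizing both to one schema:

```javascript
// Hypothetical payloads: one message carries a flat "temp" key,
// the other nests it inside a "readings" array. Both are valid JSON.
var msgA = '{"sensor": "s1", "temp": 21.5}';
var msgB = '{"sensor": "s1", "readings": [{"temp": 21.5}]}';

// Normalize either shape to a single, uniform schema.
function normalize(record) {
  var obj = JSON.parse(record);
  var temp = obj.temp !== undefined ? obj.temp : obj.readings[0].temp;
  return JSON.stringify({ sensor: obj.sensor, temp: temp });
}
```

This is the kind of reconciliation an Input Transform lets you perform before the data reaches the table schema.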
An Input Transform has the following characteristics:
- Allows one transformation per source.
- Takes record as a JSON-formatted string input variable. The input is always named record.
- Emits the output of the last line to the calling JVM. The output can be held in any variable name; in the following example, the variable out is emitted as a JSON-formatted string.
var out = JSON.parse(record); // record is the input; parse the JSON-formatted string into an object
// add more transformations if needed
JSON.stringify(out); // emit the JSON-formatted string of the object
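As an expanded sketch of this template (the sample record and the field names amount and amountCents are hypothetical; in SQL Stream Builder the record variable is supplied for you), a transform that parses the input, derives a field, and emits the result on its last line could look like:

```javascript
// Stand-in for the record variable SSB supplies to every transform.
var record = '{"amount": 12.34}';

var out = JSON.parse(record);                    // parse the incoming message into an object
out.amountCents = Math.round(out.amount * 100);  // add a derived field
out.processedAt = new Date().toISOString();      // annotate with processing time
JSON.stringify(out);                             // the last line's value is what gets emitted
```

Note that only the value of the last line is emitted; everything before it is ordinary JavaScript.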
Navigate to the Streaming SQL Console.
- Navigate to , and select the environment where you have created your cluster.
- Select the Streaming Analytics cluster from the list of Data Hub clusters.
- Select Streaming SQL Console from the list of services.
- Select Console from the left-hand menu.
The Streaming SQL Console opens in a new window.
You can add the Input Transform to the Kafka table when you create the Kafka table:
- Choose Apache Kafka from the Add table drop-down.
The Kafka table wizard appears.
You can add the Input Transform to an already existing Kafka table:
- Select the edit button for the Kafka table to which you want to add a transformation.
You have the following options to insert your Input Transform:
Make sure the output of your transform matches the Schema definition detected or defined for the Kafka table.
Click Install default template and schema.
The Install default template and schema option fills in the Data Transformation box with a template that you can use to create the Input Transform, and matches the schema to the format.
- Click Save changes.
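To illustrate keeping the transform output aligned with the table's schema definition, here is a hedged sketch (the schema fields id and value, and the sample record, are hypothetical): the transform emits exactly the keys the schema defines, with matching types, and drops everything else.

```javascript
// Stand-in for the record variable SSB supplies to the transform.
var record = '{"ID": 7, "val": "3.14", "extra": true}';

var parsed = JSON.parse(record);
// Emit only the fields the Kafka table schema defines, coerced to the expected types.
var out = {
  id: String(parsed.ID),     // hypothetical schema expects a string
  value: Number(parsed.val)  // hypothetical schema expects a double
};
JSON.stringify(out);
```

If the emitted keys or types drift from the schema, the rows will fail to deserialize downstream, so it is worth validating the transform output against the schema shown in the table wizard.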