Adjusting logging configuration in Advanced Settings

You can customize the logging configuration for SQL Stream Builder (SSB) jobs in the Streaming SQL Console, either globally or at job level. Adjusting the log configuration lets you control the log levels of all the underlying libraries (Flink, Hadoop, Kafka, Zookeeper, other common libraries, and connectors) so that you get more or less detailed information in your job’s log.

The log configuration behaves differently depending on whether it is applied globally or at job level, and on the deployment mode of the job.

Global logging configuration

To set the global logging configuration:

  1. Navigate to the Streaming SQL Console.
    1. Go to your cluster in Cloudera Manager.
    2. Select SQL Stream Builder from the list of services.
    3. Click SQLStreamBuilder Console.
    The Streaming SQL Console opens in a new window.
  2. In the left-hand navigation, click Configuration.
  3. If no global configuration is present, acknowledge the system notice by clicking Set global configuration.
  4. In the editor, set the configuration (see the example after these steps).
  5. Click Save.
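As an illustration, here is a minimal sketch of what you might set in the editor, assuming the Log4j properties format that Flink uses by default. The exact template shipped with your SSB version may differ, and the logger names and levels below are only example values; raising or lowering the level of an individual library controls how much of its output ends up in your job's log.

# Example only: overall level plus per-library levels
# (appender definitions from the default template are kept as-is and omitted here)
rootLogger.level = INFO

logger.flink.name = org.apache.flink
logger.flink.level = INFO
logger.kafka.name = org.apache.kafka
logger.kafka.level = WARN
logger.hadoop.name = org.apache.hadoop
logger.hadoop.level = WARN
logger.zookeeper.name = org.apache.zookeeper
logger.zookeeper.level = WARN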
Session mode
The execution.target is set to yarn-session mode; this is the default execution mode.
The log configuration is set at the start time of the Flink YARN session and is applied to every job execution in that session. This means that an updated log configuration only takes effect after the Flink YARN session is reset on the Session tab of the Compose page.
Per-job mode
The execution.target is set to yarn-per-job mode.
When you change the default execution mode to per-job, the log configuration that is currently applied is used for the job. To configure the execution mode, start the SQL query with the following line:
SET 'execution.target'='yarn-per-job';
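For example, a job executed in per-job mode could look like the following sketch, with the SET statement placed before the query itself (orders is a placeholder table name):

SET 'execution.target'='yarn-per-job';
SELECT * FROM orders;

With the execution mode set to per-job, you can set the log configuration for the job and check the result as follows: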
  1. Click New Job or select a previous job on the Jobs page.
    You are redirected to the SQL Editor of the job.
  2. Click Job Settings.
  3. Click Logging.
  4. Click Set Log Configuration.
  5. Modify the settings based on your requirements (see the example after these steps).
  6. Click Save. (You can restore the default logging configuration by clicking Delete and then Set Log Configuration again.)
  7. Close the Job Settings window.
  8. Add and execute a SQL statement.
  9. Click SQL Jobs.
  10. Search for the job you have executed previously.
  11. Click Flink Dashboard.
    The Flink Dashboard opens in a new window.
  12. Click Task Managers > Logs.
    The log information appears in the log window based on your custom configurations.
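As an example of a job-level adjustment (referenced in step 5 above), you could raise only the Kafka client logging to DEBUG for a single job and leave everything else unchanged, assuming the same Log4j properties format as in the global configuration; the logger name and level are only illustrative:

# Example only: more detailed Kafka client output for this job
logger.kafka.name = org.apache.kafka
logger.kafka.level = DEBUG

The additional DEBUG entries should then show up in the Task Manager logs that you open in step 12.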