Known issues and limitations
Learn about the known issues in Flink and SQL Stream Builder, their impact on functionality, and the available workarounds in Cloudera Streaming Analytics 1.9.0.
SQL Stream Builder
- FLINK-18027: ROW value constructor cannot deal with complex expressions
- When querying data from a table or a view with the `ROW()` function, an exception is thrown due to a Calcite parsing issue. For example, the following query will return an error:

  ```sql
  CREATE VIEW example AS SELECT col1, ROW(col2) FROM table;
  SELECT * FROM example;
  ```
- Auto discovery is not supported for Apache Knox
- You need to manually configure Knox with SQL Stream Builder to enable Knox authentication.
- Streaming SQL Console cannot be accessed through Knox when High Availability is enabled
- When SQL Stream Builder (SSB) is deployed in High Availability mode with a Load Balancer, the Streaming SQL Console cannot be accessed directly using Apache Knox.
- SSB service fails when using Active Directory (AD) Kerberos authentication
- If you use AD Kerberos for authentication and no Load Balancer URL is provided, the SQL Stream Builder (SSB) service can fail. The issue is caused by keytab generation: when Cloudera Manager generates the keytab, it requires the AD principals for the Load Balancer host, and with no host specified for the Load Balancer, Cloudera Manager cannot start the SSB service. This issue also occurs when the Load Balancer role is not deployed or used with SSB.
- CSA-4650: Inconsistent sidebar collapse behavior
- The sidebar is collapsed inconsistently on the homepage of Streaming SQL Console when opening a project.
- CSA-4643: flink-yarn-session is ignoring command line parameters
- When adding parameters to the Flink session using `flink-yarn-session -d` on the command line, the parameters are not applied to the session.
- CSA-4548: Files cannot be uploaded through Swagger
- REST API endpoints that take multipart requests, such as the artifact upload endpoint, result in an error.
- CSA-4427: State of Execute and Stop options in Job context menu does not correspond to the Job state
- When a job is opened in Tab view, the Execute and Stop actions are disabled when managing the job from the Explorer view.
- CSA-4426: Kafka Data Source name accepts spaces
- A Kafka Data Source can be validated and created with spaces in the Data Source name, but this results in errors, as spaces are not valid characters according to the naming convention.
- CSA-4425: Password in Kafka Data Source can be revealed after save
- After saving the password for the authentication method when creating a Kafka Data Source, the show password icon can still be used to reveal the saved password.
- CSA-4412: Cannot delete Materialized View endpoint when using dynamic parameters
- Materialized View endpoints cannot be deleted if dynamic parameters were set for them.
- CSA-4400: Cannot delete invalid catalog
- A catalog can be created without a catalog service, but deleting such an invalid catalog fails, as a catalog without a service is not registered in Flink.
- CSA-4370: Virtual tables imported from a schema in a Schema Catalog fail to describe correctly
- Describing a Virtual Table imported from a schema in a Schema Registry Catalog fails when viewing the table DDL.
- CSA-4333: Use Kafka Timestamps switch reflects invalid value
- After creating a Kafka Virtual Table and disabling the Use Kafka Timestamps configuration, the table is created successfully according to the setting, but when viewing the DDL of the table, it shows the configuration as enabled.
- CSA-4030: Webhook sending fails when webhook template is empty string
- When creating a webhook table with a custom template, the webhook template will be saved as an empty string, which results in webhook sending failure.
- CSA-3754: The display name of the loadbalancer.url property should be "Load Balancer Host"
- The `loadbalancer.url` property is duplicated in Cloudera Manager on the SQL Stream Builder configuration page.
Flink
- FLINK-18027: ROW value constructor cannot deal with complex expressions
- When querying data from a table or a view with the `ROW()` function, an exception is thrown due to a Calcite parsing issue. For example, the following query will return an error:

  ```sql
  CREATE VIEW example AS SELECT col1, ROW(col2) FROM table;
  SELECT * FROM example;
  ```
In Cloudera Streaming Analytics, the following SQL API features are in preview:
- Match recognize
- Top-N
- Stream-Table join (without rowtime input)
- DataStream conversion limitations
- Converting between Tables and POJO DataStreams is currently not supported in CSA.
- Object arrays are not supported for Tuple conversion.
- The `java.time` class conversions for Tuple DataStreams are only supported by using explicit `TypeInformation`: `LegacyInstantTypeInfo`, `LocalTimeTypeInfo.getInfoFor(LocalDate/LocalDateTime/LocalTime.class)`. See the sketch after this list.
- Only `java.sql.Timestamp` is supported for rowtime conversion; `java.time.LocalDateTime` is not supported.
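
To illustrate the explicit `TypeInformation` point above, here is a minimal Java sketch; the `source_table` name, its column layout, and the environment setup are illustrative assumptions rather than part of this documentation.

```java
import java.time.LocalDateTime;

import org.apache.flink.api.common.typeinfo.LocalTimeTypeInfo;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.api.java.typeutils.TupleTypeInfo;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class TupleConversionSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Assumption: a table named source_table with columns
        // (id INT, ts TIMESTAMP(3)) has already been registered.
        Table t = tableEnv.sqlQuery("SELECT id, ts FROM source_table");

        // Spell out the TypeInformation explicitly: for Tuple DataStreams,
        // java.time.LocalDateTime is only supported through LocalTimeTypeInfo.
        TupleTypeInfo<Tuple2<Integer, LocalDateTime>> info = new TupleTypeInfo<>(
                Types.INT,
                LocalTimeTypeInfo.getInfoFor(LocalDateTime.class));

        DataStream<Tuple2<Integer, LocalDateTime>> stream =
                tableEnv.toAppendStream(t, info);

        stream.print();
        env.execute("tuple-conversion-sketch");
    }
}
```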
- Kudu catalog limitations
- `CREATE TABLE`
  - Primary keys can only be set by the `kudu.primary-key-columns` property. Using the `PRIMARY KEY` constraint is not yet possible; see the sketch after this list.
  - Range partitioning is not supported.
- When getting a table through the catalog, `NOT NULL` and `PRIMARY KEY` constraints are ignored. All columns are described as being nullable and not being primary keys.
- Kudu tables cannot be altered through the catalog other than simply renaming them.
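
As a hedged sketch of the property-based approach to primary keys (the catalog setup, table name, and columns below are illustrative assumptions):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class KuduDdlSketch {
    public static void main(String[] args) {
        TableEnvironment tableEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Assumption: a Kudu catalog has already been registered and selected;
        // the table and column names are made up.
        tableEnv.executeSql(
                "CREATE TABLE users (" +
                "  id INT," +
                "  name STRING" +
                ") WITH (" +
                // The primary key is declared through the connector property,
                // since the PRIMARY KEY constraint is not yet supported here.
                "  'kudu.primary-key-columns' = 'id'" +
                ")");
    }
}
```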
- Schema Registry catalog limitations
- Currently, the Schema Registry catalog / format only supports reading messages with the latest enabled schema for any given Kafka topic at the time when the SQL query was compiled.
- No time-column and watermark support for Registry tables.
- No `CREATE TABLE` support. Schemas have to be registered directly in the `SchemaRegistry` to be accessible through the catalog.
- The catalog is read-only. It does not support table deletions or modifications.
- By default, it is assumed that Kafka message values contain the schema id as a prefix, because this is the default behaviour for the `SchemaRegistry` Kafka producer format. To consume messages with the schema id written in the header, the following property must be set for the Registry client: `store.schema.version.id.in.header: true`. A minimal sketch follows below.
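
A minimal sketch of that last point, assuming the Registry client is configured through a plain key-value map (the map-based setup and the URL are illustrative assumptions; only the `store.schema.version.id.in.header` property is taken from this page):

```java
import java.util.HashMap;
import java.util.Map;

public class RegistryClientConfigSketch {
    public static void main(String[] args) {
        // Assumption: the Schema Registry client accepts its settings as a
        // plain configuration map; the URL below is a placeholder.
        Map<String, Object> clientConfig = new HashMap<>();
        clientConfig.put("schema.registry.url", "http://registry-host:7788/api/v1");
        // Read the schema version id from the Kafka message header instead of
        // expecting it as a prefix of the message value.
        clientConfig.put("store.schema.version.id.in.header", true);

        System.out.println(clientConfig);
    }
}
```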