Known issues
Review the list of known issues in Cloudera Flow Management (CFM).
Known issues in CFM 2.1.6
Review the list of known issues in Cloudera Flow Management (CFM) 2.1.6.
- Flow deletion fails for imported flows with funnels
- When attempting to delete a flow that was imported from open-source Apache NiFi into Cloudera Flow Management, you may encounter an error if the flow contains funnels. The issue is triggered during the two-stage commit process, resulting in a failure to complete the deletion.
The NiFi UI displays a generic error, and backend logs indicate a failure in the two-stage commit process during funnel deletion.
- NiFi UI error message:
Node <hostname>:8443 is unable to fulfill this request due to: An unexpected error has occurred.
- Log message in nifi-app.log:
Received a status of 500 for request DELETE /nifi-api/funnels/<UUID> when performing first stage of two-stage commit.
Background:
This is a flow design issue. NiFi supports flexible flow design, which allows you to create many different combinations of components and connections. In some cases, such as flows that include funnels with loops or complex connection paths, this flexibility can lead to problems during deletion. The platform does not block such designs, but some designs can cause issues with how components are deleted. When this happens, NiFi returns a generic error message that does not provide actionable feedback.
- LDAP authentication error caused by special characters in passwords
- After upgrading to CFM 2.1.7 SP1, 2.1.7, 2.1.6 SP1, or 2.1.6, you may encounter the following error if you use LDAP:
org.springframework.ldap.AuthenticationException: LDAP: error code 49
The issue is caused by the use of special characters in LDAP passwords, which are configured under the "Manager Password" property.
- Truststore changes with Ranger Plugin causing TLS handshake errors
- When using the Ranger plugin, the default truststore is changed from cacerts to the AutoTLS truststore (cm-auto-global_truststore.jks). This can lead to unintended issues such as TLS handshake errors with common CAs. Connections with common CAs may fail, causing service outages, because the AutoTLS truststore contains only internal CA certificates and not the public root certificates.
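To check whether a given truststore actually contains the public root certificates you rely on, you can list its entries. The following is a minimal, generic sketch using the standard Java KeyStore API; the truststore path and password passed as arguments are deployment-specific assumptions, not values from this documentation. The JDK keytool utility with the -list option provides the same information.
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.X509Certificate;
import java.util.Enumeration;

public class ListTruststoreEntries {
    public static void main(String[] args) throws Exception {
        // args[0]: path to the truststore (for example, cm-auto-global_truststore.jks)
        // args[1]: truststore password
        String path = args[0];
        char[] password = args[1].toCharArray();

        KeyStore trustStore = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(path)) {
            trustStore.load(in, password);
        }

        // Print every trusted certificate so you can see whether public
        // root CAs are present in addition to the internal CA.
        Enumeration<String> aliases = trustStore.aliases();
        while (aliases.hasMoreElements()) {
            String alias = aliases.nextElement();
            if (trustStore.isCertificateEntry(alias)) {
                X509Certificate cert = (X509Certificate) trustStore.getCertificate(alias);
                System.out.println(alias + " -> " + cert.getSubjectX500Principal());
            }
        }
    }
}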
- Snowflake - NoSuchMethodError
- Due to the issue documented in NIFI-11905, you may encounter the following error when using Snowflake components:
java.lang.NoSuchMethodError: 'net.snowflake.client.jdbc.telemetry.Telemetry net.snowflake.client.jdbc.telemetry.TelemetryClient.createSessionlessTelemetry(net.snowflake.client.jdbc.internal.apache.http.impl.client.CloseableHttpClient, java.lang.String)' at net.snowflake.ingest.connection.TelemetryService.<init>(TelemetryService.java:68)
- Per Process Group Logging
- The Per Process Group logging feature is currently not working even when you specify a log suffix in the configuration of a process group. As a result, you may not observe the expected logging behavior.
- Configuration of java.arg.7
- A property has been added for defining java.arg.7 to provide the ability to override the default location of the temporary directory used by the JDK. By default, this value is empty in Cloudera Manager. If you use this argument for another purpose, change it to a different, unused argument number (or use letters instead: java.arg.mycustomargument). Not changing the argument can impact functionality after upgrades and migrations.
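For example, the JVM temporary directory can be redirected with the standard -Djava.io.tmpdir system property; the path below is purely illustrative and should be adjusted to your environment:
java.arg.7=-Djava.io.tmpdir=/custom/nifi/tmp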
- JDK error
- JDK 8 version u252 is supported. Any lower version may result in an error when NiFi starts.
- JDK limitation
- JDK 8u271, JDK 8u281, and JDK 8u291 may cause socket leak issues in NiFi due to JDK-8245417 and JDK-8256818. Verify the build version of your JDK. Later builds are fixed as described in JDK-8256818.
- Kudu Client
- All the records are sent as a single Kafka message containing an array of records.
There is an issue in the Kudu client preventing the creation of new tables using the NiFi processors. The table needs to exist before NiFi tries to push data into it. You may see the following error when this issue arises:
Caused by: org.apache.kudu.client.NonRecoverableException: failed to wait for Hive Metastore notification log listener to catch up: failed to retrieve notification log events: failed to open Hive Metastore connection: SASL(-15): mechanism too weak for this user
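As an illustration of the workaround, the table can be created ahead of time outside of NiFi, for example with the Kudu Java client (or through Impala DDL). This is a minimal sketch; the master address, table name, and schema are hypothetical and must be adapted to your cluster and data.
import java.util.Arrays;
import java.util.List;
import org.apache.kudu.ColumnSchema;
import org.apache.kudu.Schema;
import org.apache.kudu.Type;
import org.apache.kudu.client.CreateTableOptions;
import org.apache.kudu.client.KuduClient;

public class CreateKuduTableFirst {
    public static void main(String[] args) throws Exception {
        // Placeholder Kudu master address; adjust to your cluster.
        KuduClient client = new KuduClient.KuduClientBuilder("kudu-master.example.com:7051").build();
        try {
            // Example schema with a single primary key column and one value column.
            List<ColumnSchema> columns = Arrays.asList(
                    new ColumnSchema.ColumnSchemaBuilder("id", Type.INT64).key(true).build(),
                    new ColumnSchema.ColumnSchemaBuilder("payload", Type.STRING).build());
            Schema schema = new Schema(columns);
            CreateTableOptions options = new CreateTableOptions()
                    .addHashPartitions(Arrays.asList("id"), 4)
                    .setNumReplicas(3);
            // Create the table up front so the NiFi Kudu processors only have to write into it.
            client.createTable("example_table", schema, options);
        } finally {
            client.close();
        }
    }
}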
- NiFi Node Connection test failures
- In CFM 2.1.3, Cloudera Manager includes a new health check feature. The health check alerts users if a NiFi instance is running but disconnected from the NiFi cluster. For this health check to be successful, you must update a Ranger policy. There is a known issue where the NiFi service is running but the NiFi Node(s) report Bad Health due to the NiFi Node Connection test.
- NiFi UI Performance considerations
- A known issue in Chrome 92.x causes significant slowness in the NiFi UI and may lead to high CPU consumption.
For more information, see the Chrome Known Issues documentation at 1235045.
- SSHJ version change and key negotiation issue with old SSH servers
- ListSFTP and PutSFTP processors fail when using the legacy ssh-rsa algorithm for authentication with the following error:
UserAuthException: Exhausted available authentication methods
- KeyStoreException: placeholder not found
- After an upgrade, NiFi may fail to start with the following error:
WARN org.apache.nifi.web.server.JettyServer: Failed to start web server... shutting down. java.security.KeyStoreException: placeholder not found
The error is caused by missing configuration for the type of the keystore and truststore files.
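For reference, the store types are the nifi.properties entries that accompany the store paths; the values below are only illustrative (JKS or PKCS12, depending on your files), and in CFM these settings are normally managed through Cloudera Manager rather than edited directly:
nifi.security.keystoreType=JKS
nifi.security.truststoreType=JKS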
- InferAvroSchema may fail when inferring schema for JSON data
- In Apache NiFi 1.17, the dependency on Apache Avro has been upgraded to 1.11.0. However, the InferAvroSchema processor depends on the hadoop-libraries NAR, from which its Avro version comes, causing a NoSuchMethodError exception. Having well-defined schemas ensures consistent behavior, allows for proper schema versioning, and prevents downstream systems from generating errors because of unexpected schema changes. In addition, schema inference may not always be accurate and can be an expensive operation in terms of performance.
Known issues in CFM 2.1.6 SP1
Review the list of known issues in Cloudera Flow Management (CFM) 2.1.6 SP1.
- LDAP authentication error caused by special characters in passwords
- After upgrading to CFM 2.1.7 SP1, 2.1.7, 2.1.6 SP1, or 2.1.6, you may encounter the following error if you use LDAP:
org.springframework.ldap.AuthenticationException: LDAP: error code 49
The issue is caused by the use of special characters in LDAP passwords, which are configured under the "Manager Password" property.
