Known issues and limitations in Cloudera Data Warehouse Private Cloud
This section lists known issues and limitations that you might run into while using the Cloudera Data Warehouse (CDW) service on CDP Private Cloud.
CDW Private Cloud general known issues and limitations
- DWX-5941: Custom service accounts are not supported in CDW Private Cloud 1.0
- Problem: CDP Private Cloud uses a set of default service account names, such as spark. You can override these default service account names during the Cloudera Manager installation of CDP Private Cloud 1.0. However, the Cloudera Data Warehouse (CDW) experience in CDP Private Cloud 1.0 does not support custom service account names. If custom service account names are used, activation of environments for CDW fails, creation of the default Database Catalog fails, and no Virtual Warehouses start.
- Workaround: Accept the default service account names when you install CDP Private Cloud 1.0 with Cloudera Manager.
- DWX-5091: SSL-enabled PostgreSQL database required for Hive Metastore on the CDP base cluster (Cloudera Manager side)
The database that is used for the Hive Metastore on the base CDP cluster (Cloudera Manager side) must be PostgreSQL and must meet the following requirement:
- Uses the same keystore containing an embedded certificate that Ranger and Atlas use.
- If you are using Auto-TLS:
On the Management Console Administration page, add the certificate name (for example, /path/to/postgres.pem) in the Trusted CA Certificate option.
- If you are not using Auto-TLS:
Ensure that the public certificate of the certificate authority (CA) that signed the Hive metastore database's certificate is present in Cloudera Manager's JKS truststore. If the certificate is self-signed, import it into Cloudera Manager's JKS truststore: on the Management Console Administration page, find the path to Cloudera Manager's JKS truststore, and then import the CA's certificate into that JKS file.
To add the certificate to an existing or a new JKS file, use the following keytool command, which uses the same example certificate name:
keytool -import -alias postgres -file /path/to/postgres.pem -storetype JKS -keystore /path/to/cm.jks
/path/to/cm.jks is the JKS file that is configured by Cloudera Manager.
This ensures that the file specified for Cloudera Manager TLS/SSL Client Trust Store File is passed to Management Console and workloads.
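To confirm that the import succeeded, you can list the truststore and look for the alias. A minimal sketch, assuming the same example alias and truststore path as the command above (keytool prompts for the site-specific truststore password):

```shell
# List the truststore entries and check for the imported alias.
keytool -list -keystore /path/to/cm.jks -storetype JKS | grep -i postgres
```

If the alias is missing, re-run the import command before restarting any dependent services.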
- DWX-5129: Management Console and CDW service workloads must reside in the same OpenShift cluster
- Problem: If Management Console and the CDW service workload reside on two different OpenShift clusters, the workload will fail.
- Workaround: When you register an environment in Management Console, ensure that you point to the same OpenShift cluster that you used during installation.
Environments on OpenShift Clusters for CDW Private Cloud
- DWX-4723: HTTP 504 Gateway timeout error returned when you delete an environment
- Problem: When a CDW Private Cloud environment is deleted in the CDW Private Cloud UI, an HTTP 504 Gateway timeout error can be returned.
- Workaround: Using the OpenShift CLI, increase the route timeout to 60s:
oc annotate route <route-name> --overwrite haproxy.router.openshift.io/timeout=60s
For more information about setting this timeout, see the OpenShift documentation.
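To verify that the annotation was applied, you can read it back with oc. This is a sketch, with <route-name> standing in for the actual route name as above; note that the dots in the annotation key must be escaped inside the jsonpath expression:

```shell
# Print the route's current timeout annotation (expected: 60s after the workaround).
oc get route <route-name> \
  -o jsonpath='{.metadata.annotations.haproxy\.router\.openshift\.io/timeout}'
```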
Database Catalogs on CDW Private Cloud
- DWX-4534: Default Database Catalog fails to start when Hive metastore on base cluster does not use PostgreSQL database
- In CDW Private Cloud version 1.0, you must use a PostgreSQL database for your Hive metastore on the base cluster. For more information, see Base cluster database requirements for CDW Private Cloud.
- DWX-5247: Database Catalog managed tables location points to incorrect directory
- Problem: Default Database Catalogs that are created with the CDW Private Cloud UI point to an incorrect directory location for managed tables.
- Workaround: To correct this, perform the following steps:
- Using Data Analytics Studio (DAS), run a DESCRIBE statement to check the default database's LOCATION and MANAGEDLOCATION settings.
Note the HDFS URL because you will need it to correct the setting in Step 3.
DESCRIBE DATABASE default;
Result (abbreviated):
DB_NAME: default
COMMENT: Default Hive database
LOCATION: hdfs://ns1/warehouse/dwx- more...
MANAGEDLOCATION: hdfs://ns1/user/hive/warehouse
- Still using DAS, run the following query to get the correct warehouse path for managed tables.
Note that this path does not include the HDFS URL that you obtained in Step 1.
SET hive.metastore.warehouse.dir;
Result: hive.metastore.warehouse.dir=/warehouse/dwx-impala7b096/warehouse-1597085504-cv4b/warehouse/tablespace/managed/hive
- A user with ALTER privileges on the default database sets the database's MANAGEDLOCATION by combining the following elements from the previous steps:
- The HDFS URL from Step 1 (hdfs://ns1)
- The path returned by the SET statement in Step 2
ALTER DATABASE default SET MANAGEDLOCATION 'hdfs://ns1/warehouse/dwx-impala7b096/warehouse-1597085504-cv4b/warehouse/tablespace/managed/hive';
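To confirm the fix, you can re-run the DESCRIBE statement from Step 1; MANAGEDLOCATION should now show the full path built in Step 3 (the HDFS URL followed by the hive.metastore.warehouse.dir value):

```sql
-- Verify that MANAGEDLOCATION now points at the corrected managed-tables path.
DESCRIBE DATABASE default;
```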
Hive Virtual Warehouses on CDW Private Cloud
- Hive ACID is not supported in Hive Virtual Warehouses that use Cloudera Runtime 7.1.3
- Problem: Hive ACID is not supported for Hive Virtual Warehouses running on Cloudera Runtime 7.1.3 in CDP Private Cloud Base.
- Workaround: If you must use Hive ACID for your Hive Virtual Warehouse in CDW Private Cloud, make sure you are using Cloudera Runtime 7.1.4.
- DWX-4703: Using UDFs and custom JARs is not supported
- Problem: In the GA (general availability) release of Cloudera Data Warehouse Private Cloud, using UDFs and custom JARs is not supported on Hive Virtual Warehouses.
- Workaround: None.
- DWX-4842: Entities are not being created in Atlas
- Problem: Base clusters that use Java 11 might use truststores in PKCS12 format. Currently, Hive Virtual Warehouses on CDW Private Cloud only support truststores in JKS format, which prevents entities from being created in Atlas.
- Workaround: Using keytool, convert the PKCS12 truststore in the base cluster to a JKS truststore.
- DWX-5241: Intermittent query execution failures
- Problem: Query execution might fail with return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask, accompanied by multiple Vertex errors.
- Workaround: To address this issue, set either of the following properties to false at the session level:
Impala Virtual Warehouses on CDW Private Cloud
- DWX-5172 and DWX-5229: Hue UI does not load
- Problem: On Impala Virtual Warehouses, the Hue UI does not load if unsecured LDAP or OpenLDAP is used for authentication.
- Workaround: To load the Hue UI for Impala Virtual Warehouses, use Active Directory LDAP over SSL (LDAPS) for authentication.
Hue on CDW Private Cloud
- DWX-5229 and DWX-5172: Hue only supports Active Directory LDAP over SSL authentication
- Logging in to the Hue UI on CDW Private Cloud version 1.0 fails if OpenLDAP is the configured authentication method. Hue supports only Active Directory LDAP over SSL in CDW Private Cloud version 1.0.