Known Issues in Apache Hive

Learn about the known issues in Hive, the impact or changes to the functionality, and the workaround.

HIVE-26736: Authorization failure for nested Views having WITH clause
This is a known issue in 7.1.7 SP2; no workaround is available for SP2. The issue is fixed by HOTFIX-5396 (CHF21) and will also be fixed in SP2 CHF1.
None.
HIVE-26837: CTLT with hive.create.as.external.legacy as true creates managed table instead of external table
This is a known issue in 7.1.7 SP2; no workaround is available for SP2. The issue is fixed by HOTFIX-5439 (CHF24) and will also be fixed in SP2 CHF1.
None.
HIVE-26799: Make authorizations on custom UDFs involved in tables/views configurable.
This is a known issue in 7.1.7 SP2; no workaround is available for SP2. The issue is fixed by HOTFIX-5438 and will also be fixed in SP2 CHF1.
None.
HIVE-24188: CTLT from MM to External or External to MM are failing with hive.strict.managed.tables & hive.create.as.acid.
This is a known issue in 7.1.7 SP2; no workaround is available for SP2. The issue is fixed by HOTFIX-5439 (CHF24) and will also be fixed in SP2 CHF1.
None.
CDPD-41274: HWC + Oozie issue: Could not open client transport with JDBC Uri
Currently only Spark cluster mode is supported in the Oozie Spark Action with Hive Warehouse Connector (HWC).
Use the Spark action in cluster mode, as shown in the following example:
<spark xmlns="uri:oozie:spark-action:1.0">
    ...
    <mode>cluster</mode>
    ...
</spark>
CDPD-21365: Performing a drop catalog operation drops the catalog from the CTLGS table. The DBS table has a foreign key reference on CTLGS for CTLG_NAME. Because of this, the DBS table is locked, which creates a deadlock.
You must create an index on the CTLG_NAME column of the DBS table: CREATE INDEX CTLG_NAME_DBS ON DBS(CTLG_NAME);
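For example, a minimal sketch of applying this workaround directly against the backing Hive Metastore database (MySQL syntax and the metastore database name hive are assumptions; adjust for your RDBMS):
-- Run against the Hive Metastore backend RDBMS; database name 'hive' is an assumption
USE hive;
-- Index DBS.CTLG_NAME, the foreign key column that references CTLGS
CREATE INDEX CTLG_NAME_DBS ON DBS(CTLG_NAME);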
CDPD-26556: After an upgrade, querying a CTAS table under certain conditions might throw an exception
If you upgrade your Hive cluster from CDH 6 to CDP 7 and then create a CTAS table in the CDP cluster from a table you upgraded from CDH, you might see the following exception when you query the new table:
class org.apache.hadoop.io.IntWritable cannot be cast to class org.apache.hadoop.hive.serde2.objectinspector.StandardUnionObjectInspector$StandardUnion        

This issue affects CDH-based tables that have columns of the complex types ARRAY, MAP, and STRUCT.

CDPD-23506: OutOfMemoryError in LLAP
Long-running spark-shell applications can leave sessions open in interactive HiveServer2 until the Spark application finishes (the user exits spark-shell), causing memory pressure when a high number of queries (1000+) run in the same shell.
You must close spark-shell so that sessions are closed. Add the owner of the database or the tables as a user with read or read/write access to the tables directly.
CDPD-23041: DROP TABLE on a table having an index does not work
If you migrate a Hive table that has an index to CDP, DROP TABLE does not drop the table. Hive no longer supports indexes (HIVE-18448). A foreign key constraint on the indexed table prevents dropping the table. Attempting to drop such a table results in the following error:
java.sql.BatchUpdateException: Cannot delete or update a parent row: a foreign key constraint fails ("hive"."IDXS", CONSTRAINT "IDXS_FK1" FOREIGN KEY ("ORIG_TBL_ID") REFERENCES "TBLS ("TBL_ID"))
There are two workarounds:
  • Drop the foreign key "IDXS_FK1" on the "IDXS" table within the metastore, as shown in the sketch after this list. You can also manually drop indexes, but do not cascade any drops because the IDXS table includes references to "TBLS".
  • Launch an older version of Hive, such as Hive 2.3 that includes IDXS in the DDL, and then drop the indexes as described in Language Manual Indexing.
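A minimal sketch of the first workaround, run against the backing Hive Metastore database (MySQL syntax is an assumption; back up the metastore database before modifying it):
-- Drop the foreign key on IDXS that references TBLS and blocks DROP TABLE
ALTER TABLE IDXS DROP FOREIGN KEY IDXS_FK1;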
Apache Issue: HIVE-24815
CDPD-17766: Queries fail when using spark.sql.hive.hiveserver2.jdbc.url.principal in the JDBC URL to invoke Hive.
Do not specify spark.sql.hive.hiveserver2.jdbc.url.principal in the JDBC URL to invoke Hive remotely.
Workaround: specify principal=hive.server2.authentication.kerberos.principal as shown in the following syntax:
jdbc:hive2://<host>:<port>/<dbName>;principal=hive.server2.authentication.kerberos.principal;<otherSessionConfs>?<hiveConfs>#<hiveVars>
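For example, with a hypothetical host and Kerberos principal value:
jdbc:hive2://hs2-host.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM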
CDPD-13636: Hive job fails with OutOfMemory exception in the Azure DE cluster
Set the parameter hive.optimize.sort.dynamic.partition.threshold=0. Add this parameter in Cloudera Manager under Hive Service Advanced Configuration Snippet (Safety Valve) for hive-site.xml.
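A minimal sketch of the hive-site.xml property that the safety valve entry adds (the property name and value come from the workaround above; the XML wrapper is the standard hive-site.xml property format):
<property>
    <name>hive.optimize.sort.dynamic.partition.threshold</name>
    <value>0</value>
</property>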
CDPD-10848: HiveServer Web UI displays incorrect data
If you enabled auto-TLS for TLS encryption, the HiveServer2 Web UI does not display the correct data in the following tables: Active Sessions, Open Queries, Last Max n Closed Queries