Hive
You can review the list of reported issues that are fixed for Hive in 7.3.1.100.
- CDPD-74456: Spark3 hwc.setDatabase() writes to the correct database
- When setting the database using hive.setDatabase("DB") and performing CREATE TABLE or write operations with Hive Warehouse Connector (HWC), the operations were executed in the default database instead of the specified one. This issue is resolved, and the operations now run in the correct database.
- CDPD-74373: Timestamp displays incorrectly in Spark HWC with JDBC_READER mode
- When using Spark HWC with JDBC_READER mode, timestamps were displayed incorrectly. For example, 0001-01-01 00:00:00.0 was interpreted as 0000-12-30 00:00:00.
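The two-day shift in the example matches the offset between the Julian and proleptic Gregorian calendars around year 1, a typical symptom when the reader and writer disagree on the calendar model for very old dates. A minimal, self-contained sketch of the arithmetic (illustrative only, not HWC code):

```python
# Julian day number (JDN) conversions using standard integer formulas.
# Demonstrates how reading a Julian-calendar day count as a proleptic
# Gregorian date shifts 0001-01-01 to 0000-12-30.

def gregorian_to_jdn(y, m, d):
    # Proleptic Gregorian calendar date -> JDN.
    a = (14 - m) // 12
    yy = y + 4800 - a
    mm = m + 12 * a - 3
    return d + (153 * mm + 2) // 5 + 365 * yy + yy // 4 - yy // 100 + yy // 400 - 32045

def julian_to_jdn(y, m, d):
    # Julian calendar date -> JDN.
    a = (14 - m) // 12
    yy = y + 4800 - a
    mm = m + 12 * a - 3
    return d + (153 * mm + 2) // 5 + 365 * yy + yy // 4 - 32083

def jdn_to_gregorian(jdn):
    # JDN -> proleptic Gregorian (year, month, day).
    a = jdn + 32044
    b = (4 * a + 3) // 146097
    c = a - 146097 * b // 4
    d = (4 * c + 3) // 1461
    e = c - 1461 * d // 4
    m = (5 * e + 2) // 153
    day = e - (153 * m + 2) // 5 + 1
    month = m + 3 - 12 * (m // 10)
    year = 100 * b + d - 4800 + m // 10
    return year, month, day

# A day count written under the Julian calendar model for 0001-01-01,
# read back as if it were a proleptic Gregorian day:
jdn = julian_to_jdn(1, 1, 1)
print(jdn_to_gregorian(jdn))  # (0, 12, 30) -> the "0000-12-30" in the bug report
```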
- CDPD-76932: Incorrect query results due to TableScan merge in shared work optimizer
- During shared work optimization, TableScan operators were merged even when they had different Dynamic Partition Pruning (DPP) parent operators. This caused the filter from the missing DPP operator to be ignored, leading to incorrect query results.
- CDPD-78115: Thread safety issue in HiveSequenceFileInputFormat
- Due to a thread safety issue in HiveSequenceFileInputFormat, concurrent queries returned incorrect results when query result caching was disabled.
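The hazard behind this class of bug is generic: per-query state kept in a field shared across concurrent readers can be clobbered by another query. A minimal Python sketch of the problem and the fix pattern (illustrative only, not Hive's actual implementation):

```python
import threading

# Hypothetical sketch: an input format that stores per-query state in a
# shared instance field is unsafe when two queries use the same instance.
class UnsafeInputFormat:
    def __init__(self):
        self.files = None          # shared mutable state: the bug

    def set_files(self, files):
        self.files = files         # query B may overwrite query A's list

    def read(self):
        return list(self.files)    # may return another query's files

# Fix pattern: keep the per-query file list on the call, not the instance.
class SafeInputFormat:
    def read(self, files):
        return list(files)         # no shared state, safe under concurrency

def run_query(fmt, files, results, i):
    results[i] = fmt.read(files)

fmt = SafeInputFormat()
results = [None, None]
t1 = threading.Thread(target=run_query, args=(fmt, ["a.seq"], results, 0))
t2 = threading.Thread(target=run_query, args=(fmt, ["b.seq"], results, 1))
t1.start(); t2.start(); t1.join(); t2.join()
print(results)  # [['a.seq'], ['b.seq']] -> each query sees its own files
```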
- CDPD-78129: Materialized view rebuild failure due to stale locks
- If a materialized view rebuild was aborted, its lock entry in the materialization_rebuild_locks table was not removed. This blocked subsequent rebuilds of the same materialized view with the following error:
Error: Error while compiling statement: FAILED: SemanticException org.apache.hadoop.hive.ql.parse.SemanticException: Another process is rebuilding the materialized view view_name (state=42000, code=40000)
- CDPD-78166: Residual operator tree in shared work optimizer causes dynamic partition pruning errors
- Shared work optimizer left unused operator trees that sent dynamic partition pruning events to non-existent operators. This caused query failures when processing these events, leading to errors in building the physical operator tree.
- CDPD-78113: Conversion failure from RexLiteral to ExprNode for empty strings
- Conversion from RexLiteral to ExprNode failed when the literal was an empty string, causing cost-based optimization to fail for such queries.