Configuring Hive on Spark for Hive CLI
To use the Hive CLI, perform the following tasks in Cloudera Manager.
- Go to the Hive service.
- Click HiveServer2 in the Status Summary section.
- On the HiveServer2 page, click the Processes tab.
- In the Configuration Files section, click hive-site.xml.
Cloudera Manager opens the file in a new tab.
- Go to the new tab and copy all properties in the hive-site.xml file.
- Return to the Cloudera Manager tab and click the breadcrumb link to go back to the Hive service page.
- Click the Configuration tab.
- Enter hive-site in the Search field.
- In the Hive Client Advanced Configuration Snippet (Safety Valve) text field, paste the content of the hive-site.xml file.
Keep only the following properties and delete the rest:
- hive.auto.convert.join
- hive.auto.convert.join.noconditionaltask.size
- hive.optimize.bucketmapjoin.sortedmerge
- hive.smbjoin.cache.rows
- hive.exec.reducers.max
- hive.vectorized.groupby.checkinterval
- hive.vectorized.groupby.flush.percent
- hive.compute.query.using.stats
- hive.vectorized.execution.enabled
- hive.vectorized.execution.reduce.enabled
- hive.merge.mapfiles
- hive.merge.mapredfiles
- hive.cbo.enable
- hive.fetch.task.conversion
- hive.fetch.task.conversion.threshold
- hive.limit.pushdown.memory.usage
- hive.merge.sparkfiles
- hive.merge.smallfiles.avgsize
- hive.merge.size.per.task
- hive.optimize.reducededuplication
- hive.optimize.reducededuplication.min.reducer
- hive.map.aggr
- hive.map.aggr.hash.percentmemory
- hive.optimize.sort.dynamic.partition
- spark.executor.memory
- spark.driver.memory
- spark.executor.cores
- spark.master
- spark.yarn.driver.memoryOverhead
- spark.yarn.executor.memoryOverhead
- spark.dynamicAllocation.enabled
- spark.dynamicAllocation.minExecutors
- spark.dynamicAllocation.initialExecutors
- hive.entity.capture.input.URI
- spark.shuffle.service.enabled
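For reference, each property you keep remains in the safety valve as a standard Hive property element. A minimal sketch of the expected format follows; the values shown are placeholders, not recommendations, so keep the values copied from your own hive-site.xml:

```xml
<!-- Placeholder values; retain the values copied from your hive-site.xml -->
<property>
  <name>hive.auto.convert.join</name>
  <value>true</value>
</property>
<property>
  <name>spark.executor.memory</name>
  <value>2g</value>
</property>
```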
- In the Search field, enter hive-env.
- In the Gateway Client Environment Advanced Configuration Snippet for hive-env.sh (Safety Valve) field, enter AUX_CLASSPATH=${AUX_CLASSPATH}:/etc/spark/conf.
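This environment line appends the Spark client configuration directory to Hive's auxiliary classpath so the Hive CLI can locate the Spark configuration. A minimal sketch of its effect, assuming a hypothetical pre-existing classpath value:

```shell
# Hypothetical pre-existing auxiliary classpath value
AUX_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hive/lib/example.jar
# The safety-valve line appends the Spark client configuration directory
AUX_CLASSPATH=${AUX_CLASSPATH}:/etc/spark/conf
echo "$AUX_CLASSPATH"
```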
- Click Save Changes to commit the changes.
- Click the restart icon to restart the Hive service and apply the changes.
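The copy-and-filter steps above can also be sketched as a small script. This is not Cloudera tooling, just an illustration: it parses a copied hive-site.xml and keeps only whitelisted properties. The `KEEP` set is abbreviated here and should contain the full list from the step above:

```python
# Sketch only: filter a copied hive-site.xml down to the properties
# to keep in the safety valve. The whitelist below is abbreviated.
import xml.etree.ElementTree as ET

KEEP = {
    "hive.auto.convert.join",
    "hive.cbo.enable",
    "spark.executor.memory",
    "spark.master",
    # ... remaining properties from the list above
}

def filter_hive_site(xml_text: str) -> str:
    """Return a <configuration> document containing only whitelisted properties."""
    root = ET.fromstring(xml_text)  # root element is <configuration>
    kept = [prop for prop in root.findall("property")
            if prop.findtext("name") in KEEP]
    out = ET.Element("configuration")
    out.extend(kept)
    return ET.tostring(out, encoding="unicode")
```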