Livy interpreter configuration

You can use the Hive Warehouse Connector (HWC) in Zeppelin notebooks with the Livy interpreter by modifying or adding properties to the Livy interpreter settings. The Livy interpreter accesses processing engines and data sources from the Zeppelin UI.

Requirements

  • Configurations require a livy prefix.
  • A reference to the HWC jar on the local file system is required.

Interpreter properties

  • livy.spark.hadoop.hive.metastore.uris -

    thrift://<domain name>:<port>

    Example: thrift://hwc-secure-1.hwc-secure.root.hwx.site:9083

  • livy.spark.jars -

    local:/opt/cloudera/parcels/<version>/jars/hive-warehouse-connector-assembly-<version>.jar

    Example: local:/opt/cloudera/parcels/CDH-7.2.1-1.cdh7.2.1.p0.4847773/jars/hive-warehouse-connector-assembly-1.0.0.7.2.1.0-327.jar

    Use the local: scheme so that the jar is read from the local file system.

  • livy.spark.kryo.registrator - com.qubole.spark.hiveacid.util.HiveAcidKyroRegistrator

  • livy.spark.security.credentials.hiveserver2.enabled - true
  • livy.spark.sql.extensions - com.qubole.spark.hiveacid.HiveAcidAutoConvertExtension
  • livy.spark.sql.hive.hiveserver2.jdbc.url -

    jdbc:hive2://<domain name>:<port>/default;retries=5;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

    Example: jdbc:hive2://hwc-secure-1.hwc-secure.root.hwx.site:2181/default;retries=5;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2

  • livy.spark.sql.hive.hiveserver2.jdbc.url.principal - hive/_HOST@ROOT.HWX.SITE
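After you save the properties and restart the Livy interpreter, you can check the configuration from a notebook paragraph. The following is a minimal sketch that assumes the standard HWC Scala API (com.hortonworks.hwc.HiveWarehouseSession) and a hypothetical table name; the paragraph prefix (%livy.spark or %livy2.spark) depends on how your Livy interpreter is named.

  %livy.spark
  import com.hortonworks.hwc.HiveWarehouseSession

  // Build an HWC session from the SparkSession that Livy exposes as "spark".
  val hive = HiveWarehouseSession.session(spark).build()

  // List databases through HiveServer2 to confirm connectivity.
  hive.showDatabases().show()

  // Query a table; "default.sample_table" is a hypothetical example.
  hive.executeQuery("SELECT * FROM default.sample_table LIMIT 10").show()

If the properties are set correctly, the paragraph returns results from HiveServer2; errors about a missing HiveWarehouseSession class usually indicate that livy.spark.jars does not point to the HWC jar on the local file system.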