Kerberos configurations for HWC
You learn how to set up HWC for clusters that are secured by Kerberos and for clusters that are not. You set properties and values depending on the cluster or client JDBC mode you configure.
In CDP Private Cloud Base, you need to set HWC properties in configuration/spark-defaults.conf, depending on the execution mode you select. Alternatively, you can set the properties using the spark-submit/spark-shell --conf option.
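For example, the same properties can be passed on the command line instead of being added to spark-defaults.conf. The sketch below assumes a placeholder HWC assembly jar path and HiveServer2 JDBC URL:

    # Minimal sketch: setting HWC properties with --conf at launch time
    # (jar path and JDBC URL are placeholders, not documented values).
    spark-shell \
      --jars /path/to/hive-warehouse-connector-assembly.jar \
      --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://hs2host.example.com:10000/default"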
Secured cluster configuration
For Spark applications on a kerberized YARN cluster, set the following property: spark.sql.hive.hiveserver2.jdbc.url.principal. This property must be equal to hive.server2.authentication.kerberos.principal.
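As an illustration, the spark-defaults.conf entries below use a hypothetical principal (hive/_HOST@EXAMPLE.COM); whatever value you use must match hive.server2.authentication.kerberos.principal in your Hive configuration:

    # Sketch of spark-defaults.conf entries for a kerberized cluster
    # (host, port, and realm are placeholders).
    spark.sql.hive.hiveserver2.jdbc.url            jdbc:hive2://hs2host.example.com:10000/default
    spark.sql.hive.hiveserver2.jdbc.url.principal  hive/_HOST@EXAMPLE.COM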
In Spark cluster mode on a kerberized YARN cluster, use Spark ServiceCredentialProvider and set the following property:
- JDBC Cluster Mode in a secured cluster
  - Property: spark.security.credentials.hiveserver2.enabled
  - Value: true
  - Comment: true by default
- JDBC Client Mode in a secured cluster
  - Property: spark.security.credentials.hiveserver2.enabled
  - Value: false
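The following sketch shows how these settings might be passed with spark-submit for each mode; the deploy-mode flags and application jar name are placeholders rather than part of the documented configuration:

    # JDBC cluster mode on a secured cluster: Spark ServiceCredentialProvider
    # obtains HiveServer2 credentials (true is the default, shown for clarity).
    spark-submit --master yarn --deploy-mode cluster \
      --conf spark.security.credentials.hiveserver2.enabled=true \
      app.jar

    # JDBC client mode on a secured cluster: disable the credential provider.
    spark-submit --master yarn --deploy-mode client \
      --conf spark.security.credentials.hiveserver2.enabled=false \
      app.jar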
Unsecured cluster configuration
In an unsecured cluster, set the following property:
- Property: spark.security.credentials.hiveserver2.enabled
- Value: false
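For example, on an unsecured cluster this property could be set once in spark-defaults.conf; this is a sketch with a placeholder JDBC URL and no principal:

    # Unsecured (non-kerberized) cluster: credential provider disabled, no principal.
    spark.sql.hive.hiveserver2.jdbc.url            jdbc:hive2://hs2host.example.com:10000/default
    spark.security.credentials.hiveserver2.enabled false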