Learn how to use an alias to represent the Hive password during a Sqoop-Hive
import when LDAP authentication is enabled.
The Hive password is stored in a Credential Provider facility and associated with
an alias. During the import, Sqoop resolves the alias and uses the linked
password.
1. Using the CredentialProvider API, store the Hive password in a user-specified
provider path and associate it with an alias.
/opt/cloudera/parcels/CDH/lib/hadoop/bin/hadoop credential \
create sqoophive.password.alias \
-value guest-password \
-provider jceks://hdfs/user/hive/sqoophivepasswd.jceks
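If you prefer not to pass the password on the command line, omit the -value
option and the hadoop credential command prompts for it instead, keeping it out
of the shell history. You can confirm that the alias was stored by listing the
provider's contents:
/opt/cloudera/parcels/CDH/lib/hadoop/bin/hadoop credential list \
-provider jceks://hdfs/user/hive/sqoophivepasswd.jceks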
2. In the Sqoop import command, add the provider path property, pointing to the
credential provider URI that must be consulted when resolving the credential
alias.
-D hadoop.security.credential.provider.path=<***PROVIDER PATH***>
For example:
-D hadoop.security.credential.provider.path=jceks://hdfs/user/hive/sqoophivepasswd.jceks \
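If credentials are kept in more than one store, this property accepts a
comma-separated list of provider URIs, and the providers are consulted in order
when the alias is resolved. For example (the second provider path here is only
illustrative):
-D hadoop.security.credential.provider.path=jceks://hdfs/user/hive/sqoophivepasswd.jceks,jceks://hdfs/user/hive/other.jceks \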
3. While creating the Sqoop import command, specify the --hive-password-alias
argument with the alias name that you want to resolve.
/opt/cloudera/parcels/CDH/bin/sqoop import \
-Dsqoop.beeline.env.preserve=KRB5CCNAME \
-D hadoop.security.credential.provider.path=jceks://hdfs/user/hive/sqoophivepasswd.jceks \
--connection-manager org.apache.sqoop.manager.PostgresqlManager \
--connect "jdbc:postgresql://db.foo.com:5432/employees" \
--username [***USERNAME***] \
--password [***PASSWORD***] \
--table employees \
--warehouse-dir /user/hrt_qa/test-sqoop \
--hive-import \
--delete-target-dir \
--hive-overwrite \
--external-table-dir hdfs:///warehouse/tablespace/external/hive/employees \
--hs2-url "jdbc:hive2://[***HOST***]:[***PORT***];serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=[***TRUSTSTORE PATH***];trustStorePassword=[***TRUSTSTORE PASSWORD***]" \
--hive-user guest \
--hive-password-alias sqoophive.password.alias \
-m 1
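Sqoop applies the same mechanism to the database password: the --password-alias
argument resolves an alias against the same
hadoop.security.credential.provider.path, so the plaintext --password value can
be avoided as well. A minimal sketch, assuming a hypothetical alias
sqoopdb.password.alias stored in the same provider:
/opt/cloudera/parcels/CDH/lib/hadoop/bin/hadoop credential \
create sqoopdb.password.alias \
-provider jceks://hdfs/user/hive/sqoophivepasswd.jceks
In the import command, you would then replace --password [***PASSWORD***] with
--password-alias sqoopdb.password.alias.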