Providing the Hive password through a file

You can save the Hive password in a file and then set up Sqoop to use this password for Sqoop-Hive import processes when LDAP authentication is enabled.

  1. Create a file containing the appropriate Hive password. Set '400' permissions on the file so that only the file owner has read access.
    You can save this file either in a local file system or on HDFS.
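    Step 1 can be sketched as follows. The file name, password value, and HDFS destination are placeholders; substitute values appropriate for your environment.

    ```shell
    # Write the Hive password to a file (placeholder values shown).
    printf 'guest-password' > hivepasswd-storefile

    # Restrict the file so that only the owner can read it.
    chmod 400 hivepasswd-storefile

    # Optionally, copy the file to HDFS so that it is reachable from the cluster:
    # hdfs dfs -put hivepasswd-storefile /user/hrt_qa/hivepasswd-storefile
    ```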
  2. In the Sqoop import command, specify the --hive-password-file argument with the path to the password file that you created in the previous step.
    /opt/cloudera/parcels/CDH/bin/sqoop import \
      -Dsqoop.beeline.env.preserve=KRB5CCNAME \
      --connection-manager org.apache.sqoop.manager.PostgresqlManager \
      --connect "jdbc:postgresql://db.foo.com:5432/employees" \
      --username [***USERNAME***] \
      --password [***PASSWORD***] \
      --table employees \
      --warehouse-dir /user/hrt_qa/test-sqoop \
      --hive-import \
      --delete-target-dir \
      --hive-overwrite \
      --external-table-dir hdfs:///warehouse/tablespace/external/hive/employees \
      --hs2-url "jdbc:hive2://[***HOST***]:[***PORT***];serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=[***TRUSTSTORE PATH***];trustStorePassword=[***TRUSTSTORE PASSWORD***]" \
      --hive-user guest \
      --hive-password-file /user/hrt_qa/hivepasswd-storefile \
      -m 1