Starting Hive using a password

You start the Hive shell using a Beeline command to query Hive as an end user authorized by Apache Ranger. As administrator, you set up the end user in the operating system and in Ranger.

Before starting Hive for the first time, check that you are covered by the Ranger policies required for basic operations, as shown in the following steps. All users need to be able to use the default database, perform operations such as listing database names, and query the information schema. The preloaded default database tables columns and information_schema database Ranger policies cover the group public (all users). If these policies are disabled, you cannot perform those basic operations. For example, if the default database tables columns policy is disabled, the following error appears when you try to use the default database:

hive> USE default;
Error: Error while compiling statement: FAILED: HiveAccessControlException 
Permission denied: user [hive] does not have [USE] privilege on [default]
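
If you are not sure whether these policies are in effect, you can confirm them with a few basic statements once you connect through Beeline (step 10 below). This is a minimal sketch; any user covered by the preloaded policies for group public should be able to run it:

USE default;
SHOW DATABASES;
SELECT table_name FROM information_schema.tables LIMIT 5;

If any of these statements fails with a HiveAccessControlException, review the corresponding Ranger policy as described in the following steps.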
  1. Access the Ranger Console: click the Ranger Admin web UI link, enter your user name and password, then click Sign In.
  2. On the far right, click Ranger > Hadoop SQL, and in Allow Conditions, edit all - database, table, column.
  3. Add your user or group name to Hive policies to grant full access to Hive.
    For example, add the admins group name to the list of groups that can access Hive.
  4. Check that the preloaded default database tables columns and information_schema database policies are enabled for group public.
  5. In Environments > your environment > your cluster, click the CM-URL.
  6. In Cloudera Manager, click Hosts > All Hosts.
  7. Make a note of the IP address or host name of a node in your cluster, for example myhost-vpc.cloudera.com.
  8. Use ssh to log into the cluster.
    For example:
    ssh myhost-vpc.cloudera.com
    You can get help about starting the Hive shell. On the command line, type
    hive -h
    Output is:
    Connect using simple authentication to HiveServer2 on localhost:10000
    beeline -u jdbc:hive2://localhost:10000 username password

    Connect using simple authentication to HiveServer2 on hs2.local:10012 using -n for username and -p for password
    beeline -n username -p password -u jdbc:hive2://hs2.local:10012

    Connect using Kerberos authentication with hive/localhost@mydomain.com as HiveServer2 principal
    beeline -u "jdbc:hive2://hs2.local:10013/default;principal=hive/localhost@mydomain.com"

    Connect using SSL connection to HiveServer2 on localhost at 10000
    beeline "jdbc:hive2://localhost:10000/default;ssl=true;sslTrustStore=/usr/local/truststore;trustStorePassword=mytruststorepassword"

    Connect using LDAP authentication
    beeline -u jdbc:hive2://hs2.local:10013/default <ldap-username> <ldap-password>
  9. Find the fully qualified domain name or IP address of a node in your cluster that runs HiveServer2: in Cloudera Manager, click Hosts > Role(s), and look through the list of roles to find HS2 (Hive on Tez HiveServer2).
    A node that has the HiveServer2 role is one whose name or IP address you can use in Beeline.
  10. Start the Hive shell.
    • If your cluster security is set up, use your own user name.
    • Otherwise, use the user name hive and no password.
    Substitute the name or IP address of your HiveServer host for 10.65.13.98.
    Simple authentication:
    beeline -u jdbc:hive2://10.65.13.98:10000 -n <your user name> -p
    Kerberos:
    beeline -u "jdbc:hive2://10.65.13.98:10000/default;principal=hive/_HOST@CLOUDERA.SITE"
  11. Enter Hive queries.
    SHOW DATABASES;
    CREATE TABLE students (name VARCHAR(64), age INT, gpa DECIMAL(3,2));
    INSERT INTO TABLE students VALUES ('fred flintstone', 35, 1.28), ('barney rubble', 32, 2.32);
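
To confirm that the statements in the last step succeeded, query the new table. This is a minimal check, assuming you created the students table exactly as shown above:

DESCRIBE students;
SELECT name, age, gpa FROM students;
SELECT name FROM students WHERE gpa > 2.0;

Because Beeline runs these statements as the user you connected with, a successful result also confirms that your user is covered by the Ranger policies needed for these operations.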