Configuring Your Cluster for Workflow Manager View
For Workflow Manager View to access HDFS, the Ambari Server daemon hosting the view must act as the proxy user for HDFS. This allows Ambari to submit requests to HDFS on behalf of the Workflow Manager View users.
Each Workflow Manager View user must have a user directory set up in HDFS.
If the cluster is configured for Kerberos, ensure that the section called “Configuring Views for Kerberos” has been completed.
Set up HDFS Proxy User
Note: If you previously set up the proxy user for another View, you can skip this task.
To set up an HDFS proxy user for the Ambari Server daemon account, you need to configure the proxy user in the HDFS configuration. This configuration is determined by the account name the ambari-server daemon is running as. For example, if your ambari-server is running as root, you set up an HDFS proxy user for root.
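If you are not sure which account the ambari-server daemon runs under, one quick way to check is shown below. This is a minimal sketch; the PID file path is the typical default for an Ambari installation and may differ on your system.
# Print the account that owns the running ambari-server process.
ps -o user= -p "$(cat /var/run/ambari-server/ambari-server.pid)"
# Alternatively, search the process list directly.
ps -ef | grep [a]mbari-server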
Steps
In Ambari Web, browse to Services > HDFS > Configs.
Under the Advanced tab, navigate to the Custom core-site section.
Click Add Property… to add the following custom properties:
hadoop.proxyuser.root.groups="users"
hadoop.proxyuser.root.hosts=ambari-server.hostname
Notice that the ambari-server daemon account name root is part of the property name. Be sure to modify this property name for the account name you are running the ambari-server daemon as. For example, if you were running the ambari-server daemon under an account name of ambariusr, you would use the following properties instead:
hadoop.proxyuser.ambariusr.groups="users"
hadoop.proxyuser.ambariusr.hosts=ambari-server.hostname
Similarly, if you have configured Ambari Server for Kerberos, be sure to modify this property name for the primary Kerberos principal user. For example, if ambari-server is setup for Kerberos using principal ambari-server@EXAMPLE.COM, you would use the following properties instead:
hadoop.proxyuser.ambari-server.groups="users"
hadoop.proxyuser.ambari-server.hosts=ambari-server.hostname
Save the configuration change and restart the required components as indicated by Ambari.
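After the restart, you can optionally confirm that the proxy-user properties are visible to the Hadoop clients. As a minimal sketch, assuming the ambari-server account is root as in the first example, run the following on any cluster host that has the HDFS client installed:
# Print the effective value of each proxy-user property from the client configuration.
hdfs getconf -confKey hadoop.proxyuser.root.groups
hdfs getconf -confKey hadoop.proxyuser.root.hosts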
Set up HDFS User Directory
You must set up a directory for each user that accesses the Workflow Manager View. Workflow Manager View stores user metadata in the user directory in HDFS.
About This Task
By default, the location in HDFS for the user directory is /user/${username}, where ${username} is the username of the currently logged in user that is accessing Workflow Manager View.
Important: Since many users leverage the default Ambari admin user for getting started with Ambari, you should also create a /user/admin directory in HDFS using the steps below.
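Before creating any directories, you may find it helpful to see which user directories already exist and who owns them. A quick check, assuming you are on a host with the HDFS client installed:
# List existing user directories along with their owners and groups.
hadoop fs -ls /user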
Steps
Connect to a host in the cluster that includes the HDFS client.
Switch to the HDFS system account user.
su - hdfs
If working on a secure Kerberos cluster:
Destroy any existing Kerberos tickets:
kdestroy
If no ticket is found, you get an error message that you can ignore: No credentials cache file found while destroying cache
Obtain a Kerberos ticket-granting ticket:
kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
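To confirm that the ticket-granting ticket was obtained, you can optionally display the credentials cache:
# Show the cached Kerberos tickets for the current user.
klist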
Using the HDFS client, make an HDFS directory for the user.
For example, if your username is wfmadmin, you would create the following directory.
hadoop fs -mkdir /user/wfmadmin
Set the ownership on the newly created directory.
For example, if your username is wfmadmin, the directory owner should be wfmadmin:hadoop.
hadoop fs -chown wfmadmin:hadoop /user/wfmadmin
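You can optionally verify the new directory and its ownership before continuing. For example, for the wfmadmin directory created above:
# Show the directory entry itself, including its owner and group.
hadoop fs -ls -d /user/wfmadmin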
Log in as the root user.
su -
Access the Kerberos administration system.
kadmin.local
Create a new principal and password for the user.
You can use the same user name and password that you used for the HDFS directory.
addprinc -pw [wfmadmin-password] wfmadmin@EXAMPLE.COM
Repeat steps 2 through 8 for any additional Workflow Manager View users.
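To confirm that a principal was created, you can optionally run the following inside kadmin.local before exiting. The exact output format depends on your Kerberos distribution.
# List principals that match the new user (run inside kadmin.local).
listprincs wfmadmin*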