Configuring DataNode SASL
Use the following steps to configure DataNode SASL so that the DataNode can run securely as a non-root user:
1. Shut Down the DataNode
Shut down the DataNode using the applicable commands in the "Controlling HDP Services Manually" section of the HDP Reference Guide.
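The exact commands are in the referenced guide. As an illustration only, on an HDP 2.x-style installation (the path below is an assumption and may differ in your environment), stopping a DataNode that is still running as root might look like this:
# Run as root: before SASL is enabled, a secure DataNode is started as root
# so that it can bind to privileged ports.
/usr/hdp/current/hadoop-hdfs-datanode/../hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf stop datanode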
2. Enable SASL
Configure the following properties in the /etc/hadoop/conf/hdfs-site.xml file to enable DataNode SASL.
The dfs.data.transfer.protection property enables DataNode SASL. You can set this property to one of the following values:
authentication -- Establishes mutual authentication between the client and the server.
integrity -- In addition to authentication, guarantees that a man-in-the-middle cannot tamper with messages exchanged between the client and the server.
privacy -- In addition to the features offered by authentication and integrity, fully encrypts the messages exchanged between the client and the server.
In addition to setting a value for the dfs.data.transfer.protection property, you must set the dfs.http.policy property to HTTPS_ONLY.
You must also specify ports for the DataNode RPC and HTTP servers.
Note: For more information on configuring SSL, see the sections "Creating and Managing SSL Certificates" and "Enabling SSL for HDP Components" in the Hadoop Security Guide.
For example:
<property>
  <name>dfs.data.transfer.protection</name>
  <value>integrity</value>
</property>
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:10019</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:10022</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
Note: If you are already using the encryption setting dfs.encrypt.data.transfer=true, be aware that it is similar to dfs.data.transfer.protection=privacy. These two settings are mutually exclusive, so you should not have both of them set.
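As an optional sanity check (not part of the original procedure), you can print the effective values with the hdfs getconf utility, assuming the HDFS client scripts are on the PATH of the DataNode host:
# Print the effective values read from /etc/hadoop/conf/hdfs-site.xml
hdfs getconf -confKey dfs.data.transfer.protection
hdfs getconf -confKey dfs.http.policy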
3. Update Environment Settings
Edit the following setting in the /etc/hadoop/conf/hadoop-env.sh file, as shown below:
#On secure datanodes, user to run the datanode as after dropping privileges
export HADOOP_SECURE_DN_USER=
The export HADOOP_SECURE_DN_USER=hdfs line enables the legacy security configuration; the variable must be set to an empty value for SASL to be enabled.
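As a quick optional check, you can confirm that the variable is no longer set to a user name:
# The export line should show an empty value, i.e. export HADOOP_SECURE_DN_USER=
grep HADOOP_SECURE_DN_USER /etc/hadoop/conf/hadoop-env.sh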
4. Start the DataNode
Start the DataNode services using the applicable commands in the "Controlling HDP Services Manually" section of the HDP Reference Guide.
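Again, the authoritative commands are in the referenced guide. As an illustrative sketch under the same HDP 2.x path assumption, with SASL enabled the DataNode can now be started directly as the hdfs user rather than as root:
# Start the DataNode as the non-root hdfs user (path is an assumption)
su -l hdfs -c "/usr/hdp/current/hadoop-hdfs-datanode/../hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start datanode"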