Configuring Authentication
The purpose of authentication in Hadoop, as in other systems, is simply to prove that a user or service is actually who it claims to be.
Typically, authentication in enterprises is managed through a single distributed system, such as a Lightweight Directory Access Protocol (LDAP) directory. LDAP authentication consists of straightforward username/password services backed by a variety of storage systems, ranging from flat files to databases.
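
As a minimal illustration of this model, the following Java sketch performs a simple LDAP bind: it presents a username (as a DN) and password and lets the directory accept or reject the credentials. The server URL, user DN, and password are hypothetical placeholders, not values used by CDH or Cloudera Manager.

```java
import java.util.Hashtable;
import javax.naming.AuthenticationException;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;

public class LdapBindExample {
    public static void main(String[] args) {
        // Hypothetical directory URL, user DN, and password; replace with your own values.
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://ldap.example.com:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");   // username/password bind
        env.put(Context.SECURITY_PRINCIPAL, "uid=jdoe,ou=People,dc=example,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "secret-password");

        try {
            // A successful bind means the directory accepted the credentials.
            DirContext ctx = new InitialDirContext(env);
            System.out.println("Authenticated successfully");
            ctx.close();
        } catch (AuthenticationException e) {
            System.out.println("Invalid credentials: " + e.getMessage());
        } catch (NamingException e) {
            System.out.println("Directory error: " + e.getMessage());
        }
    }
}
```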
A common enterprise-grade authentication system is Kerberos. Kerberos provides strong security benefits, including capabilities that render intercepted authentication packets unusable by an attacker. It virtually eliminates the threat of impersonation because user credentials are never sent over the network in cleartext.
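
For example, a Java client on a secured cluster can authenticate to Kerberos from a keytab, so that no password crosses the network in cleartext (see Authenticating Kerberos Principals in Java Code below for details). This is only a sketch; the principal name and keytab path are placeholders.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginExample {
    public static void main(String[] args) throws IOException {
        // Tell the Hadoop client libraries to use Kerberos rather than simple authentication.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Placeholder principal and keytab path; substitute values for your realm.
        UserGroupInformation.loginUserFromKeytab(
                "jdoe@EXAMPLE.COM", "/etc/security/keytabs/jdoe.keytab");

        UserGroupInformation ugi = UserGroupInformation.getCurrentUser();
        System.out.println("Logged in as: " + ugi.getUserName());
    }
}
```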
Several components of the Hadoop ecosystem have converged on Kerberos authentication, with the option to manage and store credentials in LDAP or Active Directory (AD). Microsoft Active Directory, for example, is an LDAP directory that also provides Kerberos authentication for added security.
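
On a cluster that authenticates against an AD realm, for instance, each host's /etc/krb5.conf typically points the Kerberos libraries at the AD domain controllers as KDCs. The following is a minimal, hypothetical sketch; the realm and server names are placeholders (see Integrating Hadoop Security with Active Directory below for the supported configuration):

```
[libdefaults]
    default_realm = AD.EXAMPLE.COM
    dns_lookup_kdc = false

[realms]
    AD.EXAMPLE.COM = {
        kdc = ad-dc1.ad.example.com
        admin_server = ad-dc1.ad.example.com
    }

[domain_realm]
    .ad.example.com = AD.EXAMPLE.COM
    ad.example.com = AD.EXAMPLE.COM
```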
Continue reading:
- Configuring Authentication in Cloudera Manager
- Configuring Authentication in the Cloudera Navigator Data Management Component
- Configuring Authentication in CDH Using the Command Line
- Flume Authentication
- HBase Authentication
- HCatalog Authentication
- Hive Authentication
- HttpFS Authentication
- Hue Authentication
- Impala Authentication
- Llama Authentication
- Oozie Authentication
- Solr Authentication
- Spark Authentication
- Sqoop 2 Authentication
- ZooKeeper Authentication
- Hadoop Users in Cloudera Manager and CDH
- Configuring a Cluster-dedicated MIT KDC with Cross-Realm Trust
- Integrating Hadoop Security with Active Directory
- Integrating Hadoop Security with Alternate Authentication
- Authenticating Kerberos Principals in Java Code
- Using a Web Browser to Access a URL Protected by Kerberos HTTP SPNEGO
- Troubleshooting Kerberos Issues
- Troubleshooting Authentication Issues