Prerequisites for adding classic clusters
Before you try to add or register a classic cluster in CDP, verify that it meets the version requirements and install or configure everything listed below.
Check the version compatibility
Make sure the following version requirements are met.
- For adding HDP clusters:
- HDP - HDP 18.104.22.168000 plus any required patch version
- DLM Engine version - 22.214.171.124-x
- Ambari - 2.7.3 or 126.96.36.199 or 2.6.2
- For adding CDH clusters:
- CDH - 5.x and 6.x
- Cloudera Manager - 5.x and 6.x
- For more information about support for databases, operating systems, and processors, see the Cloudera Support Matrix.
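One way to confirm the versions above is to query the management consoles' REST APIs. The sketch below, a hypothetical helper rather than a CDP-provided tool, parses the response of the Ambari endpoint `GET /api/v1/services/AMBARI/components/AMBARI_SERVER` and compares dotted version strings; Cloudera Manager exposes its version similarly under `/api/<version>/cm/version`.

```python
import json

def ambari_server_version(payload: str) -> str:
    """Extract the Ambari server version from the JSON returned by
    GET /api/v1/services/AMBARI/components/AMBARI_SERVER."""
    data = json.loads(payload)
    return data["RootServiceComponents"]["component_version"]

def meets_minimum(version: str, minimum: str) -> bool:
    """Compare dotted version strings numerically, component by component."""
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return to_tuple(version) >= to_tuple(minimum)

# Example response body (shape as documented in the Ambari REST API):
sample = '{"RootServiceComponents": {"component_name": "AMBARI_SERVER", "component_version": "2.7.3.0"}}'
print(ambari_server_version(sample))                          # 2.7.3.0
print(meets_minimum(ambari_server_version(sample), "2.6.2"))  # True
```

Run the same comparison against each classic cluster before attempting registration; a version below the documented minimum means the registration will fail or be unsupported.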
Verify the required roles are assigned
- Make sure the user can log in to Ambari as the admin user in the HDP/HDF cluster.
- Make sure the user can log in to Cloudera Manager as the admin user in the CDH cluster and HDP cluster.
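Both Ambari and Cloudera Manager accept HTTP Basic authentication on their REST APIs, so the admin login can be checked without opening the web UI. This sketch (hostnames and credentials are placeholders) builds the `Authorization` header; a `200` response to the listed endpoints confirms the credentials work.

```python
import base64

def basic_auth_header(user: str, password: str) -> str:
    """Build the HTTP Basic Authorization header value that Ambari and
    Cloudera Manager accept on their REST APIs."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

# With this header set, an HTTP 200 from
#   http://<ambari-host>:8080/api/v1/users/admin   (Ambari)
#   http://<cm-host>:7180/api/v32/users            (Cloudera Manager)
# confirms that the admin login works. Hosts above are placeholders.
print(basic_auth_header("admin", "admin"))  # Basic YWRtaW46YWRtaW4=
```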
Install the required components
- Make sure Metrics Server is installed in Cloudera Manager in the CDH clusters.
- Make sure Ambari Metrics is installed in Ambari in the HDP/HDF clusters.
- Make sure Ambari is configured with LDAP.
- Make sure Knox is installed and the default topology is configured with LDAP.
- Make sure there is at least one topology in the Knox setup with the same LDAP as Ambari.
- If there are policies restricting access through Knox, make sure Ranger policies allow communication through Knox.
- Make sure the cluster is configured with Kerberos.
- Make sure the user credential used for registering the classic cluster is a valid LDAP user with an admin role in Ambari.
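To confirm that a Knox topology uses the same LDAP as Ambari, you can inspect the topology XML for the `ShiroProvider` authentication provider and read its `main.ldapRealm.contextFactory.url` parameter. The helper below is an illustrative sketch, not a Knox or CDP utility; the sample topology and LDAP URL are made up.

```python
import xml.etree.ElementTree as ET

def knox_ldap_url(topology_xml: str):
    """Return the LDAP URL configured in a Knox topology's ShiroProvider,
    or None if the topology has no LDAP-backed authentication provider.
    Compare the result against Ambari's LDAP server to confirm they match."""
    root = ET.fromstring(topology_xml)
    for provider in root.iter("provider"):
        if provider.findtext("name") == "ShiroProvider":
            for param in provider.iter("param"):
                if param.findtext("name") == "main.ldapRealm.contextFactory.url":
                    return param.findtext("value")
    return None

sample_topology = """<topology><gateway><provider>
  <role>authentication</role><name>ShiroProvider</name><enabled>true</enabled>
  <param><name>main.ldapRealm.contextFactory.url</name>
  <value>ldap://ldap.example.com:389</value></param>
</provider></gateway></topology>"""
print(knox_ldap_url(sample_topology))  # ldap://ldap.example.com:389
```

If the returned URL differs from the LDAP server configured in Ambari, registration through Knox will fail for users defined only in one directory.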
Open ports for CCM
If you plan to use Cluster Connectivity Manager (CCM), make sure that outbound traffic from the legacy CDH/HDP cluster is allowed on ports 6000-6049. For more information about CCM, see the Cluster Connectivity Manager documentation.
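Because the classic cluster initiates the connection to CCM, the check is for outbound TCP reachability. A minimal sketch, run from a node on the legacy cluster (the CCM hostname below is a placeholder, not a real endpoint):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if an outbound TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refusal, timeout, and DNS failure
        return False

# To sweep the whole CCM range from the classic cluster, e.g.:
#   blocked = [p for p in range(6000, 6050) if not can_reach("ccm.example.com", p)]
# Any ports left in `blocked` must be opened in the egress firewall rules.
```

Tools such as `nc -zv <host> <port>` give an equivalent one-off check from the shell.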