Upgrading HDP Manually

Configure Hadoop

Overwrite the default configuration files installed by the upgrade with your backed-up versions:

cp -r $YOUR_BACKUP_FOLDER/* /etc/hadoop/conf/

Where $YOUR_BACKUP_FOLDER is the directory where you saved your configuration files.
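
For example, a minimal restore sequence might look like the following. The conf.default.bak path is illustrative only, and on HDP 2.x /etc/hadoop/conf is typically a symlink managed by conf-select, so confirm it resolves to the directory you intend before copying:

# Keep a copy of the freshly installed defaults (illustrative path)
cp -r /etc/hadoop/conf/ /etc/hadoop/conf.default.bak

# Restore the saved configuration files over the defaults
cp -r $YOUR_BACKUP_FOLDER/* /etc/hadoop/conf/

# Spot-check that the restored files match the backup
diff -r $YOUR_BACKUP_FOLDER /etc/hadoop/conf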

RHEL/CentOS/Oracle Linux

  1. Use the HDP Utility script to calculate memory configuration settings. You must update the memory/CPU settings in yarn-site.xml and mapred-site.xml (see the sketch after these steps).

  2. Paths have changed in HDP 2.3. Make sure you remove old path specifications from hadoop-env.sh, such as:

    export JAVA_LIBRARY_PATH=/usr/lib/hadoop/lib/native/Linux-amd64-64

    If you leave these paths in your hadoop-env.sh file, the lzo compression code will not load, as this is not where lzo is installed.
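
For reference, a hedged sketch of step 1 follows. The script name and flags are taken from the HDP companion files documentation, and the node sizing shown is purely an example; verify both against your downloaded copy and your actual hardware before applying any values.

python hdp-configuration-utils.py -c 16 -m 64 -d 4 -k True
# -c cores per node, -m memory per node in GB, -d mounted data disks,
# -k True if HBase is installed on the node.
#
# The script prints recommended values for properties such as:
#   yarn-site.xml:   yarn.nodemanager.resource.memory-mb,
#                    yarn.scheduler.minimum-allocation-mb,
#                    yarn.scheduler.maximum-allocation-mb
#   mapred-site.xml: mapreduce.map.memory.mb, mapreduce.map.java.opts,
#                    mapreduce.reduce.memory.mb, mapreduce.reduce.java.opts
# Copy the printed values into the corresponding <property> entries in
# /etc/hadoop/conf/yarn-site.xml and /etc/hadoop/conf/mapred-site.xml.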

SLES

  1. Use the HDP Utility script to calculate memory configuration settings. You must update the memory/CPU settings in yarn-site.xml and mapred-site.xml (see the sketch following the RHEL/CentOS/Oracle Linux steps).

  2. Paths have changed in HDP 2.3. Make sure you remove old path specifications from hadoop-env.sh, such as:

    export JAVA_LIBRARY_PATH=/usr/lib/hadoop/lib/native/Linux-amd64-64

    If you leave these paths in your hadoop-env.sh file, the lzo compression code will not load, as this is not where lzo is installed.

Ubuntu/Debian

HDP support for Debian 6 is deprecated as of HDP 2.4.3. Future HDP releases will not be supported on Debian 6.

  1. Use the HDP Utility script to calculate memory configuration settings. You must update the memory/CPU settings in yarn-site.xml and mapred-site.xml.

  2. Paths have changed in HDP 2.4.3. Make sure you remove old path specifications from hadoop-env.sh, such as:

    export JAVA_LIBRARY_PATH=/usr/lib/hadoop/lib/native/Linux-amd64-64

    If you leave these paths in your hadoop-env.sh file, the lzo compression code will not load, as this is not where lzo is installed. A quick way to find and remove leftover entries is shown in the sketch below.
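
The commands below are a hedged sketch of this cleanup, assuming the active configuration lives in /etc/hadoop/conf; review the grep output before deleting anything.

# List hadoop-env.sh lines that still reference the old native-library path.
grep -n 'Linux-amd64-64' /etc/hadoop/conf/hadoop-env.sh

# Remove the stale export, keeping a .bak copy of the original file.
sed -i.bak '/JAVA_LIBRARY_PATH=.*Linux-amd64-64/d' /etc/hadoop/conf/hadoop-env.sh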