Upgrading HDP Manually
Copyright © 2012-2015 Hortonworks, Inc.
Except where otherwise noted, this document is licensed under the Creative Commons Attribution-ShareAlike 3.0 License.
Hortonworks Data Platform (HDP) and any of its components are not anticipated to be combined with any hardware, software or data, except as expressly recommended in this documentation.
2015-06-09
Abstract
The Hortonworks Data Platform, powered by Apache Hadoop, is a massively scalable and 100% open source platform for storing, processing, and analyzing large volumes of data. It is designed to deal with data from many sources and formats in a quick, easy, and cost-effective manner. The Hortonworks Data Platform consists of the essential set of Apache Hadoop projects, including MapReduce, the Hadoop Distributed File System (HDFS), HCatalog, Pig, Hive, HBase, ZooKeeper, and Ambari. Hortonworks is the major contributor of code and patches to many of these projects. These projects have been integrated and tested as part of the Hortonworks Data Platform release process, and installation and configuration tools have also been included.
Unlike other providers of platforms built using Apache Hadoop, Hortonworks contributes 100% of our code back to the Apache Software Foundation. The Hortonworks Data Platform is Apache-licensed and completely open source. We sell only expert technical support, training and partner-enablement services. All of our technology is, and will remain, free and open source.
Please visit the Hortonworks Data Platform page for more information on Hortonworks technology. For more information on Hortonworks services, please visit either the Support or Training page. Feel free to contact us directly to discuss your specific needs.
Contents
- 1. Upgrade from HDP 2.1 to HDP 2.2 Manually
- 1. Getting Ready to Upgrade
- 2. Upgrade HDP 2.1 Components
- 3. Symlink Directories with hdp-select
- 4. Configure and Start Apache ZooKeeper
- 5. Configure Hadoop
- 6. Start Hadoop Core
- 7. Verify HDFS Filesystem Health
- 8. Configure YARN and MapReduce
- 9. Start YARN/MapReduce Services
- 10. Run Hadoop Smoke Tests
- 11. Configure and Start Apache HBase
- 12. Configure Apache Phoenix
- 13. Configure and Start Apache Accumulo
- 14. Configure and Start Apache Tez
- 15. Configure and Start Apache Hive and Apache HCatalog
- 16. Configure and Start Apache Oozie
- 17. Configure and Start Apache WebHCat
- 18. Configure Apache Pig
- 19. Configure and Start Apache Sqoop
- 20. Configure, Start, and Validate Apache Flume
- 21. Configure, Start, and Validate Apache Mahout
- 22. Configure and Start Hue
- 23. Configure and Start Apache Knox
- 24. Configure and Validate Apache Falcon
- 25. Configure and Start Apache Storm
- 26. Upgrade Apache Ranger
- 27. Finalize the Upgrade
- 28. Install New HDP 2.2 Services
- 2. Upgrade from HDP 2.0 to HDP 2.2 Manually
- 1. Getting Ready to Upgrade
- 2. Upgrade HDP 2.0 Components
- 3. Symlink Directories with hdp-select
- 4. Configure and Start Apache ZooKeeper
- 5. Configure Hadoop
- 6. Start Hadoop Core
- 7. Verify HDFS Filesystem Health
- 8. Configure YARN and MapReduce
- 9. Start YARN/MapReduce Services
- 10. Run Hadoop Smoke Tests
- 11. Configure and Start Apache HBase
- 12. Configure and Start Apache Hive and Apache HCatalog
- 13. Configure and Start Apache Oozie
- 14. Configure and Start Apache WebHCat (Templeton)
- 15. Configure and Start Apache Pig
- 16. Configure and Start Apache Sqoop
- 17. Configure, Start, and Validate Apache Flume
- 18. Configure, Start, and Validate Apache Mahout
- 19. Configure and Start Hue
- 20. Finalize the Upgrade
- 21. Install New HDP 2.2 Services
- 3. Upgrade from HDP 1.3 to HDP 2.2 Manually
- 1. Getting Ready to Upgrade
- 2. Upgrade HDP 1.3 Components
- 3. Symlink Directories with hdp-select
- 4. Configure and Start Apache ZooKeeper
- 5. Configure and Start Hadoop
- 6. Migrate the HDP Configurations
- 7. Create Local Directories
- 8. Start Hadoop Core
- 9. Verify HDFS Filesystem Health
- 10. Configure YARN and MapReduce
- 11. Start YARN/MapReduce Services
- 12. Run Hadoop Smoke Tests
- 13. Configure and Start Apache HBase
- 14. Configure and Start Apache Hive and Apache HCatalog
- 15. Configure and Start Apache Oozie
- 16. Configure and Start Apache WebHCat (Templeton)
- 17. Configure and Start Apache Pig
- 18. Configure and Start Apache Sqoop
- 19. Configure, Start, and Validate Apache Flume
- 20. Configure, Start, and Validate Apache Mahout
- 21. Configure and Start Hue
- 22. Finalize the Upgrade
- 23. Install New HDP 2.2 Services
List of Tables
- 1.1. Hive Metastore Database Backup and Restore
- 1.2. Oozie Metastore Database Backup and Restore
- 1.3. Hue Database Backup and Restore
- 2.1. Hive Metastore Database Backup and Restore
- 2.2. Oozie Metastore Database Backup and Restore
- 2.3. Hue Database Backup and Restore
- 3.1. Hive Metastore Database Backup and Restore
- 3.2. Oozie Metastore Database Backup and Restore
- 3.3. Hue Database Backup and Restore
- 3.4. HDP 1.3.2 Hadoop Core Site (core-site.xml)
- 3.5. HDP 1.3.2 Hadoop HDFS Site (hdfs-site.xml)
- 3.6. HDP 1.3.2 Configs now in Capacity Scheduler for HDP 2.x (mapred-site.xml)
- 3.7. HDP 1.3.2 Configs now in Capacity Scheduler for HDP 2.x (capacity-scheduler.xml)
- 3.8. HDP 1.3.2 Configs and HDP 2.x for hadoop-env.sh