Installing HDF Services on an Existing HDP Cluster
Copyright © 2012-2017 Hortonworks, Inc.
Except where otherwise noted, this document is licensed under the Creative Commons Attribution ShareAlike 4.0 License.
2017-12-22
Abstract
Hortonworks DataFlow (HDF) is powered by Apache NiFi. A version of this documentation originally appeared on the Apache NiFi website.
HDF is the first integrated platform that solves the real-time challenges of collecting and transporting data from a multitude of sources, and it provides interactive command and control of live flows with full, automated data provenance. HDF is a single combined platform providing the data acquisition, simple event processing, transport, and delivery mechanisms designed to accommodate the diverse dataflows generated by a world of connected people, systems, and things.
Unlike other providers of platforms built using Apache Hadoop, Hortonworks contributes 100% of our code back to the Apache Software Foundation. Hortonworks DataFlow is Apache-licensed and completely open source. We sell only expert technical support, training, and partner-enablement services. All of our technology is, and will remain, free and open source.
Please visit the Hortonworks page for more information on Hortonworks technology. For more information on Hortonworks services, please visit either the Support or Training page. Feel free to Contact Us directly to discuss your specific needs.
Contents
- 1. Upgrading Ambari
- 2. Upgrading to HDP 2.6.3
- 3. Installing Databases
- 4. Installing the HDF Management Pack
- 5. Updating the HDF Base URL
- 6. Adding HDF Services to an HDP Cluster
- 7. Configuring HDF Components
- 8. Configuring Schema Registry and SAM for High Availability
- 9. Installing the Storm Ambari View
- 10. Using a Local Repository
- 11. Navigating the HDF Library