Installing HDF Services on an Existing HDP Cluster (HDF 3.0.3)
Contents
1. Upgrading Ambari
   Preparing to Upgrade
   Upgrade Ambari
   Mandatory Post-Upgrade Tasks
   Upgrading Ambari Infra
   Upgrading Ambari Log Search
   Upgrading Ambari Metrics
   Adding Grafana to Ambari Metrics
   Upgrading Configurations
   Upgrading Kerberos krb5.conf
   Upgrading Log Rotation Configuration
   Upgrading SmartSense
2. Upgrading to HDP 2.6.3
   Before You Begin
   Upgrade Options
3. Installing Databases
   Installing MySQL
   Configuring SAM and Schema Registry Metadata Stores in MySQL
   Configuring Druid and Superset Metadata Stores in MySQL
4. Installing the HDF Management Pack
5. Update the HDF Base URL
6. Add HDF Services to an HDP Cluster
7. Configure HDF Components
   Configure Schema Registry
   Configure SAM
   Configure NiFi
   Configure Kafka
   Configure Storm
   Deploy the Cluster Services
   Access the UI for Deployed Services
8. Configuring Schema Registry and SAM for High Availability
9. Install the Storm Ambari View
10. Using a Local Repository
   Obtaining the Repositories
      Ambari Repositories
      HDP Stack Repositories
         HDP 2.6 Repositories
   Setting Up a Local Repository
      Getting Started Setting Up a Local Repository
      Setting Up a Local Repository with No Internet Access
      Setting Up a Local Repository with Temporary Internet Access
   Preparing the Ambari Repository Configuration File
11. Navigating the HDF Library
Obtaining the Repositories
This section describes how to obtain the following repositories:
Ambari Repositories
HDP Stack Repositories
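In practice, each repository you obtain is registered on a host as a small package-manager configuration file. As an illustration only for a yum-based system such as CentOS 7, an `ambari.repo` file placed in `/etc/yum.repos.d/` might look like the following; the host name and version placeholders are assumptions, and must be replaced with the actual repository URL and version listed in this section:

```ini
# /etc/yum.repos.d/ambari.repo -- illustrative sketch only.
# Substitute <your-repo-host> and <version> with the values
# obtained from the Ambari Repositories listing.
[ambari-2.x]
name=ambari Version - ambari-2.x
baseurl=http://<your-repo-host>/ambari/centos7/2.x/updates/<version>
gpgcheck=1
gpgkey=http://<your-repo-host>/ambari/centos7/2.x/updates/<version>/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1
```

The HDP Stack repositories follow the same pattern, with their own base URLs and GPG keys.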
© 2012–2020, Cloudera, Inc. Document licensed under the Creative Commons Attribution ShareAlike 4.0 License.