Hortonworks DataFlow 3.5.1
Installing HDF Services on a New HDP Cluster
Installing Ambari
Installing Databases
    Supported Databases with NiFi Registry
    Installing MySQL
    Configuring SAM and Schema Registry Metadata Stores in MySQL
    Configuring Druid and Superset Metadata Stores in MySQL
    Configuring NiFi Registry Metadata Stores in MySQL
    Install Postgres
    Configure Postgres to Allow Remote Connections
    Configure SAM and Schema Registry Metadata Stores in Postgres
    Configure Druid and Superset Metadata Stores in Postgres
    Configuring NiFi Registry Metadata Stores in Postgres
    Specifying an Oracle Database to Use with SAM and Schema Registry
    Switching to an Oracle Database After Installation
Deploying an HDP Cluster Using Ambari
    Installing an HDP Cluster
    Customize Druid Services
    Configure Superset
    Deploy the Cluster Services using Ambari
    Access the Stream Insight Superset UI
Installing the HDF Management Pack
Update the HDF Base URL
Add HDF Services to an HDP Cluster
Configure HDF Components
    Configure Schema Registry
    Configure SAM
    Configuring SAM log search and event sampling
    Configure NiFi
    Configure NiFi for Atlas Integration
    Configure Kafka
    Configure Storm
    Configure Log Search
Deploy the Cluster Services
Access the UI for Deployed Services
Configuring Schema Registry and SAM for High Availability
    Configuring SAM for High Availability
    Configuring Schema Registry for High Availability
Installing Ambari
The first step in installing your HDF cluster is installing Ambari.
For installation instructions, see the Apache Ambari Installation documentation.
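As a rough orientation only, the typical flow on a RHEL/CentOS 7 host is sketched below. The repository URL is a placeholder, and the exact packages and setup prompts depend on your operating system and Ambari version, so follow the Apache Ambari installation guide for the authoritative steps.

    # Sketch only: the repository URL is a placeholder; obtain the correct
    # ambari.repo for your OS and Ambari version from the Ambari installation guide.
    wget -nv <ambari-repo-url>/ambari.repo -O /etc/yum.repos.d/ambari.repo

    # Install the Ambari server package.
    yum install -y ambari-server

    # Run the interactive setup (JDK, database, run-as user), then start the server.
    ambari-server setup
    ambari-server start

    # Ambari Web is then available at http://<ambari-server-host>:8080

Once Ambari Web is reachable, log in (the default login is admin/admin unless it has been changed) and continue with the cluster deployment steps described later in this guide.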
© 2012–2020, Cloudera, Inc. Document licensed under the Creative Commons Attribution ShareAlike 4.0 License.