Hortonworks Data Platform (HDP) is an open-source distribution powered by Apache Hadoop. HDP provides the actual Apache-released versions of the components, with all the bug fixes necessary to make them interoperable in production environments. It is packaged with an easy-to-use installer (HDP Installer) that deploys the complete Apache Hadoop stack across your cluster and provides monitoring capabilities through Ganglia and Nagios. The HDP distribution consists of the following components:
Core Hadoop platform (Hadoop HDFS and Hadoop MapReduce)
Non-relational database (Apache HBase)
Metadata services (Apache HCatalog)
Scripting platform (Apache Pig)
Data access and query (Apache Hive)
Workflow scheduler (Apache Oozie)
Cluster coordination (Apache ZooKeeper)
Management and monitoring (Apache Ambari)
Data integration services (HCatalog APIs, WebHDFS, Talend Open Studio for Big Data, and Apache Sqoop)
Distributed log management services (Apache Flume)
Machine learning library (Apache Mahout)
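As a quick illustration of the data integration services listed above, WebHDFS exposes HDFS over a plain HTTP REST API, so any tool that can issue HTTP requests can read or list files. The sketch below builds a WebHDFS request URL in Python; the NameNode host and file path are hypothetical placeholders, and an actual request would of course require a running HDFS NameNode.

```python
def webhdfs_url(namenode, port, path, op, **params):
    """Build a WebHDFS REST URL for the given HDFS path and operation.

    WebHDFS URLs take the form:
        http://<namenode>:<port>/webhdfs/v1<path>?op=<OP>&<extra params>
    """
    query = "&".join([f"op={op}"] + [f"{k}={v}" for k, v in params.items()])
    return f"http://{namenode}:{port}/webhdfs/v1{path}?{query}"

# Hypothetical NameNode host; 50070 was the default NameNode HTTP port
# in HDP releases of this era.
url = webhdfs_url("namenode.example.com", 50070, "/user/hdp/data.txt", "OPEN")
```

An HTTP GET on an `op=OPEN` URL like this is redirected by the NameNode to a DataNode that streams the file content, which is what makes WebHDFS convenient for integrating external tools with the cluster.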
To learn more about the distribution details and the component versions, see the Release Notes. All components are official Apache releases of the most recent stable versions available. Hortonworks’ philosophy is to apply patches only when absolutely necessary to ensure interoperability of the components. Consequently, there are very few patches in HDP, and they are all fully documented. Each HDP component has been tested rigorously prior to the actual Apache release. To learn more about the testing strategy adopted at Hortonworks, Inc., see: Delivering high-quality Apache Hadoop releases.