Non-Ambari Cluster Installation Guide

Spark Prerequisites

Before installing Spark, make sure your cluster meets the following prerequisites.

Table 19.1. Prerequisites for running Spark 1.6

  Prerequisite                 Requirement
  ---------------------------  ---------------------------------------------
  Cluster stack version        HDP 2.4.0 or later
  Ambari version (optional)    2.2.1 or later
  Software dependencies        Spark requires HDFS and YARN;
                               PySpark requires Python on all cluster nodes

HDP 2.4.0 supports several versions of Apache Spark; installing HDP 2.4.0 installs Spark 1.6 by default. If you prefer to use an earlier version of Spark, follow the Spark Manual Downgrade procedure in the Release Notes.