Command Line Installation

Spark Prerequisites

Before installing Spark, make sure your cluster meets the following prerequisites.

Table 19.1. Prerequisites for running Spark 1.6

Cluster stack version
  • HDP 2.4.0 or later

Ambari version (optional)
  • Ambari 2.2.1 or later

Software dependencies
  • Spark requires HDFS and YARN.

  • PySpark requires Python to be installed on all cluster nodes.
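As a quick sanity check for the Python dependency, a minimal sketch such as the following can be run on each node before installing PySpark. The minimum version tuple here (2.6) is an assumption for illustration; confirm the exact requirement against the Spark 1.6 release notes for your environment.

```python
import sys

# Assumed minimum Python version for PySpark on Spark 1.6 (illustrative).
MIN_VERSION = (2, 6)

def python_ok(version_info=sys.version_info):
    """Return True if the given (major, minor, ...) version meets MIN_VERSION."""
    return tuple(version_info[:2]) >= MIN_VERSION

if __name__ == "__main__":
    print("Python %d.%d meets minimum: %s"
          % (sys.version_info[0], sys.version_info[1], python_ok()))
```

Running this on every node (for example, via a cluster shell tool) surfaces nodes with a missing or outdated Python interpreter before the Spark installation begins.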


Note that installing HDP 2.5.3 installs Spark 1.6.2 as part of the stack.