Command Line Installation

Spark Prerequisites

Before installing Spark, make sure your cluster meets the following prerequisites.

Table 20.1. Prerequisites for running Spark 1.6

Cluster Stack Version
  • HDP 2.4.0 or later

(Optional) Ambari Version
  • Ambari 2.2.1 or later

Software dependencies
  • Spark requires HDFS and YARN.

  • PySpark and associated libraries require Python version 2.7 or later, or Python version 3.4 or later, installed on all nodes.

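The Python requirement above can be verified on each node before installation. A minimal sketch (the helper name `python_ok` is hypothetical, not part of Spark or HDP):

```python
import sys

# PySpark requires Python 2.7 or later, or 3.4 or later, on every node.
# Versions 3.0 through 3.3 do not satisfy the requirement.
def python_ok(version_info=sys.version_info):
    """Return True if this interpreter meets the PySpark prerequisite."""
    major, minor = version_info[0], version_info[1]
    return (major == 2 and minor >= 7) or (major == 3 and minor >= 4)

if __name__ == "__main__":
    label = "OK" if python_ok() else "too old for PySpark"
    print("Python %d.%d.%d: %s" % (sys.version_info[0],
                                   sys.version_info[1],
                                   sys.version_info[2],
                                   label))
```

Running this script on every node (for example, over SSH) confirms the interpreter version before you install the PySpark libraries.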

When you install HDP 2.6.0, Spark 1.6.3 is installed.