Non-Ambari Cluster Installation Guide

Spark Prerequisites

Before installing Spark, make sure your cluster meets the following prerequisites.

Table 19.1. Prerequisites for running Spark 1.4.1

Cluster Stack Version
  • HDP 2.3.2 or later

(Optional) Ambari Version
  • 2.1 or later

Software dependencies
  • Spark requires HDFS and YARN

  • PySpark requires Python to be installed on all nodes


HDP 2.3.2 supports both Spark 1.3.1 and Spark 1.4.1; installing HDP 2.3.2 installs Spark 1.4.1 by default. If you prefer to use Spark 1.3.1, follow the Spark Manual Downgrade procedure in the HDP 2.3.2 release notes.

If you installed the tech preview, save any configuration changes you made to the tech preview environment before proceeding. Install Spark, and then reapply those changes to the new installation.