Non-Ambari Cluster Installation Guide

Spark Prerequisites

Before installing Spark, make sure your cluster meets the following prerequisites:

Table 19.1. Spark Cluster Prerequisites

  Prerequisite               Description
  -------------------------  ---------------------------------------------
  Cluster Stack Version      HDP 2.2.6 or later
  Ambari Version (optional)  Ambari 2.1 or later
  Software dependencies      Spark requires HDFS and YARN.
                             PySpark requires Python on all nodes.
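A quick way to confirm the PySpark dependency is to check that a Python interpreter is on the PATH of each node. The sketch below checks the local node only; the interpreter names tried are assumptions (sites vary between `python`, `python2`, and `python3`), and extending it across a cluster with ssh is left to your environment.

```shell
#!/bin/sh
# Minimal sketch: verify that a command (such as the Python interpreter
# PySpark needs) is present on this node's PATH. To cover a whole
# cluster, run the same check over ssh on every node.
check_cmd() {
  command -v "$1" >/dev/null 2>&1
}

# Interpreter names below are assumptions; adjust to your standard.
for cmd in python python2 python3; do
  if check_cmd "$cmd"; then
    echo "$cmd: found"
  else
    echo "$cmd: missing"
  fi
done
```

Run this on each node before installing Spark; any node reporting all interpreters missing needs Python installed before PySpark jobs will run there.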


If you installed the Spark tech preview, save any configuration changes you made to the tech preview environment. Then install Spark and reapply your saved changes to the new installation's configuration.
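Saving the tech-preview configuration can be as simple as copying the configuration directory aside before reinstalling. The sketch below assumes `/etc/spark/conf` as the configuration location (the usual HDP path); both paths are placeholders to adjust for your environment.

```shell
#!/bin/sh
# Minimal sketch: copy a Spark configuration directory to a backup
# location so changes can be reapplied after reinstalling Spark.
backup_conf() {
  src="$1"
  dest="$2"
  # Nothing to do if the source directory does not exist.
  [ -d "$src" ] || { echo "nothing to back up at $src"; return 1; }
  mkdir -p "$dest" && cp -pr "$src"/. "$dest" && echo "saved $src to $dest"
}

# Example invocation on a cluster node (paths are assumptions):
# backup_conf /etc/spark/conf /var/tmp/spark-conf-preview-backup
```

After installing Spark, diff the backup against the freshly installed defaults and merge your changes back in rather than overwriting wholesale, since the new release may add or rename properties.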