Non-Ambari Cluster Installation Guide

Installing Spark

When you install Spark, two directories will be created:

  • /usr/hdp/current/spark-client for submitting Spark jobs

  • /usr/hdp/current/spark-history for launching Spark master processes, such as the Spark history server

  1. Search for Spark in the HDP repo:

    • For RHEL or CentOS:

      yum search spark

    • For SLES:

      zypper search spark

    • For Ubuntu and Debian:

      apt-cache search spark

    This shows all available versions of Spark. For example:

    spark_2_3_2_0_2950-master.noarch : Server for Spark master
    spark_2_3_2_0_2950-python.noarch : Python client for Spark
    spark_2_3_2_0_2950-worker.noarch : Server for Spark worker
  2. Install the version corresponding to the HDP version you currently have installed.

    • For RHEL or CentOS:

      yum install spark_<version>-master spark_<version>-python

    • For SLES:

      zypper install spark_<version>-master spark_<version>-python

    • For Ubuntu and Debian:

      apt-get install spark_<version>-master spark_<version>-python
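The `<version>` suffix in the package names is the HDP version with dots and hyphens replaced by underscores (for example, HDP 2.3.2.0-2950 gives `spark_2_3_2_0_2950`). As a minimal sketch, the suffix and install command can be derived from the version string like this (RHEL/CentOS shown; the version string here is the one from the example output above, and on a live cluster would typically come from a tool such as `hdp-select`):

```shell
# Example HDP version; substitute the version reported by your cluster.
hdp_version="2.3.2.0-2950"

# Package names use underscores in place of dots and hyphens,
# e.g. 2.3.2.0-2950 -> 2_3_2_0_2950
pkg_suffix=$(echo "$hdp_version" | tr '.-' '__')

# Print the resulting install command for review before running it.
echo "yum install spark_${pkg_suffix}-master spark_${pkg_suffix}-python"
```

The sketch only prints the command so you can confirm the package names match what `yum search spark` reported before installing.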