Apache Spark Component Guide

Installing Spark Manually

If you want to install Spark or Spark 2 on a cluster that is not managed by Ambari, see Installing and Configuring Apache Spark or Installing and Configuring Apache Spark 2, in the Command Line Installation Guide.

If you previously installed Spark on a cluster not managed by Ambari, and you want to move to Spark 2:

  1. Install Spark 2 according to the Spark 2 instructions in the Command Line Installation Guide.

  2. Ensure that each version of Spark uses a different port so the two installations do not conflict; see the port configuration sketch after this list.

  3. Test your Spark jobs on Spark 2. To direct a job to Spark 2 when Spark 1 is the default version, see Specifying Which Version of Spark to Run and the example after this list.

  4. When finished testing, optionally remove Spark 1 from the cluster: stop all services and then uninstall Spark. Manually check to make sure all library and configuration directories have been removed; a cleanup sketch follows this list.
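
For step 2, the following is a minimal sketch of keeping the two History Server UIs on separate ports. It assumes the Spark 1 History Server keeps its default port (18080), the Spark 2 History Server is moved to 18081, and the Spark 2 configuration lives under /etc/spark2/conf; adjust the port values and paths to match your environment.

    # /etc/spark2/conf/spark-defaults.conf  (assumed location of the Spark 2 configuration)
    # Move the Spark 2 History Server UI off the Spark 1 default port (18080)
    spark.history.ui.port    18081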
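For step 3, one way to test a job on Spark 2 while Spark 1 remains the default is to set SPARK_MAJOR_VERSION before submitting, as described in Specifying Which Version of Spark to Run. The example jar path below is an assumption based on a typical HDP layout; substitute your own application and path.

    # Select Spark 2 for this shell session only; Spark 1 stays the cluster default
    export SPARK_MAJOR_VERSION=2

    # Run the bundled SparkPi example as a smoke test (jar path is environment-specific)
    spark-submit --class org.apache.spark.examples.SparkPi \
        --master yarn --deploy-mode client \
        /usr/hdp/current/spark2-client/examples/jars/spark-examples*.jar 10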
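For step 4, the commands below are a hedged sketch of stopping the Spark 1 History Server, removing the Spark 1 packages, and verifying that the Spark 1 directories are gone. The paths follow common HDP command-line installation conventions and may differ in your cluster; the package-removal command and package name pattern in particular are assumptions that depend on your operating system and repository.

    # Stop the Spark 1 History Server before uninstalling (path assumes HDP symlinks)
    /usr/hdp/current/spark-client/sbin/stop-history-server.sh

    # Remove the Spark 1 packages (RHEL/CentOS example; package pattern is an assumption)
    yum remove "spark_*"

    # Verify that no Spark 1 library, configuration, or log directories remain
    ls -d /usr/hdp/current/spark-client /etc/spark /var/log/spark /var/lib/spark 2>/dev/null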