Installing Spark Manually
If you want to install Spark or Spark 2 on a cluster that is not managed by Ambari, see Installing and Configuring Apache Spark or Installing and Configuring Apache Spark 2 in the Command Line Installation Guide.
If you previously installed Spark on a cluster not managed by Ambari and you want to move to Spark 2, complete the following steps:
1. Install Spark 2, following the Spark 2 instructions in the Command Line Installation Guide.
2. Ensure that the two versions of Spark do not use the same ports; for example, run each version's Spark History Server on its own port (see the port check that follows these steps).
3. Test your Spark jobs on Spark 2. To direct a job to Spark 2 when Spark 1 is the default version, see Specifying Which Version of Spark to Run; a brief example follows these steps.
4. When you have finished testing, optionally remove Spark 1 from the cluster: stop all Spark 1 services and then uninstall Spark 1. Manually check that all Spark 1 library and configuration directories have been removed (see the cleanup check that follows these steps).
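A quick way to verify step 2 is to compare the History Server port configured for each installation. This is a minimal sketch: the configuration paths shown (/etc/spark/conf and /etc/spark2/conf) and the port numbers mentioned are typical defaults, not guaranteed values for every cluster.

    # Check which port each version's History Server is configured to use;
    # the two values should differ (for example, 18080 for Spark 1 and 18081 for Spark 2).
    grep -H spark.history.ui.port \
        /etc/spark/conf/spark-defaults.conf \
        /etc/spark2/conf/spark-defaults.conf

If the property is not set in either file, both History Servers fall back to the same default port, and one of them should be assigned a different value.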
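As an illustration of step 3, HDP clusters typically select the Spark version used by spark-submit and the Spark shells with the SPARK_MAJOR_VERSION environment variable; treat the commands below as a sketch, with example paths that are assumptions, and follow Specifying Which Version of Spark to Run for the authoritative procedure.

    # Direct a single job to Spark 2 while Spark 1 remains the cluster default.
    export SPARK_MAJOR_VERSION=2
    spark-submit --version    # should now report a Spark 2.x version
    spark-submit --class org.apache.spark.examples.SparkPi \
        /usr/hdp/current/spark2-client/examples/jars/spark-examples*.jar 100

Unset the variable, or open a new shell session, to return to the Spark 1 default.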
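If you remove Spark 1 in step 4, a directory check along the following lines can confirm that nothing was left behind. The paths are common Spark 1 locations and are assumptions; adjust them to match your installation.

    # Hypothetical cleanup check after uninstalling Spark 1.
    # No output means the listed directories no longer exist.
    ls -d /usr/hdp/*/spark /etc/spark /var/log/spark /var/lib/spark 2>/dev/null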