Command Line Upgrade

Configuring and Upgrading Apache Spark

Before upgrading Apache Spark, you must upgrade your HDP components to the latest version (in this case, 2.6.0). This section assumes that the component upgrade is complete. If it is not, return to Getting Ready to Upgrade and Upgrade 2.5 Components for instructions on upgrading your HDP components to 2.6.0.

The upgrade process installs Spark version 1.6. If you want to use Spark version 2, install version 2 after finishing the HDP 2.6 upgrade process. For more information, see Installing and Configuring Apache Spark 2 in the Command Line Installation Guide.

To upgrade Spark, stop the service, update its configurations, and then restart the service.

  1. Stop the Spark history-server. If you are using the Spark thrift-server, stop the thrift-server as well.

    su - spark -c "$SPARK_HOME/sbin/stop-history-server.sh"
    su - spark -c "$SPARK_HOME/sbin/stop-thriftserver.sh"
  2. Remove any reference to hdp.version from the Spark configuration files.

    Remove any such property from spark-defaults.conf.

    Make sure that spark.history.provider, if present, is set to org.apache.spark.deploy.history.FsHistoryProvider (the default).
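
    One way to locate any lingering references is to search the configuration directory directly. This sketch assumes the conventional /etc/spark/conf location; adjust the path if your installation differs.

    ```shell
    # Search all Spark configuration files for references to hdp.version
    # (assumes configs live in /etc/spark/conf; adjust for your installation)
    grep -rn "hdp.version" /etc/spark/conf/

    # Confirm that spark.history.provider, if set, names the default class
    grep -n "spark.history.provider" /etc/spark/conf/spark-defaults.conf
    ```

    If the first command returns no matches, all hdp.version references have been removed.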

  3. Restart the history-server:

    su - spark -c "$SPARK_HOME/sbin/start-history-server.sh"
  4. If you are using the Spark thrift-server, restart the thrift-server. See (Optional) Starting the Spark Thrift Server.
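
    For reference, a minimal restart sketch follows; the section linked above covers the full set of options (such as YARN master settings) typically used in HDP deployments.

    ```shell
    # Restart the Spark thrift-server as the spark user
    # (minimal form; see the linked section for HDP-specific options)
    su - spark -c "$SPARK_HOME/sbin/start-thriftserver.sh"
    ```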

  5. Validate the Spark installation. As user spark, run the examples described in Running Spark Applications in the Spark Guide.
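
    As a quick smoke test, the bundled SparkPi example can be submitted on YARN. This is a sketch: the examples jar path shown is illustrative and varies by HDP build, so adjust it for your installation.

    ```shell
    # Run the SparkPi example as the spark user to confirm the upgraded
    # installation works (jar location is illustrative; it varies by build)
    su - spark -c "spark-submit --class org.apache.spark.examples.SparkPi \
        --master yarn --deploy-mode client --num-executors 1 \
        \$SPARK_HOME/lib/spark-examples-*.jar 10"
    ```

    A successful run prints an approximation of Pi and exits without errors.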

For additional configuration information, see the Spark Guide.