Optional: Spark Manual Upgrade Procedure
(Optional) Upgrade Spark from 1.6.1+ to 1.6.2:
As the root user, stop the Spark 1.6.1+ History Server:
su - spark -c "/usr/hdp/current/spark-client/sbin/stop-history-server.sh"
Remove Spark 1.6.1+:
yum erase "spark*"
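Before installing the new build, it can be worth confirming that the erase actually removed every Spark package. A quick check, assuming an RPM-based system (as the `yum` commands in this procedure imply):

```shell
# List any remaining Spark packages; prints a confirmation if none are left.
# "grep -i" matches package names case-insensitively; "|| echo" fires only
# when grep finds no match.
rpm -qa | grep -i spark || echo "no spark packages remain"
```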
On the node where you want the Spark 1.6.2 History Server to run, install Spark 1.6.2:
su - root
wget -nv http://s3.amazonaws.com/dev.hortonworks.com/HDP/centos6/2.x/BUILDS/2.4.3.0-227/hdpbn.repo -O /etc/yum.repos.d/Spark141TP.repo
yum install spark_2_4_3_0_227-master -y
To use Python, also install the Spark Python package:
yum install spark_2_4_3_0_227-python
conf-select create-conf-dir --package spark --stack-version 2.4.3.0-227 --conf-version 0
Copy your existing Spark configuration into the new configuration directory (replace <previous-stack-version> with the stack version you are upgrading from):
cp /etc/spark/<previous-stack-version>/0/* /etc/spark/2.4.3.0-227/0/
conf-select set-conf-dir --package spark --stack-version 2.4.3.0-227 --conf-version 0
hdp-select set spark-client 2.4.3.0-227
hdp-select set spark-historyserver 2.4.3.0-227
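The `conf-select` and `hdp-select` steps above work by repointing symlinks (`/etc/spark/conf` and `/usr/hdp/current/spark-client`, respectively) at the versioned directories. The effect can be sketched with plain `ln` (a simplified illustration of what the tools manage, using a throwaway /tmp path, not a replacement for running them):

```shell
# Simulate how hdp-select repoints a "current" symlink at a versioned dir.
mkdir -p /tmp/hdp-demo/2.4.3.0-227/spark
# -sfn: create a symbolic link, replacing any existing link atomically
ln -sfn /tmp/hdp-demo/2.4.3.0-227/spark /tmp/hdp-demo/spark-client
readlink /tmp/hdp-demo/spark-client   # -> /tmp/hdp-demo/2.4.3.0-227/spark
```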
Validate the Spark installation by running SparkPi as the spark user, as in the following example:
su - spark
cd /usr/hdp/current/spark-client
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client --num-executors 3 --driver-memory 512m --executor-memory 512m --executor-cores 1 lib/spark-examples*.jar 10
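A successful SparkPi run reports its result on a line beginning "Pi is roughly". One way to confirm the job completed is to filter the submit output for that line (the exact digits vary from run to run):

```shell
# Filter spark-submit output for the SparkPi result line; a match
# confirms the example job ran to completion.
./bin/spark-submit --class org.apache.spark.examples.SparkPi \
    --master yarn-client --num-executors 3 --driver-memory 512m \
    --executor-memory 512m --executor-cores 1 \
    lib/spark-examples*.jar 10 2>/dev/null | grep "Pi is roughly"
```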
Restart Spark on YARN in either yarn-cluster mode or yarn-client mode:
yarn-cluster mode:
/usr/hdp/current/spark-client/bin/spark-submit --class path.to.your.Class --master yarn-cluster [options] <app jar> [app options]
yarn-client mode:
/usr/hdp/current/spark-client/bin/spark-shell --master yarn-client