New Spark entry point: SparkSession
There is a new Spark API entry point: SparkSession.
Type of change
Syntactic/Spark core
Spark 1.6
HiveContext and SQLContext are the supported entry points; applications typically import SparkContext and HiveContext and build the contexts from them.
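For reference, a minimal sketch of the Spark 1.6 style entry points that this change replaces (the application name and configuration are illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

// Build the core SparkContext first, then the SQL/Hive contexts on top of it.
val conf = new SparkConf().setAppName("Spark SQL basic example")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)    // plain Spark SQL support
val hiveContext = new HiveContext(sc)  // adds Hive support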
Spark 2.4
SparkSession is now the entry point.
Action Required
Replace the old SQLContext and HiveContext with SparkSession. For example:
import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .config("spark.some.config.option", "some-value")
  .getOrCreate()
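Once the SparkSession exists, code that previously went through sqlContext or hiveContext calls the equivalent methods on spark, and applications that needed a HiveContext enable Hive support on the builder instead. A brief sketch under those assumptions (the query and app name are illustrative):

// Hive-backed applications replace HiveContext with enableHiveSupport() on the builder.
val sparkWithHive = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .enableHiveSupport()
  .getOrCreate()

// sqlContext.sql(...) becomes spark.sql(...).
val df = spark.sql("SELECT 1 AS id")
df.show()

// The underlying SparkContext is still reachable when lower-level APIs are needed.
val sc = spark.sparkContext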