The folks at Hadoop in Real World explain the difference between SparkSession, SparkContext, SQLContext, and HiveContext:
SQLContext is your gateway to Spark SQL. Here is how you create a SQLContext from an existing SparkContext:
// sc is an existing SparkContext.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
Once you have the SQLContext, you can start working with DataFrames, Datasets, and so on.
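Since Spark 2.0, SparkSession has been the recommended single entry point, wrapping SparkContext and SQLContext (and, with Hive support enabled, HiveContext) behind one object. A minimal sketch, with an illustrative app name and local master:

```scala
import org.apache.spark.sql.SparkSession

object EntryPointDemo {
  def main(args: Array[String]): Unit = {
    // SparkSession (Spark 2.0+) unifies the older entry points.
    val spark = SparkSession.builder()
      .appName("EntryPointDemo")  // illustrative name, not from the article
      .master("local[*]")         // local mode for demonstration
      // .enableHiveSupport()     // HiveContext-style features; needs Hive deps
      .getOrCreate()

    // The older entry points are still reachable from the session:
    val sc = spark.sparkContext   // the SparkContext
    val sqlContext = spark.sqlContext // the SQLContext

    // DataFrames and Datasets are created through the session:
    import spark.implicits._
    val df = Seq((1, "a"), (2, "b")).toDF("id", "letter")
    df.show()

    spark.stop()
  }
}
```

With this pattern you rarely construct a SQLContext or HiveContext directly; the session hands you both, plus the SparkContext, from one place.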
Knowing the right entry point is important.