Apache Spark error: not found: value sqlContext

Since you are using Spark 2.1, you'll have to use the SparkSession object. You can get a reference to the SparkContext from the SparkSession object:

val sSession = org.apache.spark.sql.SparkSession.builder.getOrCreate()
val sContext = sSession.sparkContext
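For a standalone application (outside the shell), the same entry point is built explicitly. This is a minimal sketch; the app name, master setting, and query are placeholders for illustration:

```scala
import org.apache.spark.sql.SparkSession

object SqlContextExample {
  def main(args: Array[String]): Unit = {
    // Build (or reuse) the single SparkSession for this JVM.
    val spark = SparkSession.builder
      .appName("sqlContext-example") // hypothetical app name
      .master("local[*]")            // local mode, for illustration only
      .getOrCreate()

    val sc = spark.sparkContext      // the underlying SparkContext, if you need it
    val sqlContext = spark.sqlContext // legacy SQLContext handle for old code

    // In 2.x you can run SQL directly on the session.
    spark.sql("SELECT 1 AS answer").show()

    spark.stop()
  }
}
```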

When the Spark shell starts, it tells you both are already available:

Spark context available as 'sc' (master = local[*], app id = local-1490337421381).

Spark session available as 'spark'.

In Spark 2.0.x, the entry point of Spark is SparkSession, which is available in the Spark shell as spark, so try this:

spark.sqlContext.sql(...)
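In the 2.x shell the two calls below are equivalent, since the session's sql method delegates to the same engine; the query here is just a placeholder:

```scala
// `spark` is predefined in the Spark 2.x shell.
val viaSession = spark.sql("SELECT 1 AS id")            // preferred in 2.x
val viaContext = spark.sqlContext.sql("SELECT 1 AS id") // via the legacy SQLContext
viaSession.show()
```

Calling spark.sql directly is the idiomatic 2.x form; spark.sqlContext mainly exists so code written against the 1.x API keeps working.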

You can also create a SQLContext yourself, like this:

val sqlContext = new org.apache.spark.sql.SQLContext(sc)

The first option is my choice, since the Spark shell has already created one for you, so make use of it.