HiveContext vs. SQLContext

There are two ways to create context in Spark SQL:

SQLContext:

scala> import org.apache.spark.sql._
scala> val sqlContext = new SQLContext(sc)

HiveContext:

scala> import org.apache.spark.sql.hive._
scala> val hc = new HiveContext(sc)

Though most of the code examples you will see use SQLContext, you should prefer HiveContext. HiveContext is a superset of SQLContext: it can do everything SQLContext can do and more, such as parsing queries with the more complete HiveQL parser, using Hive UDFs, and reading data from Hive tables. You do not have to connect to an existing Hive deployment to use HiveContext.
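As a concrete illustration, here is a minimal sketch, assuming Spark 1.x (1.4 or later) in spark-shell, where sc is the predefined SparkContext: window functions such as row_number() OVER (...) go through the HiveQL parser, so the query below works with HiveContext but is rejected by a plain SQLContext.

scala> import org.apache.spark.sql.hive.HiveContext
scala> val hc = new HiveContext(sc)
scala> // build a small DataFrame from an in-memory collection
scala> val df = hc.createDataFrame(Seq(("a", 1), ("a", 2), ("b", 3))).toDF("key", "value")
scala> df.registerTempTable("kv")
scala> // a window function: accepted by HiveContext, but fails with a plain SQLContext in Spark 1.x
scala> hc.sql("SELECT key, value, row_number() OVER (PARTITION BY key ORDER BY value) AS rn FROM kv").show()

Note that no Hive installation is needed for this to run; HiveContext creates a local metastore on first use.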

Original article: https://www.cnblogs.com/longjshz/p/5287887.html