Troubleshooting notes on installing Spark

The version used is spark-1.6.3-bin-without-hadoop.

Running spark-shell reports an error

Running spark-sql fails with a class-not-found error for org.datanucleus.api.jdo.JDOPersistenceManagerFactory

Fix: copy $HIVE_HOME/lib/datanucleus*.jar into the $SPARK_HOME/lib directory.
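A minimal sketch of that copy step, assuming $HIVE_HOME and $SPARK_HOME are already exported to point at the Hive and Spark installation directories:

```shell
# Copy Hive's DataNucleus jars into Spark's lib directory so that
# spark-sql can load JDOPersistenceManagerFactory at startup.
cp "$HIVE_HOME"/lib/datanucleus*.jar "$SPARK_HOME"/lib/

# Verify the jars are now in place:
ls "$SPARK_HOME"/lib/datanucleus*.jar
```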

Running spark-sql fails with a class-not-found error for org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver

Fix: download the spark-1.6.3-bin-hadoop2.6 package from the official site, and use its $SPARK_HOME/lib/spark-assembly-1.6.3-hadoop2.6.0.jar to overwrite lib/spark-assembly-1.6.3-hadoop2.2.0.
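The replacement could look like the following sketch; the path to the extracted spark-1.6.3-bin-hadoop2.6 download is illustrative, and backing up the original assembly jar first is my addition, not part of the original post:

```shell
# Swap in the assembly jar from the spark-1.6.3-bin-hadoop2.6
# package, which bundles the Hive thrift-server classes that the
# without-hadoop build is missing.
cd "$SPARK_HOME"/lib

# Back up the existing assembly before overwriting it (precaution,
# not in the original instructions).
mv spark-assembly-1.6.3-hadoop2.2.0.jar spark-assembly-1.6.3-hadoop2.2.0.jar.bak

# Copy the assembly from the extracted hadoop2.6 download;
# /path/to/... is a placeholder for wherever you unpacked it.
cp /path/to/spark-1.6.3-bin-hadoop2.6/lib/spark-assembly-1.6.3-hadoop2.6.0.jar .
```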

Original post (Chinese): https://www.cnblogs.com/lanhj/p/7294496.html