Installing Spark on Ubuntu

1. First, prepare the environment. You need Java, plus Python (usually present by default on Ubuntu).

   For Java, download a JDK and then configure the Java environment variables:

    

export JAVA_HOME=/opt/jdk1.8.0_45
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:$PATH
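One common way to make these variables persistent is to append them to ~/.bashrc and reload it. This is a sketch: it reuses the /opt/jdk1.8.0_45 path from above, which you should replace with your own JDK directory.

```shell
# Append the Java variables to ~/.bashrc so every new shell picks them up.
# /opt/jdk1.8.0_45 is the path used in this post; substitute your own JDK directory.
cat >> ~/.bashrc <<'EOF'
export JAVA_HOME=/opt/jdk1.8.0_45
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:$PATH
EOF

# Reload the configuration and confirm the JDK is visible on the PATH.
source ~/.bashrc
java -version
```

If `java -version` prints the expected JDK version, the Java side of the setup is done.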

2. Download Spark and configure its variable:

export SPARK_HOME=/opt/spark-hadoop/
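To use Spark's launch scripts without typing full paths, you can also put SPARK_HOME's bin directory on the PATH and then smoke-test the install. This is a sketch assuming Spark was unpacked to /opt/spark-hadoop/ as above.

```shell
# Persist SPARK_HOME and expose Spark's launch scripts
# (spark-shell, pyspark, spark-submit) on the PATH.
cat >> ~/.bashrc <<'EOF'
export SPARK_HOME=/opt/spark-hadoop/
export PATH=${SPARK_HOME}/bin:$PATH
EOF
source ~/.bashrc

# Quick smoke tests: print the Spark version, then start an
# interactive PySpark session (exit it with Ctrl+D).
spark-shell --version
pyspark
```

If `pyspark` drops you into an interactive session showing the Spark banner, the installation is working.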

Restart your shell, test, and you're done.

Reference: http://www.open-open.com/lib/view/open1432192407317.html

Original post: https://www.cnblogs.com/wswang/p/4970550.html