Installing Spark 1.4.0 on Ubuntu

Built on top of Hadoop 2.6.0.

1. Download spark-1.4.0-bin-hadoop2.6.tgz from the official site

2. Extract it into the directory of your choice: tar zxvf spark-1.4.0-bin-hadoop2.6.tgz
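
The `-C` flag of `tar` extracts straight into a target directory, which saves a move afterwards. The snippet below demonstrates the flags on a throwaway archive; the `demo` and `target` directory names are made up for the demonstration, while the real archive is the spark-1.4.0-bin-hadoop2.6.tgz from step 1.

```shell
# Build a dummy archive just to demonstrate the tar flags; in the real
# install you would use the downloaded spark-1.4.0-bin-hadoop2.6.tgz.
mkdir -p demo/spark-1.4.0-bin-hadoop2.6
echo "placeholder" > demo/spark-1.4.0-bin-hadoop2.6/README.md
tar -C demo -czf demo.tgz spark-1.4.0-bin-hadoop2.6

# Extract into a chosen destination with -C (hypothetical ./target here).
mkdir -p target
tar zxf demo.tgz -C target
ls target
```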

3. Configure /etc/profile

  sudo gedit /etc/profile

 Append the following path settings to the end of the file, save and exit, then run source /etc/profile to make the changes take effect:

export SPARK_HOME=/home/jiahong/spark-1.4.0-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
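
To confirm the configuration took effect, a quick sanity check like the following can be run after `source /etc/profile`. The install path mirrors the example above; substitute your own location.

```shell
# Sanity check: SPARK_HOME is set and its bin directory is on PATH.
# The path below is the example path from this post; adjust to your setup.
export SPARK_HOME=/home/jiahong/spark-1.4.0-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH

[ -n "$SPARK_HOME" ] && echo "SPARK_HOME is set"

case ":$PATH:" in
  *":$SPARK_HOME/bin:"*) echo "PATH ok" ;;
  *) echo "PATH is missing $SPARK_HOME/bin" ;;
esac
```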

4. Start Spark

jiahong@jiahong-OptiPlex-7010:~$ cd spark-1.4.0-bin-hadoop2.6/
jiahong@jiahong-OptiPlex-7010:~/spark-1.4.0-bin-hadoop2.6$ sbin/start-all.sh 
starting org.apache.spark.deploy.master.Master, logging to /home/jiahong/spark-1.4.0-bin-hadoop2.6/sbin/../logs/spark-jiahong-org.apache.spark.deploy.master.Master-1-jiahong-OptiPlex-7010.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/jiahong/spark-1.4.0-bin-hadoop2.6/logs/spark-jiahong-org.apache.spark.deploy.worker.Worker-1-jiahong-OptiPlex-7010.out
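
Whether the daemons actually came up can be checked from the process table. This is a generic sketch: it greps for the standalone Master and Worker class names that appear in the log lines above, and prints a status either way.

```shell
# Look for the standalone Master JVM in the process table.
status="not found"
if ps -ef | grep -v grep | grep -q org.apache.spark.deploy.master.Master; then
  status="running"
fi
echo "Master: $status"

# Same check for the Worker.
wstatus="not found"
if ps -ef | grep -v grep | grep -q org.apache.spark.deploy.worker.Worker; then
  wstatus="running"
fi
echo "Worker: $wstatus"
```

If both report "not found", re-check the logs referenced in the start-all.sh output above.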

5. Visit the Spark web UI

 Open localhost:8080 in a browser
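
The UI can also be probed from the command line. A minimal sketch with curl, assuming the default master web UI port 8080; the timeout just keeps the probe from hanging when nothing is listening.

```shell
# Probe the master web UI; reports a status either way instead of failing.
if curl -s --max-time 2 -o /dev/null http://localhost:8080; then
  ui="reachable"
else
  ui="not reachable"
fi
echo "Spark UI at localhost:8080 is $ui"
```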

Original post: https://www.cnblogs.com/aijianiula/p/4592985.html