Spark installation

Download from: https://archive.apache.org/dist/spark/spark-2.3.2/spark-2.3.2-bin-hadoop2.7.tgz

cd /root/lhc

wget  https://archive.apache.org/dist/spark/spark-2.3.2/spark-2.3.2-bin-hadoop2.7.tgz

sudo tar -zxf /root/lhc/spark-2.3.2-bin-hadoop2.7.tgz -C /usr/local/

cd /usr/local

sudo mv ./spark-2.3.2-bin-hadoop2.7/ ./spark

sudo chown -R root:root ./spark # replace root:root with your own user and group if you do not run Spark as root

cd /usr/local/spark

cp ./conf/spark-env.sh.template ./conf/spark-env.sh

vim ./conf/spark-env.sh
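The file opened above is empty apart from commented templates; for a Spark build that runs against an existing Hadoop installation, the usual minimal addition is the Hadoop classpath export. A sketch, assuming Hadoop is installed at /usr/local/hadoop (adjust the path to your setup):

```shell
# Let Spark find Hadoop's jars and native libraries.
# /usr/local/hadoop is an assumed install path, not from the original post.
export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
```

After saving, the install can be checked with a bundled example, e.g. `/usr/local/spark/bin/run-example SparkPi`, which should print an approximation of Pi among the log output.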

Original article: https://www.cnblogs.com/andylhc/p/9791789.html