Configuring a Spark Cluster

1. Configure spark-env.sh

[/soft/spark/conf/spark-env.sh]

...
export JAVA_HOME=/soft/jdk
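
The original elides the rest of the file. Besides JAVA_HOME, a standalone cluster commonly sets the entries below in spark-env.sh; the variable names are standard Spark settings, but the values here are illustrative assumptions, not from the original:

[/soft/spark/conf/spark-env.sh]

# Illustrative values, adjust to your hosts (assumption, not from the original)
export SPARK_MASTER_HOST=192.168.231.30   # address the master binds to and advertises
export SPARK_WORKER_CORES=2               # cores each worker offers to executors
export SPARK_WORKER_MEMORY=2g             # memory each worker offers to executors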

2. Configure the slaves file

[/soft/spark/conf/slaves]

192.168.231.31
192.168.231.40
192.168.231.41
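
start-all.sh starts the workers by SSHing into every host listed in slaves, so the node running it needs passwordless SSH to each of them. A minimal sketch, assuming root logins and the IPs above:

$>ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
$>ssh-copy-id root@192.168.231.31
$>ssh-copy-id root@192.168.231.40
$>ssh-copy-id root@192.168.231.41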

3. Distribute spark-env.sh to the other containers

$>scp /soft/spark/conf/spark-env.sh root@192.168.231.31:/soft/spark/conf
$>scp /soft/spark/conf/spark-env.sh root@192.168.231.40:/soft/spark/conf
$>scp /soft/spark/conf/spark-env.sh root@192.168.231.41:/soft/spark/conf
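
The three copies can also be done in one loop; a sketch assuming the same three hosts:

$>for h in 192.168.231.31 192.168.231.40 192.168.231.41; do scp /soft/spark/conf/spark-env.sh root@$h:/soft/spark/conf; done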

4. Change the hostname of the spark30 container

The hostname of spark30 must be changed because, when the master starts, it advertises its hostname to the worker nodes; if the workers cannot resolve that hostname, they cannot connect back to the master. Setting the hostname to the IP address sidesteps name resolution entirely.

$>hostname 192.168.231.30
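
The hostname command above only takes effect for the running session; if the container restarts, the name reverts. A sketch of persisting it, assuming a container where /etc/hostname is writable (on a systemd host, hostnamectl set-hostname would be the usual route):

$>echo 192.168.231.30 > /etc/hostname

Alternatively, setting SPARK_MASTER_HOST=192.168.231.30 in spark-env.sh (see the sketch in step 1) makes the master advertise its IP directly, so no hostname change is needed; this is an assumed alternative, not the original's approach.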

5. Start the Spark cluster

$>/soft/spark/sbin/start-all.sh
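
Once start-all.sh returns, a quick way to verify the cluster, assuming the default standalone ports, is to check the JVM processes with jps (a Master process on this node, a Worker on each slave) and the master web UI on port 8080:

$>jps
$>curl http://192.168.231.30:8080

The web UI lists every registered worker; if one is missing, its log under /soft/spark/logs usually names the connection error.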