Starting and Stopping Spark

Startup order

Start Hadoop first

start-dfs.sh

start-yarn.sh

Then start Spark

start-all.sh
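
A minimal sketch of the full startup sequence, assuming Hadoop is installed under $HADOOP_HOME and Spark runs in standalone mode under $SPARK_HOME; calling the Spark script by its full path avoids confusing it with Hadoop's similarly named start-all.sh:

# Start HDFS and YARN first
$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh

# Then start the Spark standalone master and workers
$SPARK_HOME/sbin/start-all.sh

# Optionally confirm the daemons (NameNode, DataNode, ResourceManager, Master, Worker) are running
jps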

Shutdown order

Stop Spark first

stop-all.sh

Then stop Hadoop

stop-yarn.sh

stop-dfs.sh
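
A matching sketch for shutdown, in the reverse order, again assuming the $SPARK_HOME and $HADOOP_HOME layout above:

# Stop Spark first
$SPARK_HOME/sbin/stop-all.sh

# Then stop YARN and HDFS
$HADOOP_HOME/sbin/stop-yarn.sh
$HADOOP_HOME/sbin/stop-dfs.sh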

Original article: https://www.cnblogs.com/kingshine007/p/8082659.html