Scheduled execution of a Spark job

Write the start.sh script

#!/bin/bash

spark-submit --master yarn \
--name IotIoState5minJob \
--executor-memory 1G \
--executor-cores 1 \
--num-executors 3 \
--driver-memory 1G \
--conf "spark.ui.port=45053" \
--conf "spark.driver.extraJavaOptions=-XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+PrintGCDetails -verbose:gc -XX:+PrintGCTimeStamps" \
--class com.cserver.job.IotIoState5minJob /app/lib/spark-demo-1.0.jar
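
Before handing the script to cron, make it executable and run it once by hand as the hdfs user to confirm the submit succeeds. A minimal check, assuming the /app/start.sh path used above:

chmod +x /app/start.sh
su - hdfs -c "/app/start.sh"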

Write the cron job

vi /etc/crontab


SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root

# For details see man 4 crontabs

# Example of job definition:
# .---------------- minute (0 - 59)
# |  .------------- hour (0 - 23)
# |  |  .---------- day of month (1 - 31)
# |  |  |  .------- month (1 - 12) OR jan,feb,mar,apr ...
# |  |  |  |  .---- day of week (0 - 6) (Sunday=0 or 7) OR sun,mon,tue,wed,thu,fri,sat
# |  |  |  |  |
# *  *  *  *  * user-name  command to be executed
*/5 * * * * root su - hdfs -c "/app/start.sh" >> /app/spark.log 2>&1
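
After saving /etc/crontab the entry takes effect automatically; no cron restart is needed. To confirm the job fires every five minutes and to watch its output, a quick check (assuming the log path above; on CentOS/RHEL the cron daemon logs to /var/log/cron, on Debian/Ubuntu look in /var/log/syslog instead):

tail -f /app/spark.log
grep CRON /var/log/cron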
Original post: https://www.cnblogs.com/lianglianggege/p/9373807.html