Spark: default to cluster deploy mode and skip uploading dependency jars on each submit. These properties go in conf/spark-defaults.conf (a properties file, not a shell script):
spark.yarn.jars hdfs:///${PATH}/sparkJar/jars/*.jar
spark.submit.deployMode cluster
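A one-time setup sketch for populating the HDFS jar directory that `spark.yarn.jars` points at. The `/apps/sparkJar/jars` target path, the `com.example.App` class, and `app.jar` are hypothetical placeholders; align the path with the actual `spark.yarn.jars` value above.

```shell
# One-time: publish Spark's bundled jars to HDFS so YARN executors
# fetch them from there instead of re-uploading on every submit.
# (/apps/sparkJar/jars is a hypothetical path; match spark.yarn.jars.)
hadoop fs -mkdir -p /apps/sparkJar/jars
hadoop fs -put "$SPARK_HOME"/jars/*.jar /apps/sparkJar/jars/

# With spark-defaults.conf set as above, a plain submit now runs in
# cluster mode with no --deploy-mode flag and no jar upload step
# (class and jar names are illustrative):
spark-submit --master yarn --class com.example.App app.jar
```

This trades a small amount of HDFS storage for noticeably faster submission, since the multi-hundred-MB jar upload is skipped.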
Hive: cap the local-mode JVM memory for mappers/reducers. Overrides belong in hive-site.xml (hive-default.xml is the read-only template):

<property>
  <name>hive.mapred.local.mem</name>
  <!-- suffixed values such as "10230M" / "K" / "B" did not work -->
  <!-- default unit: bytes -->
  <value>10230000000</value>
  <description>mapper/reducer memory in local mode</description>
</property>
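For ad-hoc testing, the same limit can also be set per session rather than cluster-wide; a sketch, assuming automatic local mode is wanted as well (the byte value mirrors the hive-site.xml entry above):

```sql
-- Enable automatic local mode for small jobs and cap its JVM memory.
SET hive.exec.mode.local.auto=true;
SET hive.mapred.local.mem=10230000000;  -- same byte value as in hive-site.xml
```

Session-level `SET` overrides are convenient for verifying a value before committing it to hive-site.xml.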