Loading configuration files (Hive, HDFS) at Spark runtime

This article is a repost; if there are any copyright concerns, please get in touch. Thank you!

Reposted from: https://blog.csdn.net/piduzi/article/details/81636253

Applicable scenario: which data source to use is only determined at runtime.


import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.spark.sql.SparkSession

import scala.collection.JavaConverters._

object ReadHive {
  def main(args: Array[String]): Unit = {
    val sparkBuilder = SparkSession
      .builder
      .master("local")
      .appName("Spk Pi")

    // The file paths here could instead be looked up from a database
    val conf = new Configuration()
    val core = new Path("C:\\Users\\shadow\\Desktop\\core-site.xml")
    val hdfs = new Path("C:\\Users\\shadow\\Desktop\\hdfs-site.xml")
    val hive = new Path("C:\\Users\\shadow\\Desktop\\hive-site.xml")
    conf.addResource(core)
    conf.addResource(hdfs)
    conf.addResource(hive)

    // Copy every resolved key/value pair into the SparkSession builder
    for (c <- conf.iterator().asScala) {
      sparkBuilder.config(c.getKey, c.getValue)
    }

    val spark = sparkBuilder.enableHiveSupport().getOrCreate()
    spark.sql("select * from default.wt_test1").show()
  }
}
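
The Windows paths above are just placeholders; as the comment notes, the file locations could come from a database or any other runtime source. Below is a minimal sketch of the same idea, assuming the configuration file paths are passed in as program arguments (the object name ReadHiveRuntime and the argument handling are illustrative additions, not part of the original post):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.spark.sql.SparkSession

import scala.collection.JavaConverters._

object ReadHiveRuntime {
  def main(args: Array[String]): Unit = {
    // e.g. args = Array("/etc/conf/core-site.xml", "/etc/conf/hdfs-site.xml", "/etc/conf/hive-site.xml")
    val conf = new Configuration()
    args.foreach(p => conf.addResource(new Path(p)))

    val sparkBuilder = SparkSession.builder.master("local").appName("Spk Pi")

    // Same trick as above: push every resolved key/value pair into the builder
    for (c <- conf.iterator().asScala) {
      sparkBuilder.config(c.getKey, c.getValue)
    }

    val spark = sparkBuilder.enableHiveSupport().getOrCreate()
    spark.sql("select * from default.wt_test1").show()
    spark.stop()
  }
}

With this variant the same jar can be pointed at different clusters simply by passing different core-site.xml / hdfs-site.xml / hive-site.xml paths on the command line.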

Original post: https://www.cnblogs.com/zuizui1204/p/9772541.html