Fixing "Could not locate executable null\bin\winutils.exe in the Hadoop binaries" when starting a Scala program

Cause: on Windows, the program cannot find a local Hadoop installation. Because `hadoop.home.dir` is unset, Hadoop resolves the path to winutils.exe as `null\bin\winutils.exe`.

Solution:

1) Download a spark-2.4.6-bin-hadoop2.7 archive and unpack it locally

2) Download winutils.exe and place it in spark-2.4.6-bin-hadoop2.7\bin

3) Add the following configuration to the Scala program:

import org.apache.log4j.{Level, Logger}
import org.apache.hadoop.hbase.HBaseConfiguration

def enterHbase(): Unit = {
    // Point Hadoop at the locally unpacked Spark/Hadoop directory
    // (backslashes must be escaped in a Scala string literal)
    System.setProperty("hadoop.home.dir", "E:\\testscala\\spark-2.4.6-bin-hadoop2.7")
    Logger.getLogger("org").setLevel(Level.WARN)
    // ------------------ HBase connection ------------------
    val conf = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", zkIP)
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.set("zookeeper.znode.parent", "/hbase")
}
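The snippet above only builds the configuration. A typical next step is to open an actual HBase connection from it; the following is a minimal sketch, assuming `zkIP` holds the ZooKeeper quorum address as in the snippet, and using a placeholder table name `"my_table"`:

```scala
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.ConnectionFactory

val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.quorum", zkIP)  // zkIP as defined in your program
conf.set("hbase.zookeeper.property.clientPort", "2181")
conf.set("zookeeper.znode.parent", "/hbase")

// A Connection is heavyweight and thread-safe: create it once, close it when done
val connection = ConnectionFactory.createConnection(conf)
try {
  val table = connection.getTable(TableName.valueOf("my_table"))  // hypothetical table name
  // ... read/write via table ...
  table.close()
} finally {
  connection.close()
}
```

Without the winutils.exe fix above, `ConnectionFactory.createConnection` is typically where the "Could not locate executable" error surfaces on Windows.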
Original post: https://www.cnblogs.com/30go/p/13652480.html