java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Problem description:

Generally, I write and develop my Spark and MapReduce programs locally in IDEA. Only after the code runs locally do I deploy it to the cluster.

When I ran a simple Spark job, the console showed the following error:

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Solution:

Since the executable could not be located, I suspected that my machine (Windows 10) was simply missing the winutils file. The "null" at the start of the path also hints that HADOOP_HOME (or the hadoop.home.dir property) is not set, so Hadoop builds the winutils path from a null home directory.
I pasted the error into Google and found plenty of proposed fixes.
I picked one of the better ones, and it solved my problem completely.
The steps are as follows:

  1. Download winutils.exe from http://public-repo-1.hortonworks.com/hdp-win-alpha/winutils.exe.
    Also, be sure to download the winutils.exe that matches the Hadoop version your Spark distribution is compiled against (so not necessarily the one at the link above).

  2. Create a folder C:\hadoop\bin.

  3. Put the downloaded winutils.exe into C:\hadoop\bin.

  4. Set the environment variable HADOOP_HOME to C:\hadoop. Warning: point HADOOP_HOME at the hadoop folder, not at the bin folder.
    OR, call System.setProperty("hadoop.home.dir", "C:\\hadoop") in your code (see the sketch after this list).
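
The in-code variant of step 4 only takes effect if the property is set before anything touches the Hadoop classes, i.e. before the SparkSession/SparkContext is created. Below is a minimal sketch of that approach; the path C:\hadoop, the object name and the app name are assumptions for illustration, and it expects spark-sql on the classpath.

import org.apache.spark.sql.SparkSession

object WinutilsDemo {
  def main(args: Array[String]): Unit = {
    // Point Hadoop at the folder that CONTAINS bin\winutils.exe,
    // not at the bin folder itself. Must run before the session is built.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    val spark = SparkSession.builder()
      .appName("winutils-demo")   // hypothetical app name
      .master("local[*]")         // run locally inside IDEA
      .getOrCreate()

    // Trivial action just to confirm the local environment works.
    println(spark.range(10).count())

    spark.stop()
  }
}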

Reference: https://stackoverflow.com/questions/35652665/java-io-ioexception-could-not-locate-executable-null-bin-winutils-exe-in-the-ha

Original post: https://www.cnblogs.com/liuge36/p/12614759.html