MapReduce Experiment Summary

1. Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=lenovo, access=WRITE, inode="":suh:supergroup:rwxr-xr-x

Solution:

Modify the Hadoop configuration file hdfs-site.xml

and add the following property:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS.
    If "false", permission checking is turned off,
    but all other behavior is unchanged.
    Switching from one parameter value to the other does not change the mode,
    owner or group of files or directories.
  </description>
</property>


// After modifying the file, distribute it to the other nodes and restart the Hadoop processes.
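For reference, this error typically shows up when the client or job is submitted from a developer machine whose local OS user (here "lenovo") does not match the owner of the target HDFS directory. A minimal sketch of such a client-side write, with an assumed NameNode address and target path:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:9000"); // assumed NameNode URI
        try (FileSystem fs = FileSystem.get(conf)) {
            // With dfs.permissions set to false this succeeds; with permission
            // checking on, it fails unless the client user may write to the
            // parent inode (owned by suh:supergroup in the error above).
            fs.mkdirs(new Path("/user/suh/output")); // assumed target path
        }
    }
}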

2. When running the job, reading a file that was added to the distributed cache fails with a file-not-found error.

Solution: context.getLocalCacheFiles()[0].toUri().getPath(); // gets the local path of the file cached from HDFS (see the sketch below)
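A minimal sketch, with assumed paths and class names, of how a file is usually registered in the driver and then read back from the local cache in setup(); the file-not-found error typically comes from trying to open the HDFS path directly instead of the local cached copy:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;

public class CacheFileMapper extends Mapper<LongWritable, Text, Text, Text> {

    // Driver side: register the HDFS file before submitting the job.
    public static void addCache(Job job) throws Exception {
        job.addCacheFile(new URI("hdfs://namenode:9000/data/dict.txt")); // assumed path
    }

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        // getLocalCacheFiles() returns the *local* paths of the cached files,
        // not their original HDFS locations.
        Path cached = context.getLocalCacheFiles()[0];
        String localPath = cached.toUri().getPath();
        try (BufferedReader reader = new BufferedReader(new FileReader(localPath))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // parse the lookup/dictionary file here
            }
        }
    }
}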

3. In the experiments, pay attention to the file extension and the field delimiter. If the file contains full-width (Chinese) delimiter characters, splitting on the ASCII delimiter will fail, so check the input carefully (see the sketch below).
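A minimal sketch (the field layout is made up) of why a full-width comma breaks field splitting, and one way to guard against it:

public class DelimiterCheck {
    public static void main(String[] args) {
        String asciiLine = "1001,Tom,85";
        String fullWidthLine = "1001，Tom，85"; // full-width comma "，" (U+FF0C)

        System.out.println(asciiLine.split(",").length);     // 3: splits as expected
        System.out.println(fullWidthLine.split(",").length); // 1: the ASCII "," never matches
        // Normalizing the delimiter before splitting avoids the mismatch:
        System.out.println(fullWidthLine.replace('，', ',').split(",").length); // 3
    }
}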

Original article: https://www.cnblogs.com/zhukaile/p/15582080.html