Fix: HDFS org.apache.hadoop.security.AccessControlException (Permission denied)

Problem: when inserting a row into Hive from DataGrip, the job fails with the following error:

Job Submission failed with exception 'org.apache.hadoop.security.AccessControlException(Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":root:supergroup:drwx------

Root cause: jobs submitted through Hive are staged under /tmp in HDFS, and on this cluster /tmp is owned by root with mode 700 (drwx------). Since the DataGrip JDBC connection supplies no username, the job runs as user=anonymous, who therefore has no EXECUTE permission on /tmp.
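One direct fix that keeps permission checking enabled is to open up /tmp itself. A sketch, assuming you can run commands as the HDFS superuser (often the hdfs account):

    # Inspect the current mode first; the error above shows /tmp as
    # drwx------ root supergroup
    hdfs dfs -ls /

    # Give /tmp the wide-open, sticky mode it normally has on Hadoop clusters
    hdfs dfs -chmod -R 1777 /tmp

The sticky bit (the leading 1) lets every user write to /tmp while preventing users from deleting each other's files.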


Solutions:

Option 1: if you can edit the cluster configuration files, disable HDFS permission checking.

Add the following property to hdfs-site.xml (note the value must be false to turn checking off), then restart the NameNode for the change to take effect:

    <property>
      <name>dfs.permissions.enabled</name>
      <value>false</value>
      <description>
        If "true", enable permission checking in HDFS.
        If "false", permission checking is turned off,
        but all other behavior is unchanged.
        Switching from one parameter value to the other does not change the mode,
        owner or group of files or directories.
      </description>
    </property>

Option 2: add

    export HADOOP_USER_NAME=hdfs

to ~/.bash_profile (no spaces around the =, or the shell rejects it), then apply it with:

    source ~/.bash_profile
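To check that the variable is picked up in the current shell (a minimal sketch; hdfs is just the username chosen above):

    export HADOOP_USER_NAME=hdfs
    # The Hadoop client reads this environment variable and submits jobs as that user
    echo "$HADOOP_USER_NAME"
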

Option 3: add

    System.setProperty("HADOOP_USER_NAME", "hdfs");

at the top of your main method. This sets the HADOOP_USER_NAME system property to hdfs, so the Hadoop client submits the job as user hdfs instead of anonymous.
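The order matters: the property must be set before the first Hadoop client class (e.g. UserGroupInformation) is initialized, or it is ignored. A minimal, self-contained sketch; the class name is illustrative and the actual Configuration/FileSystem code is omitted:

    public class SubmitAsHdfs {
        public static void main(String[] args) {
            // Set the user *before* touching any Hadoop client classes;
            // UserGroupInformation reads HADOOP_USER_NAME during initialization.
            System.setProperty("HADOOP_USER_NAME", "hdfs");

            // ... from here on, create Configuration / FileSystem and submit as usual ...
            System.out.println("submitting as: " + System.getProperty("HADOOP_USER_NAME"));
        }
    }
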
---------------------

Original post: https://www.cnblogs.com/GuangMingDingFighter/p/10958989.html