Hive pitfall notes

Today, while running an INSERT statement, Hive reported an error:

ERROR : Job Submission failed with exception 'org.apache.hadoop.security.AccessControlException(Permission denied: user=hive, access=EXECUTE, inode="/tmp":root:supergroup:drwx------

Root cause: jobs submitted by different users are staged under /tmp on HDFS. Here /tmp is owned by root with mode 700 (drwx------), so user hive has no EXECUTE permission on it.

Solution 0 (quick and dirty):

hadoop fs -chmod -R 777 /tmp

If that fails with

chmod: changing permissions of '/tmp': Permission denied. user=hive is not the owner of inode=/tmp

then run it as the HDFS superuser instead (as the error says, hive is not the owner of /tmp; on most clusters the superuser is hdfs): sudo -u hdfs hadoop fs -chmod -R 777 /tmp
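The mode bits themselves behave exactly like POSIX permissions. A local sketch (plain Linux filesystem, not HDFS) of what 700 vs. 777 means for a directory:

```shell
# Create a scratch directory and toggle its mode to illustrate 700 vs. 777.
d=$(mktemp -d)
chmod 700 "$d"
stat -c '%a' "$d"    # prints: 700 (only the owner may list/enter/write)
chmod 777 "$d"
stat -c '%a' "$d"    # prints: 777 (everyone may list/enter/write)
rmdir "$d"
```

On HDFS, the analogous check is hadoop fs -ls -d /tmp, which shows the directory's current owner and mode.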

Option 1 (if you can change the cluster configuration files):

Add the following to hdfs-site.xml to turn off HDFS permission checking:

 <property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS.
    If "false", permission checking is turned off,
    but all other behavior is unchanged.
    Switching from one parameter value to the other does not change the mode,
    owner or group of files or directories.
  </description>
</property>
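Changing dfs.permissions.enabled typically requires a NameNode restart to take effect. A quick local sanity check of the edited file (a sketch; on a real cluster the file normally lives under $HADOOP_CONF_DIR/hdfs-site.xml):

```shell
# Write a minimal hdfs-site.xml to a temp file to show the expected shape,
# then grep to confirm permission checking is disabled (the value must be false).
conf=$(mktemp)
cat > "$conf" <<'EOF'
<configuration>
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
</configuration>
EOF
grep -A1 'dfs.permissions.enabled' "$conf" | grep '<value>false</value>'
rm -f "$conf"
```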

Option 2:

Add export HADOOP_USER_NAME=hdfs (no spaces around the equals sign) to ~/.bash_profile, then run:

source ~/.bash_profile
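What this does, sketched in plain shell: export puts the variable into the environment of every child process, and the Hadoop client reads HADOOP_USER_NAME to decide which user to act as:

```shell
# Note: 'export HADOOP_USER_NAME = hdfs' (with spaces) is a shell error;
# the assignment must be written without spaces around '='.
export HADOOP_USER_NAME=hdfs
echo "$HADOOP_USER_NAME"                        # prints: hdfs
sh -c 'echo "child sees: $HADOOP_USER_NAME"'    # prints: child sees: hdfs
```

Note that on a Kerberos-secured cluster this variable is ignored; it only takes effect in simple-auth mode.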

Option 3:

Add System.setProperty("HADOOP_USER_NAME", "hdfs"); at the top of your main method, before any Hadoop calls.

This sets the JVM system property HADOOP_USER_NAME to hdfs, so the Hadoop client will act as the hdfs user.

author@nohert
Original post: https://www.cnblogs.com/gzgBlog/p/14990660.html